Linus Torvalds worries about how Linux will handle end of Moore’s Law

Discussion in 'all things UNIX' started by lotuseclat79, Sep 18, 2013.

Thread Status:
Not open for further replies.
  1. lotuseclat79

    lotuseclat79 Registered Member

    Joined:
    Jun 16, 2005
    Posts:
    5,390
  2. mattdocs12345

    mattdocs12345 Registered Member

    Joined:
    Mar 23, 2013
    Posts:
    1,892
    Location:
    US
    I don't get it. Excuse my ignorance. But what exactly is there to worry about?
     
  3. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    9,252
    They're worried that hardware capability will stagnate in 10-20 years, which will cramp their style doing new things.
     
  4. mattdocs12345

    mattdocs12345 Registered Member

    Joined:
    Mar 23, 2013
    Posts:
    1,892
    Location:
    US
    So that's a good thing, isn't it? They won't push any more bloated software onto the market to force me to upgrade to new hardware.
     
  5. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    9,252
    Yes, in that sense, it would be a good thing :)
     
  6. lotuseclat79

    lotuseclat79 Registered Member

    Joined:
    Jun 16, 2005
    Posts:
    5,390
    Note: pushing bloated software is mostly a function of OEM manufacturers of M$-related products, and has little or nothing to do with hardware advances related to Moore's law.

    Read this link for a more complete understanding of Moore's law.

    There is also ongoing research into graphene as a replacement for silicon as the basis for transistors, which would probably extend Moore's law into a new era. (Graphene is not mentioned on the Moore's law page at Wikipedia.)

    At scales of ~10 nm, quantum tunneling, especially through gaps, becomes a significant phenomenon.
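
    For a rough sense of why that matters, here is a small back-of-envelope sketch (the 1 eV barrier height is just an illustrative assumption, not a real device parameter). It uses the standard WKB estimate T ≈ exp(−2κd) for an electron tunneling across a gap of width d:

    Code:
        import math

        # WKB estimate of tunneling probability through a rectangular barrier.
        # The assumed barrier height of 1 eV is illustrative only.
        m_e  = 9.109e-31    # electron mass, kg
        hbar = 1.055e-34    # reduced Planck constant, J*s
        eV   = 1.602e-19    # joules per electron volt
        V    = 1.0 * eV     # assumed barrier height

        kappa = math.sqrt(2 * m_e * V) / hbar   # decay constant, ~5.1 per nm

        for d_nm in (1, 3, 10):
            T = math.exp(-2 * kappa * d_nm * 1e-9)
            print(f"{d_nm} nm gap: T ~ {T:.1e}")

    The point is the exponential: leakage that is utterly negligible across a 10 nm gap becomes very noticeable once gaps shrink toward a nanometre or two.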

    The ultimate limit of 1 nm would be reached around 2028, according to the 5 nanometer article.
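
    As a quick sanity check on that date, a rough extrapolation (the 2012/22 nm starting point and the ~0.7x shrink every two years are my assumptions, not figures from the article):

    Code:
        # Extrapolate feature size assuming a ~0.7x linear shrink per 2-year node.
        size_nm, year = 22.0, 2012   # assumed starting node
        while size_nm > 1.0:
            size_nm *= 0.7
            year += 2
        print(year, round(size_nm, 2))   # lands around 2030, close to the ~2028 estimate

    Different starting points and cadences shift the answer a few years either way, but it ends up in the same ballpark.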

    Also, there is Moore's 2nd law (aka Rock's law).

    -- Tom
     
  7. aigle

    aigle Registered Member

    Joined:
    Dec 14, 2005
    Posts:
    11,164
    Location:
    UK / Pakistan
    It will be time for a totally new type of hardware, or new software written from scratch that is faster even on weaker hardware.

    I am sure hardware will never stagnate to this extent. Something new will always be on the horizon.
     
  8. Hungry Man

    Hungry Man Registered Member

    Joined:
    May 11, 2011
    Posts:
    9,146
    At some point hardware will stagnate. There is simply a limit to how fast something can get within the constraints of our universe. We will likely not reach those constraints for a long time. And, when we do, it will just mean we'll have to implement more efficient algorithms.
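
    As a toy illustration of what "more efficient algorithms" buys (the example is made up purely to show the scaling), compare a quadratic duplicate check with a linear one; the algorithmic change dwarfs any plausible hardware speed-up:

    Code:
        # Same question ("does the list contain a duplicate?"), two algorithms.

        def has_duplicate_quadratic(items):
            # O(n^2): compare every pair.
            for i in range(len(items)):
                for j in range(i + 1, len(items)):
                    if items[i] == items[j]:
                        return True
            return False

        def has_duplicate_linear(items):
            # O(n): remember what we've already seen in a set.
            seen = set()
            for x in items:
                if x in seen:
                    return True
                seen.add(x)
            return False

        # For a million items, the quadratic version does ~5e11 comparisons,
        # the linear one ~1e6 lookups - a bigger gap than decades of Moore's law.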
     
  9. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    9,252
    I'm reminded of The Hitchhiker's Guide to the Galaxy and the various computers built to answer "What is the meaning of Life, the Universe, and Everything?" ;) I vaguely remember an older, original SF source for that, but no names :( I think there's some of that in Vinge's stuff, but I'm guessing maybe Asimov.
     
  10. chronomatic

    chronomatic Registered Member

    Joined:
    Apr 9, 2009
    Posts:
    1,343
    I can almost assure them it won't. Go listen to some of Ray Kurzweil's talks. He mentions that while it's true that we are about to hit a theoretical limit on how small ICs can be, there is still a lot of room for improvement if you design the chip in 3D instead of 2D (as they are now).

    3D chips https://en.wikipedia.org/wiki/Three-dimensional_integrated_circuit#Notable_3D_chips

    Also diamond computing might be coming into play soon: http://phys.org/news/2012-03-diamond-brightens-electronic-devices.html#nRlv
     
  11. lotuseclat79

    lotuseclat79 Registered Member

    Joined:
    Jun 16, 2005
    Posts:
    5,390
    First nanotube computer could spark carbon revolution.

    Note: This article ignores graphene, which I mentioned in msg #6 above, and which remains my prime candidate to replace silicon before any other technology. The reason I think that is that fab plants today cost billions, and if existing fab plants can be converted at minimal cost to produce graphene wafers, then graphene will probably be less costly than other approaches such as the one the above-linked article describes.

    -- Tom
     
  12. lotuseclat79

    lotuseclat79 Registered Member

    Joined:
    Jun 16, 2005
    Posts:
    5,390