
Bust up Your Paradigms or Get Left Behind


Recommended Posts

Excellent series. Thanks.  This is a good article as well:

 

"Software is eating the world, as Marc Andreessen said, but AI is eating software."

 

https://www.linkedin.com/pulse/ai-eating-software-jensen-huang

 

Nice article even though I only understand a tiny fraction of what they're talking about.

 

"The AI revolution has arrived despite the fact Moore’s law – the combined effect of Dennard scaling and CPU architecture advance – began slowing nearly a decade ago. Dennard scaling, whereby reducing transistor size and voltage allowed designers to increase transistor density and speed while maintaining power density, is now limited by device physics."

 

Did the author mean "not" instead of "now"?

 

---

 

I wonder if any of the co's in that graphic ("A sprawling ecosystem has grown up around the AI revolution") are public & worth looking at (acquisition targets?)

 

Is Apple doing anything competitive in the San Jose wafer fab & how does Metal fit in?

 

LinkedIn sure is producing different content now (or maybe I just never noticed.)

 

Jeez, I remember looking at NVIDIA back in 2014 when it was around $20 and not buying it, because all I saw was their sound processing business and I didn't think it was anything (didn't see the whole GPU thing...), and I'd already bought Apple in 2013.

 

The question is, have we gotten to 42 yet or is that what AI is going to solve for us?

---


Did the author mean "not" instead of "now"?

 

No, he meant now. Device physics is now a serious limitation on CPU scaling.
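For a back-of-the-envelope picture of what Dennard scaling bought us (a rough sketch, not from the article): dynamic power per transistor is roughly

P ≈ C · V² · f

and under classic Dennard scaling a shrink by a factor k cuts capacitance C and supply voltage V by k while letting frequency f rise by k, so power per transistor falls by about k² just as k² more transistors fit in the same area, and power density stays flat. Once voltage can't be lowered much further (leakage and threshold-voltage limits, i.e. device physics), packing in more transistors at higher clocks just drives power density up, which is the wall the article is describing.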

 

I wonder if any of the co's in that graphic "A sprawling ecosystem has grown up around the AI revolution", are public & worth looking at (acquisition targets..?)

 

I wonder the same thing myself.  I missed NVIDIA too.

 

Is Apple doing anything competitive in the San Jose wafer fab & how does Metal fit in?

 

I don't know; Apple keeps what it is working on secret, as I'm sure you know, but there are rumors: "Apple Developing 'Apple Neural Engine' Chip to Power AI in iOS Devices."

 

LinkedIn sure is producing different content now (or maybe I just never noticed.)

 

I get some pretty good articles emailed to me from LinkedIn. Maybe it's because I follow a bunch of tech-related companies and people on LinkedIn.

 


The question is, have we gotten to 42 yet or is that what AI is going to solve for us?

 

I read something pretty interesting about what Adams could have meant by "42": 42 is the ASCII code for the asterisk symbol (*), which is sometimes used as a wildcard symbol when searching, or in some computer code such as regular expressions. So what Deep Thought was actually saying is that the meaning of life, the universe, and everything is whatever you want it to be.
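For what it's worth, the code-point part of that is easy to check. A minimal Python sketch (purely illustrative; note that in regexes '*' is strictly a "zero or more" quantifier, so the catch-all pattern is '.*'):

```python
import fnmatch
import re

# ASCII (and Unicode) code point 42 really is the asterisk.
print(ord("*"))   # 42
print(chr(42))    # *

# As a shell-style wildcard, '*' matches anything.
print(fnmatch.fnmatch("meaning_of_life.txt", "*"))  # True

# In regular expressions, '*' means "zero or more of the preceding token",
# so '.*' is the pattern that matches anything at all.
print(bool(re.fullmatch(".*", "life, the universe, and everything")))  # True
```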

 

---


Cool, to everything here & especially the wildcard thing...

---

2 weeks later...


It appears that the revolution isn't AI at all.

 

The big thing here is increasingly powerful alternatives to CPUs. CPUs have hit a dead end, so the whole industry will evolve sideways. The GPU is only the beginning: we are going to see more and more very cheap alternatives to the CPU that outperform CPUs by orders of magnitude on specific tasks.

---


Alternatives to CPUs have been around for decades. I believe these chips are called ASICs. Xilinx is one of the companies that makes these; Altera is another.

 

One very good example of this is Bitcoin mining: there were (are?) specialized computers that use ASICs and do only one thing...mine for bitcoins. They are incredibly efficient at this and are WAY faster than GPUs...Bitmain's "Antminers" are a good example of this.

 

 

 

---


You are mixing up terms a little bit (I'm an ASIC design engineer). Xilinx and Altera make PLDs (Programmable Logic Devices), which can be custom programmed (even on the fly, to a limited degree) to perform an application-specific task or algorithm more quickly and power-efficiently than a CPU could by executing software. These are much faster than using a CPU to perform the same function, but not as fast or as efficient as a purely custom-designed ASIC would be. ASIC just means Application Specific Integrated Circuit, and any chip which isn't fully programmable and Turing-complete could be called an ASIC. The line is a little blurry sometimes: I've worked on plenty of chips which would be called ASICs yet had embedded CPUs inside them, but they are programmed internally so that the chip itself only performs an application-specific function and can't be reprogrammed from the outside by the customer (who may not even know there is a CPU in there).

Basically, when you are optimizing for programmability, software run on a CPU makes sense; when you are optimizing for power efficiency and speed, a custom-designed circuit makes more sense. And of course the custom logic can interact with and control the CPU, and vice versa.

 

A GPU has specialized circuitry to perform math on matrices quickly while remaining somewhat programmable, which makes GPUs useful for 3D graphics and for simulating neural networks. I'm not sure AI will ever use ASICs rather than GPUs completely, but a GPU with self-programming PLD functionality would be interesting.
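To make the "math on matrices" point concrete, a single dense neural-network layer is basically one matrix multiply plus a nonlinearity. A minimal NumPy sketch (runs on the CPU; the point is that a GPU parallelizes exactly this multiply-accumulate work, which is why porting the same operation to a GPU library such as CuPy, PyTorch, or JAX gives such large speedups on big matrices):

```python
import numpy as np

rng = np.random.default_rng(0)

batch = rng.standard_normal((64, 1024))     # 64 input vectors, 1024 features each
weights = rng.standard_normal((1024, 512))  # layer weights: 1024 inputs -> 512 outputs
bias = rng.standard_normal(512)

# One dense layer: roughly 64 * 1024 * 512 independent multiply-accumulates,
# exactly the kind of work a GPU's thousands of cores chew through in parallel.
activations = np.maximum(batch @ weights + bias, 0.0)  # ReLU(x @ W + b)

print(activations.shape)  # (64, 512)
```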

 

The reason bitcoin miners moved from CPUs to GPUs to PLDs and now to fully custom ASICs is that bitcoin mining is just doing the exact same type of calculation over and over again with no need for programmability, so it is a perfect fit for a fully custom circuit. Turing-complete CPUs can do anything the other chips can do, just more slowly and while consuming more power. A fully custom ASIC does whatever it was designed to do more quickly and power-efficiently. A PLD is like a large array of transistors which can be re-wired internally; it isn't as easy to reprogram as it is to write new software for a CPU, but once configured it will perform a specific function more quickly and efficiently than a CPU (though not as quickly or power-efficiently as a fully custom ASIC).
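To illustrate the "exact same calculation over and over" point: the heart of bitcoin mining is a double SHA-256 of a block header with an incrementing nonce, repeated until the hash meets a difficulty target. A toy Python sketch (real miners hash an 80-byte binary header against a vastly harder target; this only shows the shape of the loop a mining ASIC hard-wires):

```python
import hashlib

def mine(header: bytes, difficulty_prefix: str = "0000") -> int:
    """Find a nonce whose double SHA-256 hash starts with the given hex prefix."""
    nonce = 0
    while True:
        data = header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce
        nonce += 1  # same computation again, just a different nonce

print("found nonce:", mine(b"toy block header"))
```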

 

Now if you think about it, a CPU is just a type of ASIC: it is transistors on a chip wired in a specific way. So a CPU itself can be implemented as a fully custom design or programmed onto PLDs. My senior project in college was designing a CPU with a custom instruction set and programming the design onto a bunch of Altera PLDs (it took 6 of Altera's largest PLDs at the time to fit my CPU design). It was nowhere near as fast as implementing the design as a custom ASIC, but it allowed a college student like me to implement my design in less than a year, build it, and start running programs on my processor in the lab. This is in fact how PLDs are often used (as prototyping devices); they are also used when the expected number of parts needed is small. If you only need a few hundred parts and they don't need to run very fast, you are probably better off using a PLD, but if you are going to sell a few million chips, or they need to be as fast as possible, you are probably better off designing an ASIC. A CPU or a GPU is really just a special kind of ASIC which can run software.

 

My apologies if this was too rambling and confusing.

 

