Monday, October 25, 2021

g-f(2)600 THE BIG PICTURE OF THE DIGITAL AGE (10/25/2021), MIT Technology Review, How AI is reinventing what computers are




ULTRA-condensed knowledge


"g-f" fishing of golden knowledge (GK) of the fabulous treasure of the digital ageArtificial Intelligence, AI is reinventing computers (10/25/2021)  g-f(2)426 

Lessons learned, MIT Technology Review 


  • Three key ways artificial intelligence is changing what it means to compute.
  • Almost without our noticing, AI has become part of our day-to-day lives. And it’s changing how we think about computing.
  • AI changes that on at least three fronts: how computers are made, how they’re programmed, and how they’re used. Ultimately, it will change what they are for.
  • “The core of computing is changing from number-crunching to decision-making,” says Pradeep Dubey, director of the parallel computing lab at Intel. Or, as MIT CSAIL director Daniela Rus puts it, AI is freeing computers from their boxes.


                    Genioux knowledge fact condensed as an image






                    Extra-condensed knowledge


                    Lessons learned, MIT Technology Review


                    • Now chipmakers like Intel and Arm and Nvidia, which supplied many of the first GPUs, are pivoting to make hardware tailored specifically for AI. Google and Facebook are also forcing their way into this industry for the first time, in a race to find an AI edge through hardware. 
                    • In the last couple of years, Google has made TPUs available to other companies, and these chips—as well as similar ones being developed by others—are becoming the default inside the world’s data centers. 
                    • AI is even helping to design its own computing infrastructure. In 2020, Google used a reinforcement-learning algorithm (a type of AI that learns how to solve a task through trial and error) to design the layout of a new TPU; a toy sketch of that trial-and-error loop follows this list.
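
                    The following is a minimal, purely illustrative sketch (in Python) of the trial-and-error loop that reinforcement learning relies on. It is not Google's chip-floorplanning system: the three "candidate layouts" and their pass rates are hypothetical, and the agent is a simple epsilon-greedy learner that tries actions, observes rewards, and gradually favors what works.

import random

random.seed(0)

# Hypothetical setup: three candidate layouts, each with an unknown
# probability of passing a quality check (the reward signal).
true_pass_rate = [0.2, 0.5, 0.8]

estimates = [0.0, 0.0, 0.0]   # the agent's running reward estimate per action
counts = [0, 0, 0]
epsilon = 0.1                 # fraction of steps spent exploring at random

for step in range(5000):
    if random.random() < epsilon:
        action = random.randrange(3)              # explore: try something new
    else:
        action = estimates.index(max(estimates))  # exploit: use the best guess so far
    reward = 1.0 if random.random() < true_pass_rate[action] else 0.0
    counts[action] += 1
    # Incremental average: nudge the estimate toward the observed reward.
    estimates[action] += (reward - estimates[action]) / counts[action]

print("estimated pass rates:", [round(e, 2) for e in estimates])
print("most-tried layout:", counts.index(max(counts)))

                    After enough trials the agent's estimates approach the true pass rates and it concentrates on the best layout, which is the "learning through trial and error" the article describes, scaled down to a toy.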


                    Condensed knowledge




                    Lessons learned, MIT Technology Review 


                    • “For the past 40 years we have been programming computers; for the next 40 we will be training them,” says Chris Bishop, head of Microsoft Research in the UK.
                    • With machine learning, programmers no longer write rules. Instead, they create a neural network that learns those rules for itself. It’s a fundamentally different way of thinking (see the sketch after this list).
                    • For decades, getting a computer to do something meant typing in a command, or at least clicking a button.
                    • Machines no longer need a keyboard or screen for humans to interact with. Anything can become a computer. Indeed, most household objects, from toothbrushes to light switches to doorbells, already come in a smart version. But as they proliferate, we are going to want to spend less time telling them what to do. They should be able to work out what we need without being told.
                    • This is the shift from number-crunching to decision-making that Dubey sees as defining the new era of computing.
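
                    A minimal sketch of the contrast described above, assuming a made-up task (classifying a sensor reading as "hot" when it exceeds a threshold): the classical version is a rule a programmer writes by hand, while the learned version is a single logistic unit, the smallest stand-in for a neural network, that recovers an equivalent rule from labeled examples by gradient descent.

import numpy as np

rng = np.random.default_rng(0)

# --- Classical programming: a human writes the rule ---------------------
def handwritten_rule(x):
    """Label a reading as 1 ("hot") if it exceeds a threshold the
    programmer chose explicitly."""
    return 1 if x > 0.5 else 0

# --- Machine learning: the rule is learned from examples ----------------
# Training data: readings plus labels produced by the (hidden) rule.
x_train = rng.uniform(0.0, 1.0, size=1000)
y_train = (x_train > 0.5).astype(float)

# One logistic unit: p = sigmoid(w * x + b), trained by gradient descent.
w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * x_train + b)))
    w -= lr * np.mean((p - y_train) * x_train)
    b -= lr * np.mean(p - y_train)

def learned_rule(x):
    """The same decision, but the boundary was found by training."""
    return 1 if 1.0 / (1.0 + np.exp(-(w * x + b))) > 0.5 else 0

for x in (0.1, 0.3, 0.7, 0.9):
    print(x, handwritten_rule(x), learned_rule(x))

                    Nobody typed "greater than 0.5" into the learned version; the boundary emerged from the examples, which is the sense in which training replaces programming.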


                    Some relevant characteristics of this "genioux fact"

                    • Category 2: The Big Picture of the Digital Age
                    • [genioux fact deduced or extracted from MIT Technology Review]
                    • This is a “genioux fact fast solution.”
                    • Tag: Opportunities for those travelling at high speed on GKPath
                    • Type of essential knowledge of this “genioux fact”: Essential Analyzed Knowledge (EAK).
                    • Type of validity of the "genioux fact": Inherited from sources + Supported by the knowledge of one or more experts.


                    References


                    “genioux facts”: The online programme on MASTERING “THE BIG PICTURE OF THE DIGITAL AGE”, g-f(2)600, Fernando Machuca, October 25, 2021, blog.geniouxfacts.com, geniouxfacts.com, Genioux.com Corporation.


                    ABOUT THE AUTHORS


                    PhD awarded with honors in computer science in France

                    Fernando is the director of "genioux facts". He is the entrepreneur, researcher and professor who has a disruptive proposal in The Digital Age to improve the world and reduce poverty + ignorance + violence. A critical piece of the solution puzzle is "genioux facts". The Innovation Value of "genioux facts" is exceptional for individuals, companies and any kind of organization.



