Human brain the model of computing future
Updated: 2016-08-01 07:56
By Bruno Michel (China Daily)
[Illustration by Wang Xiaoying / China Daily]
Ever since the American computer scientist John McCarthy coined the term "Artificial Intelligence" in 1955, the public has imagined a future of sentient computers and robots that think and act like humans. But while such a future may indeed arrive, it remains, for the moment, a distant prospect.
And yet the foreseeable frontier of computing is no less exciting. We have entered what we at IBM call the Cognitive Era. Breakthroughs in computing are enhancing our ability to make sense of large bodies of data, providing guidance in some of the world's most important decisions and potentially revolutionizing entire industries.
The term "cognitive computing" refers to systems that, rather than being explicitly programmed, are built to learn from their experiences. By extracting useful information from unstructured data, these systems accelerate the information age, helping their users with a broad range of tasks, from identifying unique market opportunities to discovering new treatments for diseases to crafting creative solutions for cities, companies and communities.
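A toy sketch can make that distinction concrete. The example below is illustrative only, and is not how Watson or any IBM system works: it contrasts a spam filter whose rule is written by hand with one that infers its keyword list from labeled examples.

```python
# Explicit programming: the rule is written by hand.
def is_spam_rule(msg):
    return "free money" in msg.lower()

# Learning: the rule is inferred from labeled examples.
def learn_keywords(examples):
    # Collect words that appear in spam messages but never in ham.
    spam_words, ham_words = set(), set()
    for msg, label in examples:
        words = set(msg.lower().split())
        (spam_words if label == "spam" else ham_words).update(words)
    return spam_words - ham_words

examples = [
    ("win free money now", "spam"),
    ("meeting at noon", "ham"),
    ("free money offer inside", "spam"),
    ("lunch tomorrow", "ham"),
]
keywords = learn_keywords(examples)

def is_spam_learned(msg):
    return bool(keywords & set(msg.lower().split()))

print(is_spam_learned("claim your free money"))  # True
```

The hand-written rule stays fixed, while the learned one changes as new labeled examples arrive, which is the essential trait the article ascribes to cognitive systems.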
The Cognitive Era marks the next stage in the application of science to understand nature and advance human prosperity. Its beginning dates to early 2011, when the cognitive computing system Watson beat two human champions on Jeopardy!, a game show.
Broadly, cognitive systems offer five core capabilities. First, they create deeper human engagement, using data about an individual to create more fully human interactions. Second, they scale and elevate expertise, learning from experts in various fields and making that know-how available to people. Third, they provide products, such as those connected to the "internet of things", with the ability to sense the world around them and to learn about their users. Fourth, they allow their operators to understand large amounts of data, helping manage workflows, providing context, and allowing for continuous learning, better forecasting and improved operational effectiveness. And finally, perhaps most important, they allow their users to perceive patterns and opportunities that would be impossible to discover through traditional means.
Cognitive systems are inspired by the human brain, an organ that still has much to teach us. Today, computers consume about 10 percent of the world's electricity output, according to Mark Mills, CEO of the Digital Power Group. To benefit fully from the Cognitive Era, we will have to be able to harness huge amounts of information; during the next 15 years, the amount of "digitally accessible" data is expected to grow by a factor of more than 1,000. Performing the calculations necessary for using such a large amount of data will not be possible without huge strides in improving energy efficiency.
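As a back-of-the-envelope check on that projection, a 1,000-fold increase over 15 years implies a compound annual growth rate of roughly 58 percent. The calculation below is illustrative, derived only from the figures in the text:

```python
# Implied annual growth rate for a 1,000-fold increase over 15 years.
growth_factor = 1000
years = 15
annual_rate = growth_factor ** (1 / years) - 1
print(f"{annual_rate:.1%} per year")  # 58.5% per year
```

Sustaining that rate of data growth without a matching explosion in power consumption is precisely why the efficiency gains discussed below matter.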
Matching the performance and efficiency of the human brain will likely require us to mimic some of its structures. One approach is to arrange computer components in a dense 3D matrix, much as neurons are packed in the brain, optimizing not for raw performance but for energy efficiency.
Arranging computer chips in a 3D environment puts the various elements of the computer closer to one another. This reduces the time they take to communicate and improves energy efficiency by a factor of as much as 5,000, potentially providing computers with efficiency close to that of a biological brain.
Today's computers are inefficient not only because of the power drawn by the chips themselves, but also because of the energy needed to run the air conditioning that removes the heat the processors generate. Here, too, the human brain has a lesson to teach. Just as the brain uses blood to deliver energy, in the form of sugar, and to carry away heat from its various regions, a 3D computer could use a coolant fluid both to deliver energy to the chips and to remove their heat.
By adopting some of the characteristics of the human brain, computers have the potential to become far more compact, efficient and powerful. And this, in turn, will allow us to take full advantage of cognitive computing, providing our real brains with new sources of support, stimulus and inspiration.
The author is a scientist at IBM Research, Zurich.
Project Syndicate