
http://www.technologyreview.com/qa/541571/on-the-edge-of-automation/

On the Edge of Automation
Five hundred years from now, says venture capitalist Steve Jurvetson, less than 10 percent of people on the planet will be doing paid work. And next year?

By Nanette Byrnes on September 28, 2015

Business Report: The Future of Work: 2015

Production at Ford Motor Company’s new engine plant in Elabuga, Russia, will be 95 percent automated.

As a founding partner at the venture capital firm Draper Fisher Jurvetson and a board member at SpaceX and Tesla Motors, Steve Jurvetson spends a lot of time thinking about the future, often the distant future. One of Elon Musk’s biggest backers—Jurvetson boasts that he owns the first Tesla production Model S—he was also a founding investor in Hotmail, the precursor to Microsoft Outlook, and sits on the board of Craig Venter’s Synthetic Genomics, the constructor of the first synthetic cell.

His firm claims to have funded companies that have created more than 20,000 jobs in the past five years, and to have brought nearly two dozen companies to $1 billion in value before exiting. Jurvetson spoke to Business Report’s senior editor Nanette Byrnes about why he thinks 90 percent of people will be unemployed in 500 years and how we might transition to that sharply different future.

Are today’s new digital technologies destroying or creating jobs?

I absolutely believe in the near to medium term there is going to be net job creation, as there always has been. Think of all the Uber jobs. The opportunity is not yet fully tapped to, in a sense, distribute [over the Internet] the service economy. The service economy is bigger than the goods economy, so the online equivalent should be even bigger and more powerful than the online marketplace for physical goods.

“Five hundred years from now everyone is going to be involved in some kind of information or entertainment … There will be no farmers, there will be no people working in manufacturing.”

Many of these new jobs, including those at Uber, are taking shape on what you call the “edge of automation.” Do you fear that these jobs might quickly disappear as technology keeps evolving?

Steve Jurvetson

Everything about Uber has been automated except for the driver. The billing, the fetching—every part of it is a modern, information-centric company. Interestingly, what that means is as soon as automated vehicles arrive, that driver is easily removed. You don’t have to restructure any part of that business.

What you’re farming out to humans today are those things that computers just barely can’t do. We know from Moore’s Law and improvements in computing that in two or three years [much of this] work will be automated. ** Statements like this tell me he doesn’t understand that the obstacles to autonomy are not compute power. **

http://www.independent.co.uk/news/science/intelligent-peoples-brains-wired-differently-to-those-with-fewer-intellectual-abilities-says-study-a6670441.html

Intelligent people's brains wired differently to those with fewer intellectual abilities, says study
Scientists claim to have found a correlation between how well wired-up some individuals’ brains were and their cognitive abilities and general success in life

The brains of high-achieving individuals are wired up differently to those of people with fewer intellectual or social abilities, according to one of the first studies to find a physical link between what goes on in the brain and a person’s overall lifestyle.

An analysis of the “connectivity” between different parts of the brain in hundreds of healthy people found a correlation between how well wired-up some individuals were and their cognitive abilities and general success in life, scientists said.

The researchers found that “positive” abilities, such as good vocabulary, memory, life satisfaction, income and years of education, were linked significantly with a greater connectivity between regions of the brain associated with higher cognition. 

This was in contrast to the significantly lower brain connectivity of people who scored high in “negative” traits such as drug abuse, anger, rule-breaking and poor sleep quality, the scientists said.

White matter fiber architecture from the Connectome Scanner dataset. 

“We’ve tried to see how we can relate what we see in the brain to the behavioural skills we can measure in different people. In doing this, we hope to be able to understand what goes on ‘under the bonnet’ of the brain,” said Professor Stephen Smith of Oxford University, who led the study published in Nature Neuroscience.

The scientists were part of the $30m (£20m) Human Connectome Project funded by the US National Institutes of Health to study the neural pathways of the brain. Connectomes have been likened to taking real-time images of the living circuit diagrams governing the communication of signals from one part of the brain to another.

They compared the “connectomes” of 461 healthy people, taken with real-time brain scans using functional magnetic resonance imaging (fMRI), and attempted to see if there were any significant correlations with 280 different behavioural or demographic measures, such as language vocabulary, education and even income.
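The excerpt doesn’t spell out the statistical machinery, but relating hundreds of connectivity features to hundreds of behavioural measures across subjects is the kind of question canonical correlation analysis is built for. The sketch below is a minimal illustration of that idea in Python on synthetic data; the feature counts, variable names, and the use of scikit-learn’s CCA are assumptions for illustration, not the study’s actual pipeline.

```python
# Illustrative sketch only: relating brain-connectivity features to
# behavioural measures with canonical correlation analysis (CCA).
# Data shapes and names are assumptions, not the study's actual pipeline.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)

n_subjects = 461        # subjects, as in the article
n_conn_features = 200   # assumed: flattened region-to-region connectivity values
n_behaviour = 280       # behavioural/demographic measures, as in the article

# Synthetic stand-ins for the real data.
connectivity = rng.normal(size=(n_subjects, n_conn_features))
behaviour = rng.normal(size=(n_subjects, n_behaviour))

# Find the pair of weighted combinations (one over connectivity, one over
# behaviour) whose subject-by-subject scores are maximally correlated.
cca = CCA(n_components=1)
conn_scores, behav_scores = cca.fit_transform(connectivity, behaviour)

r = np.corrcoef(conn_scores[:, 0], behav_scores[:, 0])[0, 1]
print(f"canonical correlation on synthetic data: {r:.2f}")
```

On real data the connectivity matrix would come from the fMRI connectomes and the behavioural matrix from the 280 measures mentioned above; the learned weights would then indicate which measures load most strongly on the shared brain–behaviour mode.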

http://www.technologyreview.com/news/541736/facebooks-oculus-says-gaming-is-just-one-thing-youll-do-in-virtual-reality/

Facebook’s Oculus Says Gaming Is Just One Thing You’ll Do in Virtual Reality
The chief technology officer of Facebook’s virtual reality division predicts that its devices will be a popular way to watch videos.

By Rachel Metz on September 24, 2015

Why It Matters

If virtual reality is to make a large impact on the world it will need to find a wide variety of uses beyond video games.

Video games are an obvious use case for virtual reality headsets like the one Facebook’s Oculus division will launch early next year. But the chief technology officer of Oculus predicted Thursday that gaming will eventually account for less than half of the time people spend using the technology.

John Carmack, well-known in the games industry for helping create the games Doom and Quake, described gaming as “the sharp end of the spear right now,” saying it will establish virtual reality as a technology consumers get excited about. But while many people love playing – and spending money on – games, much more time is spent watching videos and looking at photos.

Carmack predicted that those will also be popular in virtual reality. He said gaming will eventually make up a minority of the time that people spend using virtual reality headsets, although he didn’t specify by when he expected that to happen. “Not everything has to be interactive,” he said. Carmack spoke at an Oculus event in Los Angeles for software developers.

Oculus plans to release its first consumer headset, the Rift, during the first quarter of next year, though the company hasn’t yet said exactly when it will come out or how much it will cost. The device will require a connection to a high-powered PC, and Oculus said Wednesday that computer makers including Acer and Dell will be rolling out PCs marked as “Oculus Ready” next year for under $1,000.

A more affordable way to enter Oculus’ virtual reality will become available in November. Samsung announced today that it will release a new version of Gear VR, a headset the company worked with Oculus to develop that uses a Samsung smartphone for its display and computing power. The new device will cost $99, half the price of the existing Gear VR device.

Oculus is working with entertainment companies to try to ensure there is plenty of content for people to experience on both the Rift and Gear VR. As well as games such as Microsoft’s Minecraft, movies and TV shows will be available. On Wednesday, Oculus announced that existing Netflix content can be viewed using the Gear VR. A Netflix app for the device places you on a red couch inside a virtual ski chalet, in front of a giant screen on which ordinary 2-D video appears. Video content from Hulu and Vimeo will become available on Gear VR in the next few months.

http://www.technologyreview.com/news/541676/are-you-ready-for-a-robot-colleague/

Are You Ready for a Robot Colleague?
Robots are moving into new areas of work, and it isn’t entirely clear how staff and customers will react.

By Will Knight on September 28, 2015

Why It Matters

Employees and customers won’t always react positively to robots.

This robot, developed by Clearpath Robotics, can carry heavy loads around a factory or warehouse.

Those who work in jobs ranging from warehouse staffer to hotel concierge may soon count a robot among their colleagues.

While Amazon has pioneered the use of robots in its fulfillment centers, its robots are still largely separated from human workers (see “Inside Amazon”). The next generation of workplace bots will work in much closer proximity to regular employees. Some will replace workers entirely, but most will simply take on the more mundane tasks of a human’s job.

Clearpath Robotics, a company based in Ontario, Canada, launched a robotic platform designed to take on the work done by forklift truck drivers in warehouses and factories. The company’s system, called Otto, was on show at the RoboBusiness conference in San Jose, California, last week. Several companies are testing the system, including GE, which has also invested in Clearpath.

Otto first must be driven through a warehouse, so that its laser sensors can map the environment. The user can then load the system with heavy items and command it to go from one part of the warehouse to another. And Otto will navigate around obstacles, including people, as it goes.
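Clearpath hasn’t published Otto’s navigation stack, so the following is only a toy sketch of the map-then-route idea described above: an occupancy grid standing in for the mapped warehouse and a breadth-first search that routes around blocked cells. The grid, the start and goal positions, and all names are invented for illustration.

```python
# Toy illustration of "map first, then route around obstacles":
# a breadth-first search over a small occupancy grid. This is a
# made-up sketch, not Clearpath's actual navigation software.
from collections import deque

# 0 = free floor, 1 = shelving/obstacle (a hypothetical mapped warehouse).
grid = [
    [0, 0, 0, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def plan_path(grid, start, goal):
    """Return a list of grid cells from start to goal, or None if blocked."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # no route around the obstacles

print(plan_path(grid, start=(0, 0), goal=(4, 4)))
```

A production system would plan over a laser-built map with continuous poses and replan as people and pallets move, but the basic pattern is the same: map once, then search that map for a collision-free route on demand.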

Another new robot on display at the show, from a company called Fetch Robotics, based in San Jose, will follow warehouse workers around with a bin, helping them retrieve items from shelves. Once a bin has been filled, the robot will roll away, and another will take its place.

Melonee Wise, founder of Fetch, says careful thought needs to be put into the way other workers will react to a robot (see “Innovators Under 35: Melonee Wise”). “So it’s in our best interest to make that interaction as smooth and enjoyable as possible,” she says. “If they don’t want to work with our product, then no matter how good it is, it isn’t going to make it very far.”

Another version of Fetch’s robot has a torso and a single arm at its center. That robot is meant to take the place of a human worker, retrieving products in concert with a bin-carrying bot.

Human-robot interaction, which has been a theme in academic research for some time, is now becoming more commercially relevant. Wise says, for example, that research has shown that a certain shape of head can be intimidating, and that a voice interface can encourage a particular type of interaction. “When robots talk, it conveys a certain level of intelligence, and people start thinking the robot is smarter than them, so they’re less likely to help the robot,” she says. “When the robot has nonverbal cues, people are much more willing to help out.”

http://news.yahoo.com/thought-controlled-computer-cursor-takes-leap-forward-182159259.html

Thought-controlled computer cursor takes a leap forward


A scientist presents a Brain Computer Interface system on June 8, 2006 (AFP Photo/Stephane de Sakutin)

Paris (AFP) - Scientists working to perfect a thought-controlled computer cursor said Monday they have achieved their best results yet, and are moving closer to creating a version that paralysis victims can use.

The device works twice as fast as in previous trials after American developers found better ways to read brain activity and fine-tuned its hardware and software, according to results published in the journal Nature Medicine.

Improving the speed and accuracy of mind-controlled prostheses is crucial to taking them from mere lab experiments to helping quadriplegics regain some level of independence.

"The current work is a step towards one of our ultimate goals, which is to provide point-and-click control of any computer system whenever the user desires," study co-author Jaimie Henderson, of Stanford University, told AFP by email.

The results were from the BrainGate clinical trial, which made history in 2011 when a woman named Cathy Hutchinson used only her thoughts to operate a robotic arm to bring a flask of coffee to her lips for a sip.

In this latest study, researchers observed how well and how fast two paralysed people -- a man and a woman in their 50s, both with amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease -- could use their minds to guide an arrow-shaped cursor onto a target on a computer screen.
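The article doesn’t describe the decoding algorithm, and the real BrainGate decoders are considerably more sophisticated, but the core of a point-and-click interface like this is a learned mapping from recorded neural activity to intended cursor velocity. The sketch below illustrates that mapping with ordinary least squares on synthetic data; the channel count, number of samples, and all variable names are assumptions for illustration.

```python
# Illustrative sketch of the core idea behind a cursor decoder:
# learn a linear map from recorded firing rates to intended cursor
# velocity. Synthetic data only; the actual BrainGate decoders are
# far more sophisticated and clinically validated.
import numpy as np

rng = np.random.default_rng(1)

n_samples = 2000    # assumed: short time bins of neural data
n_channels = 96     # assumed: electrodes on a recording array

# Pretend ground truth: each channel is weakly tuned to 2-D velocity.
true_weights = rng.normal(size=(n_channels, 2))
velocity = rng.normal(size=(n_samples, 2))        # intended (vx, vy)
noise = rng.normal(scale=2.0, size=(n_samples, n_channels))
firing_rates = velocity @ true_weights.T + noise

# Fit the decoder: least-squares map from firing rates back to velocity.
decoder, *_ = np.linalg.lstsq(firing_rates, velocity, rcond=None)

# Decode a new bin of activity into a cursor velocity command.
new_rates = rng.normal(size=(1, n_channels))
vx, vy = (new_rates @ decoder)[0]
print(f"decoded cursor velocity: ({vx:.2f}, {vy:.2f})")
```

Speed and accuracy gains of the kind reported come from better feature extraction, better decoder models, and tighter calibration, not from changing this basic decode-to-velocity loop.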

http://www.technologyreview.com/news/541691/ibm-wants-watson-to-teach-robots-some-social-skills/

IBM Wants Watson to Teach Robots Some Social Skills
Because language is only part of human communication, IBM is using machine learning to teach robots social skills like gestures, eye movements, and voice intonations.

By Will Knight on September 24, 2015

Why It Matters

Interacting with a robot remains an awkward, time-consuming process.

IBM is using some of the artificial-intelligence techniques that emerged from its Watson project to teach robots to better understand and mimic human communication.

During a keynote speech at a conference called RoboBusiness held in San Jose, California, this week, Robert High, chief technology officer of Watson at IBM, demonstrated some of the techniques his team is working on using a small humanoid robot.

The robot, a Nao model from the company Aldebaran, spoke with realistic intonation and made appropriate hand gestures during a conversation with High. It even exhibited a little impatience and sarcasm, miming looking at its watch, for example, when asking High to hurry up with his talk.

Speaking with MIT Technology Review after the demo, High admitted that this interaction was prerecorded, because the system doesn’t always work well in noisy environments. But he said the capabilities demonstrated reflected real research. His team is using machine-learning algorithms to learn from video footage to associate appropriate gestures and intonations with different phrases. High says it is important to do this because language alone is only part of human communication.

“We augment the words with physical gestures to clarify this and that,” High said. “You can bring into the [robot] interface this gesturing, this body language, the eye…
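IBM hasn’t published the details of how gestures and intonations are learned from video, so the sketch below only illustrates the general pattern High describes: a model that associates phrases with an appropriate gesture label. The phrases, gesture names, and the choice of a simple bag-of-words classifier are invented for illustration and are not IBM’s method.

```python
# Toy illustration of associating phrases with gestures: a bag-of-words
# classifier mapping short utterances to a gesture label. The phrases,
# labels, and model choice are invented; IBM's actual approach (learning
# from video footage) is not public in this level of detail.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

phrases = [
    "please hurry up",
    "we are running out of time",
    "look over there",
    "it is on your left",
    "hello, nice to meet you",
    "good to see you again",
]
gestures = [
    "tap_watch", "tap_watch",
    "point", "point",
    "wave", "wave",
]

# Vectorize the words, then fit a simple classifier over gesture labels.
model = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
model.fit(phrases, gestures)

print(model.predict(["could you please hurry"]))  # likely "tap_watch"
```

A system like the one described would work from video rather than hand-labeled text, and would also have to choose intonation and timing, but the underlying task is the same association between what is said and how it should be physically expressed.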