Convergence - Issue 15


Page 1: Convergence - Issue 15

The Magazine of Engineering and the Sciences at UC Santa Barbara

Light Fantastic
Revolutionary photonic technology

Oil Shock
Science in a catastrophe

Freedom
Diabetes made easier

Life. Preserved.
Plant, animal and fossil collections

Ahead in the Cloud
A computing revolution

FIFTEEN, SPRING 2011

Page 2: Convergence - Issue 15

A Note From the Top

In the last issue of Convergence we featured the work of UC Santa Barbara scientists who study the natural oil and gas seeps off the coast of Santa Barbara ("Goo and Gas," issue 14). These seeps—among the most prolific on the planet—are a convenient natural laboratory for researchers to study the fate and effects of oil and gases like methane—a potent greenhouse gas—that are released into the ocean.

The importance of that work was brought into tragic focus this past summer, when an explosion on the Deepwater Horizon drilling rig led to the largest accidental oil spill in history. For 86 days, oil and gas gushed into the Gulf of Mexico.

As a massive cleanup effort was mounted, and controversy swelled, UCSB scientists stepped up to contribute their considerable expertise. Their tireless work helped the federal agencies leading the response understand how much oil and gas was spewing into the gulf and how it would spread and dissipate. For the researchers, it was also an opportunity to add to our understanding of, and ability to respond to, catastrophic spills—which are certain to occur again.

In this issue, we hear from three UCSB researchers who were involved in the Deepwater Horizon response. They talk about their experiences at the scene of the spill and about the frustrations, complications and rewards of research in a catastrophe.

Disasters like the Deepwater Horizon spill serve as reminders of the tremendous value of the work of scientists and engineers at UCSB.

Elsewhere in this issue we highlight UCSB’s leading role in two exciting fields that promise revolutionary real-world applications: photonics and cloud computing. We feature research on an “artificial pancreas system” that could make life much easier and healthier for people with diabetes, and explore the university’s extensive collections of plants, animals and fossils, which are yielding valuable insights into important issues like climate change.

We are very grateful for your continued support of engineering and the sciences at UCSB. With your help, the work done here can truly make a difference.

David Awschalom, Scientific Director, California NanoSystems Institute

Larry Coldren, Acting Dean, College of Engineering

Pierre Wiltzius, Dean of Science, College of Letters & Science


Page 3: Convergence - Issue 15

CONVERGENCE The Magazine of Engineering and the Sciences at UC Santa Barbara

CONTENTS
FIFTEEN, SPRING 2011

Cover Story: Light Fantastic
Researchers are looking to photonic technology to accelerate the Internet and revolutionize computing and communications

Q&A: Oil Shock
Science collided with catastrophe and controversy for researchers who worked on the Deepwater Horizon oil spill

Freedom
An artificial pancreas system could mean less hassle and fewer health complications for people with diabetes

Life. Preserved.
The university's collections of plants, animals and fossils are an invaluable resource for researchers working in many different areas

Ahead in the Clouds
Research at UCSB has been crucial to the development of the cloud

Page 4: Convergence - Issue 15
Page 5: Convergence - Issue 15


"It was industrial chaos on a grand scale. We had respirators on for most of the sampling we did close in. That's how unpleasant it was." David Valentine, UC Santa Barbara

Oil Shock

Page 6: Convergence - Issue 15


Question and Answer

In the aftermath of the largest offshore oil spill in U.S. history, the Deepwater Horizon accident in April, as oil and gas bubbled out of a ruptured well and into the Gulf of Mexico, UC Santa Barbara researchers mobilized to contribute their expertise to the response effort and to add to our understanding of the complex science of spills.

The prodigious hydrocarbon seeps off the coast of Santa Barbara, which deposit globs of tar on beaches below the UCSB campus, are a kind of natural lab for scientists studying the fate and effects of oil and gas in the ocean. These predictable spills offer tremendous opportunities for research that's invaluable in the event of a disaster like the Deepwater Horizon accident.

Convergence spoke to three UCSB researchers who were involved in the spill response about their experiences on the scene, the lessons learned and the sleep lost.

David Valentine, a professor of earth science who has studied how microbes break down hydrocarbons released from natural seeps, investigated the fate of the gas and oil that bubbled out of the fractured well and offered insights into the magnitude of the spill and the resulting hydrocarbon plume. He led three research cruises in the gulf in the aftermath of the accident—never far from the epicenter of the spill—and was due to return late in 2010.

Mechanical Engineering Professor Igor Mezic specializes in fluid dynamics and is head of the Buildings & Design Solutions Group in UCSB's Institute for Energy Efficiency. When oil began chugging into the Gulf of Mexico, Mezic decided to turn his attention to predicting how it would spread. He figured out a new approach to the problem and was able to successfully forecast where and when spilled oil would wash ashore.

Ira Leifer, a researcher with the Marine Science Institute and the Department of Chemical Engineering, studies hydrocarbon seeps and is currently investigating remote sensing technologies for detecting methane—a potent greenhouse gas—released from natural seeps. After the Deepwater Horizon accident, Leifer was appointed to a government panel tasked with investigating the oil spill flow rate and became one of the media's go-to guys.


Page 7: Convergence - Issue 15


How did you first get involved?

Leifer: When I saw the news that there was an oil well platform on fire in the gulf I contacted my colleagues to see what we could do in terms of remote sensing. That was my first involvement.

Mezic: I also heard about it from the news. Then I started to do some analysis on the oil movement and a company that was working on the cleanup contacted me because they’d heard what I was doing. They flew me down there because they wanted to use those predictions. That was late June.

Valentine: I’d been in contact with a reporter from the Los Angeles Times quite a bit in the weeks preceding the Deepwater Horizon accident because we’d done some work on asphalt volcanoes (on the floor of the Santa Barbara Channel). As soon as this hit we started talking about it. Since I’d just been talking about emanations from the bottom of the sea, I started to get calls about the oil spill. I went down there early in June.

What did you find when you first arrived in the area? How did it compare to your expectations?

Mezic: It was worse, absolutely. I thought it was completely chaotic.

Valentine: It was industrial chaos on a grand scale. I’d talked to several people who’d been out there so I had a pretty good idea of what to expect, but still, out on scene it was pretty bad. We wore respirators for most of the sampling we did close in (to the epicenter). That’s how unpleasant it was. I was never more than 20 miles from the epicenter and the closest I got was 1,500 feet away.

More than the thick scum of oil on the surface, it was the sheer amount of activity. When I got there, there were three rigs and then probably 50 to 60 large vessels milling about. Some of them were running ROVs (remotely operated vehicles, used in underwater exploration), many of them had booms and were collecting oil and pulling it out and burning it, and there were others that did nothing but deliver water because nobody could make water out there. There were a number of research boats, there were crew boats bringing people in and out, just boats everywhere, so at night everything was lit up.

Leifer: It must have been quasi-apocalyptic.

Valentine: It was when they started doing some serious burning (some of the spilled oil was burned as part of the cleanup). We were there on some of the really heavy burn days. Those were the days when there were six, seven, eight burns going on at the same time. The flames were 30 or 40 feet high, just streaming up, and the entire sky got covered in smoke. Then it would rain and all that stuff came back down and your boat got covered in soot and ash.

There was a cloud that sat perpetually over the entire site because of that smoke. We’d have a clear day and there’d just be this one cloud. The funny thing was that it was a bright white cloud, not a dark cloud, so the metaphor failed on some level.

Leifer: I never got to the spill, per se, because it wasn’t necessary. I did go down there but I was at Ellington Airport (in Houston, Texas), helping with the flight planning for the remote sensing work.

What comes to mind when you reflect on the spill?

Leifer: I think I had just about every emotion you can have. I’ll throw in nausea as well, and not from the fumes.

Mezic: I got really angry about some of the things that were being said. After I started doing my analysis I found it increasingly upsetting that people were saying on the news that they’d been told that oil wasn’t going to hit specific places, whereas our analysis was showing it was going to go exactly there.

Speaking of frustration, was it difficult to get the information and resources you needed? How did you deal with that?

Leifer: Clearly the spill response could have been far better implemented. If it had, my life would have been easier, and you can also say the spill response would have been more effective.

Valentine: We knew BP (which leased the Deepwater Horizon) wasn’t going to provide critical information, so our approach was to go and figure it out ourselves.

We tried to measure absolutely everything we could, take samples for everything, so we could pull the story together with information that was meaningful. We knew that even the samples taken on the government side of the response wouldn’t be available for a long, long time.

Leifer: BP was providing convenient information that may or may not have been accurate.

Valentine: I did have to push to get funding to do cruises and get people out there to do all the measurements that needed to be made and all the science that I thought was important. There were a lot of phone calls to D.C. to say, “Hey, we really need this, what about a boat? Can we get a boat?” There was pushing to get a ship out there under our direction instead of somebody else’s so that we could go out there and make these measurements.


by Anna Davison

Page 8: Convergence - Issue 15


I did get funding from the National Science Foundation and the Department of Energy. Once we published our first Science paper, the National Oceanic and Atmospheric Administration recognized the importance of some of the things we were interested in so then they began to call us. That’s where things stand now.

The media also started calling, didn’t they?

Valentine: The media wanted a lot and I know Ira was in the same boat. They wanted to know what was going on and get some sort of informed opinion.

Leifer: Talking to the media takes up a lot more of your time than you’d imagine. Talking to four or five reporters can end up taking two or three hours. You want to communicate to the media for a whole bunch of different reasons and yet that interferes with what you actually need to do to help the spill.

Valentine: At the same time I found the media to be incredibly useful. If you develop a rapport with reporters, they provide you with information, you provide them with information.

There were a lot of press conferences and so on and there was information being relayed to them that didn’t always get passed on to us. If I have reporters from the Los Angeles Times on speed dial, I can talk to them at 9 or 10 at night and ask them what happened and they’ll say, “Oh they told us this, this and this.” I learned a lot that way.

Was it worth it?

Leifer: From the point of view of helping with the response and the science of it—collecting data—I would have been happy to be even more involved, but it overlapped enormously with politics and I’m not a politician of that kind and that’s the downside. The downside is really powerful people who may really dislike you.

Ask me this again after the Justice Department is done with its review. I don’t worry about hate letters, but I do worry about hateful lawyers. They have teams of people who will pore over everything you’ve done, one side to support you, the other to try to prove your incompetence.

Valentine: I didn’t have the same level of frustration as Ira did. I wasn’t involved in the political side nearly as much. There have been a lot of benefits.

There's been research money—four grants so far, and there'll be more—and we had grad students and postdocs involved in some of the expeditions and they got some really valuable science out of this.

The higher profile cuts both ways. I’ve had calls from people from high school who I hadn’t seen for years. I used to explain what I did to people and I’d just get a glazed-over look. All of a sudden that changed. There were some frustrations, but it was satisfying to actually be contributing to a national priority, as grotesque as it was.

Mezic: I don’t know if there was really a downside for me, although it did take us away from our daily jobs.

Leifer: It did take over my life, just like (former BP CEO) Tony Hayward’s, although I didn’t have a yacht to get back to, just a pile of work that I was supposed to have been doing. I’ve been trying to catch up on everything that was promised before there was an oil spill, work that didn’t happen in those six months. You can tell people, “Oh, I’ve been working on the oil spill,” but that excuse gets old after a while.


“We have to be prepared in case something like this happens again. Get familiar with Brazil and the North Sea, because who knows where it’s going to happen next? We need to be ready to respond.”

David Valentine

Page 9: Convergence - Issue 15


Would you do it all over again?

Mezic: Definitely. It was one of the most exciting things I've ever done. You don't often get a chance to test something you've been thinking about for a long time. It was an opportunity for me to test something that was brand new and that's what science is about.

Leifer: After four months of 18-hour days, 7 days a week, personally I'll be happy if there's never another mega-blowout.

Valentine: Absolutely, but at the same time, I think we have to be prepared in case something like this happens again. Get familiar with Brazil and the North Sea, because who knows where it's going to happen next? We need to be ready to respond.

Leifer: Fortunately it's infrequent, which also means that what happens is relatively unknown. It's relatively poorly understood on the science side. It's one of the most complex sciences there is, and it's pitifully funded.

Aside from planning the science that needs to be done, there also needs to be a plan for the scientists, because burnout is real. You need to have multiple teams, replacement people. It's an emergency, you get asked to do it, and you do it, but scientists are humans, not super-machines, not robots.

Links:
David Valentine: geol.ucsb.edu/faculty/valentine
Ira Leifer: coastalresearchcenter.ucsb.edu/cmi
Igor Mezic: engr.ucsb.edu/~mgroup/joomla
UCSB hydrocarbon seeps group: seeps.geol.ucsb.edu

Related Spill Research

The oil skimming technology developed by Victoria Broje when she was a doctoral student at UCSB's Bren School of Environmental Science & Management was put to work in the gulf. Broje came up with a new design for skimmers that use rotating drums to collect floating oil. By adding grooves to the drum surfaces, Broje, who now works for Shell Projects and Technology, made them much more efficient at collecting oil.

A new book by environmental studies scholar William Freudenburg examines the spill and the decisions and policies that led up to it. In "Blowout in the Gulf—The BP Oil Spill Disaster and the Future of Energy in America," co-written with Robert Gramling of the University of Louisiana at Lafayette, Freudenburg argues that the blowout was an accident waiting to happen, the product of "an atrophy of vigilance." As well as highlighting the risks taken by the companies involved in offshore oil drilling, Freudenburg takes aim at the federal government, which he says has done a poor job of regulating the oil industry and managing the country's energy resources.

Link: es.ucsb.edu/freudenburg

Page 10: Convergence - Issue 15

"These coherent PICs will provide a huge increase in the amount of information that can be transmitted from or received by a single chip as well as a tremendous reduction in the size, weight and power required by the chips." Larry Coldren

LIGHT FANTASTIC

by Anna Davison

The telecommunications networks that now stretch around the world transmit information optically, carrying Internet, television, and telephone signals in the form of pulses of light. It's more efficient to move data that way than as electronic signals, and photonics promises to revolutionize not only telecommunications, but entertainment, medicine, computing and a multitude of other applications.

Although most of the long-distance networks that span the world are optical networks, photonic technology isn't yet so ubiquitous in smaller scale connections. Data that's traveled thousands of miles over fiber optic cables must still be converted to electrical signals before it can be detected and processed by switches, routers and other devices that handle data on its way to an end user. That's a slow, energy-intensive process, and one that researchers are targeting in their efforts to replace sluggish electrical connections with blazing fast photonic links.

Researchers also are looking to use optical connections within computers and, ultimately, on an even smaller scale: on all-optical chips. UC Santa Barbara researchers are building on the university's strengths in materials science, fabrication and engineering to develop photonic components and devices for a new generation of high-performance, energy-efficient communications and information processing systems.

Two research efforts recently launched at UCSB aim to advance photonic technology for a multitude of potential applications, from remote sensing to communications to terabit Ethernet technologies for high-performance Internet and networking.

A revolution on a chip

A new UCSB-led consortium, Photonic Integration for Coherent Optics (PICO), will develop photonic chips for communications and sensing applications. Acting Dean of Engineering Larry Coldren, who's heading up the PICO group, says these chips could handle massive amounts of data—making it possible to download dozens of feature films in a second—and be the basis of detection systems sensitive enough to read the date on a dime from a mile away.

Following a national competition, the multi-university-industry consortium was one of four chosen for funding—a little more than $2 million a year—from the U.S. Defense Advanced Research Projects Agency (DARPA). PICO will receive about the same amount from industry sources. The consortium includes researchers from the Massachusetts Institute of Technology, the California Institute of Technology, the University of Virginia, Lehigh University, and 17 industry partners including Hewlett-Packard, Intel and Rockwell-Collins. The other UCSB participants are professors John Bowers and Mark Rodwell and research engineer Leif Johansson.

Research at PICO will draw on UCSB's world-leading expertise in designing and fabricating photonic integrated circuits (PICs). These devices, which pack a great many components onto a single tiny chip, are intended to be the basis of powerful new optical communications and computing systems. PICO researchers aim to develop a new generation of PICs that operate on both the amplitude and phase of lightwaves. "These coherent PICs will provide a huge increase in the amount of information that can be transmitted from or received by a single chip, as well as a tremendous reduction in the size, weight and power required by the chips," Coldren says.

Page 11: Convergence - Issue 15


Photonic chips could be the basis of detection systems sensitive enough to read the date on a dime from a mile away.

Page 12: Convergence - Issue 15


The PICs to be developed at PICO will draw on both silicon technology, which would enable them to be fabricated using the CMOS processes that currently dominate chip manufacturing, and on monolithic indium phosphide technology, Coldren (pictured below) says.

Among the potential applications that PICO researchers have in mind for the chips are LIDAR detection systems that use light generated by lasers to map terrestrial objects from the air, and communications systems using either conventional fibers or narrowly focused beams of light to deliver information to a specific target such as a pilot, minimizing the possibility that the signals will reach some unintended recipient.

Future Internet: 1000x faster

Imagine if all the data traversing the world right now—the television broadcasts traveling over long-distance networks to living rooms around the country, the databases and files being exchanged over office networks and all the information moving within computers and other hardware—could be sent through a single fiber the width of a human hair. That's the vision driving a new research collaboration established at UCSB, with support from industry partners like Google, Intel and Verizon.

"We're going to need much faster networking to handle the explosion in Internet traffic and support new large-scale applications like cloud computing (see feature story on page 20)," says Daniel Blumenthal (pictured right), professor of Electrical and Computer Engineering at UCSB. He directs the Terabit Optical Ethernet Center (TOEC), which is part of UCSB's Institute for Energy Efficiency. Researchers with TOEC aim to develop the technology necessary for a new generation of Ethernet—the de facto standard for data transmission both on a small scale and across global networks. It will be a thousand times faster and much more energy-efficient than today's most advanced networks. Blumenthal and colleagues at TOEC are aiming for 1 terabit Ethernet over optical fiber—1 trillion bits per second—by 2015, with the ultimate goal of enabling 100 terabit Ethernet by 2020.

It won't be easy. Ethernet is constantly evolving, but soon—perhaps in as little as five years, according to some estimates—it won't be able to keep up with the surge in Internet traffic as private and public enterprises move increasingly massive quantities of data, and consumers stream video, share high definition photos and explore and interact within sophisticated online environments. Millions of people will soon be consuming billions of bits per second in their living rooms—simultaneously.

Current Ethernet technologies can't be pushed much past 100 gigabits per second—the speed that's beginning to be implemented now—mainly because of the amount of power needed to run and cool the required systems, Blumenthal says. New generations of Ethernet need to be much more energy-efficient and cost-effective in order to overcome the power problem.
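For a rough sense of scale, the short calculation below compares today's 100 gigabit per second links with the 1 terabit per second target. The five-gigabyte film size is an illustrative assumption, not a figure from the article.

```python
# Back-of-the-envelope comparison of 100 Gb/s Ethernet with the 1 Tb/s target.
# The 5 GB film size is an illustrative assumption, not a number from the article.

def films_per_second(link_bits_per_sec: float, film_gigabytes: float = 5.0) -> float:
    """How many films of the given size fit through the link each second."""
    film_bits = film_gigabytes * 8e9      # gigabytes -> bits
    return link_bits_per_sec / film_bits

for label, rate in [("100 Gigabit Ethernet", 100e9), ("1 Terabit Ethernet", 1e12)]:
    print(f"{label}: about {films_per_second(rate):.1f} five-gigabyte films per second")
# 100 Gb/s moves roughly 2.5 such films per second; 1 Tb/s moves about 25,
# which is in line with "dozens of feature films in a second."
```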


Page 13: Convergence - Issue 15

UCSB's photonics researchers have claimed many firsts, including the world's first eight-channel monolithic tunable optical router (MOTOR), which operates error-free at 40 Gbits/s per port on a single InP/InGaAs chip. It was developed in the labs of professors Coldren and Blumenthal.


"Our goal," Blumenthal says, "is to make energy-saving technologies that will allow applications and the underlying networks to continue to scale as needed. You could think of it as greening future networks, and the systems that rely on those networks."

That, Blumenthal says, will require fundamental improvements in underlying technologies—advances that will be underpinned by photonic technology developed at UCSB. Research at TOEC will draw on UCSB's world-leading expertise in materials, advanced electronics, photonic integrated circuit technology, silicon photonics and high-speed integrated optical and electronic circuits, and in bridging these new technologies with real networking systems. It will include ongoing work on a DARPA-funded project aimed at eliminating the requirement for optical signals to be converted to electrical signals before they're redirected or otherwise processed.

As part of that project, Blumenthal and colleagues at UCSB have developed an optical router—the monolithic tunable optical router (MOTOR)—on a single chip. It's an important step toward replacing bulky, energy-hungry router hardware with smaller, more efficient all-optical routers. MOTOR is one of the largest and most complex PICs ever created and it's helped secure UCSB's standing as a world leader in the field, says Bowers, who is also part of TOEC and focuses on developing integrated circuit technology on silicon substrates—an approach that would enable chips to be fabricated using CMOS processes.

All told, Blumenthal says, TOEC's vision will require "dramatic breakthroughs across multiple disciplines, not only in the core Ethernet technologies but in Ethernet-based networking and in the engineering and measurement systems used to develop and test these new technologies."

It's a big ask, but the payoff could be huge. Not only will terabit Ethernet soon be needed to satisfy the demands created by the way we use networks now, but Internet pioneer Dave Farber, a professor at Carnegie Mellon University, says high-performance, high-speed Ethernet will open up opportunities we couldn't dream of today: "You build it and they will come," he says.

Links:
Terabit Optical Ethernet Center: iee.ucsb.edu/toec
Dan Blumenthal's research group: ocpn.ece.ucsb.edu/index.php/professor-blumenthal
Larry Coldren's research group: ece.ucsb.edu/Faculty/Coldren/home.htm
UCSB's Institute for Energy Efficiency: iee.ucsb.edu

"We're going to need much faster networking to handle the explosion in Internet traffic and support new large-scale applications like cloud computing." Dan Blumenthal


Page 14: Convergence - Issue 15

An Artificial Pancreas System now undergoing clinical trials in Santa Barbara could relieve people with diabetes of the burden of constantly monitoring their blood sugar and dosing themselves with insulin.

FREEDOM


Page 15: Convergence - Issue 15

Sharon Sorensen's blood glucose level is higher than it should be. After decades of living with diabetes, she's keenly aware of the potential fluctuations in her blood sugar and adept at limiting them with carefully timed doses of insulin. Today, though, Sorensen is taking part in a clinical trial, and she's turning over control of her blood sugar levels to an "Artificial Pancreas System" developed by researchers at UC Santa Barbara in collaboration with Sansum Diabetes Research Institute (SDRI).

Sorensen is wearing a continuous blood glucose monitor that takes readings every five minutes from a sensor on her stomach. A pump taped to her arm is poised to deliver doses of insulin when necessary. These devices have freed many people with diabetes from the unpleasantness of insulin injections and given them a better way of keeping tabs on their blood sugar.

Sorensen has used glucose monitors and insulin pumps for years, but she's had to be the go-between: using the readings from the monitor to figure out the appropriate dose of insulin and then activating the pump to deliver it. Researchers at UCSB and at SDRI, also in Santa Barbara, want to make Sorensen's life—and the lives of the millions of other people with type 1 diabetes—easier and healthier by automating the process.

"We have these two parts and what's missing is the brain to get the sensor and pump talking to each other," says Howard Zisser, director of Clinical Research and Diabetes Technology at SDRI. "That's where the engineers come in"—specifically, a UCSB team led by Chemical Engineering Professor Frank Doyle, who holds the Mellichamp Chair in Process Control and brings a biological control perspective to the problem.

A decade ago, Doyle had the idea of taking a model-based control approach that's been used for decades in the refining industry, and more recently by automakers in anti-lock braking systems, and adapting it for medical applications. He's since worked with colleagues at UCSB and SDRI to develop a control system that could be the brains of an automated insulin delivery system.

The Artificial Pancreas System (APS) developed in Santa Barbara provides a framework that links a continuous glucose monitor with a sophisticated algorithm, which determines the amount of insulin needed to keep blood glucose within a healthy range and then sends that information to an insulin delivery device (pump). The components "talk" to each other using Bluetooth or another wireless communications protocol.
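To make that loop concrete, here is a minimal sketch of the sense, decide, dose cycle the article describes. The setpoint, correction factor and simulated sensor readings are hypothetical placeholders; this is not the UCSB/SDRI control algorithm, which the article does not spell out.

```python
# Minimal sketch of the sense -> decide -> dose cycle described above. The
# setpoint, correction factor and simulated readings are illustrative
# assumptions; this is not the UCSB/SDRI control algorithm.
TARGET_MG_DL = 110.0       # assumed target blood glucose (mg/dL)
CORRECTION_FACTOR = 50.0   # assumed mg/dL lowered per unit of insulin
MAX_BOLUS_UNITS = 2.0      # safety cap on any single automatic dose

def compute_dose(glucose_mg_dl: float) -> float:
    """Stand-in controller: dose in proportion to the excess above target."""
    excess = glucose_mg_dl - TARGET_MG_DL
    if excess <= 0:
        return 0.0                                  # never dose at or below target
    return min(excess / CORRECTION_FACTOR, MAX_BOLUS_UNITS)

# Simulated five-minute sensor readings standing in for the continuous monitor.
readings = [182, 174, 161, 150, 138, 127, 118, 111]
for glucose in readings:
    dose = compute_dose(glucose)
    if dose > 0:
        print(f"glucose {glucose} mg/dL -> command pump to deliver {dose:.2f} units")
    else:
        print(f"glucose {glucose} mg/dL -> no insulin needed")
```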

The goal of the work, which is sponsored by the Juvenile Diabetes Research Foundation (JDRF) as part of its Artificial Pancreas Project, is to relieve people with type 1 diabetes of the burden of monitoring their blood sugar levels and determining and administering an appropriate dose of insulin—a responsibility that can be particularly problematic for children and teenagers with diabetes—and so help them live longer, healthier lives.

"If we have a way to automatically deliver the appropriate dose of insulin," says Eyal Dassau, a senior investigator at UCSB and lead scientist for the APS project, "that will minimize the risk for long-term complications."

Researchers around the world are working on artificial pancreas systems, but Doyle says the artificial pancreas algorithm developed in Santa Barbara is unique in two respects: it's completely automatic, and it's very flexible. It's the first fully automatic closed-loop system—it doesn't require any user input, unlike other setups that need human help to transfer data, and it's built upon a software system (the APS) that's compatible with three kinds of insulin pumps and two glucose monitoring systems, and can be expanded to accommodate other devices. It can also be used with any algorithm, offering researchers around the world a powerful tool for developing artificial pancreas technology.

"The collaboration between UCSB and SDRI is an excellent example of how our strengths in engineering and science have impact on an important and growing medical problem of our times," says SDRI Board Member and UCSB's Dean of Mathematical, Life and Physical Sciences Pierre Wiltzius.

The UCSB/SDRI system is now being tested in clinical trials at SDRI and at 12 other locations around the world. Sorensen, who lives in Solvang, is one of 15 patients participating in the trials in Santa Barbara. "It would be wonderful if this work helps people," she says, "and if it helps me, that's even better."

"This is a very, very exciting period for us," Doyle says. "These algorithms that we have been prototyping and developing for the last 10 years are finally moving into the clinical phase."

For Sorensen, that means a full day spent at SDRI, with the APS controller tucked into a pouch next to her. Today, it's in charge—and on trial.

Sorensen is one of about 24 million Americans who have diabetes—roughly 8 percent of the U.S. population, according to the Centers for Disease Control and Prevention (CDC).


Sorensen skipped breakfast as instructed, and by 9 a.m. her blood glucose level is nudging the upper limit of her healthy range. This high start is intentional—a challenge to the system.


by Anna Davison

Page 16: Convergence - Issue 15


Type 2 is by far the most common form of diabetes, but Sorensen has type 1, an autoimmune disease in which the body attacks the cells in the pancreas that produce insulin, a hormone that regulates blood sugar levels. People with type 1 diabetes must receive insulin delivered by injection or a pump in order to survive. Santa Barbara was one of the first places in the world where it was available, thanks to the work of William Sansum, who was the first physician to purify and administer insulin in the United States back in 1923.

Poorly controlled blood sugar levels can lead to serious complications, including heart attack, stroke, blindness, kidney disease and nervous system problems that sometimes necessitate amputation. According to the CDC, people with diabetes have twice the risk of death as others of a similar age.

As Sorensen knows from decades of experience—she was diagnosed in 1966, not long after the birth of her first child—living with diabetes requires constant vigilance and commitment. It takes a while for people with diabetes to understand how their blood sugar levels are affected by food, exercise and stress, and to learn how to anticipate those effects and how to counter them by delivering a carefully timed and measured dose of insulin—which is toxic at high concentrations, and can't be counteracted.

"It's a job," Sorensen says. "It's frustrating and it's tiresome and there are times when you just want to be done with it," although she does see an upside: "It's been a blessing in a way because my family and I eat better and we exercise."

On the morning of the trial, Sorensen skipped breakfast, as instructed by the research team, and by 9 a.m., when the trial begins, her blood glucose level is nudging the upper limit of her healthy range. This high start is intentional, Doyle says, as a "challenge" to the controller—a test of how quickly it can get Sorensen's blood sugar under control.

As the morning progresses and the automated system does its job, Sorensen's glucose level eases downward. Readings taken every five minutes by the glucose monitor—which are confirmed by blood draws every half hour during the 8-hour trial—are sent wirelessly to a laptop computer, which is connected to a projector screen on a wall. As the readings accumulate, a line on a graph creeps forward. Out ahead of it, another line indicates the blood sugar levels predicted by the controller; "It amazes me how accurate it is," Sorensen remarks.

One of the strengths of the UCSB/SDRI system is its predictive control, which Doyle compares to the strategy of a chess master: moves are planned in advance, with the player constantly incorporating the new information from each opposing move. To fine-tune the control algorithm, Sorensen wore the glucose sensor before the trial, so the researchers could "learn" how her body functions.

The system performs admirably during the morning of Sorensen's trial, and by late morning, her blood sugar level has flat-lined. "That's not usually a good thing in medicine," Zisser quips, but in this case it is, particularly for Sorensen, who's ravenous. It means she can finally eat. This "unannounced meal" is the second challenge for the system.
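The chess-master analogy corresponds to what control engineers call receding-horizon, or model predictive, control: plan several moves ahead, play only the first, then re-plan when new information arrives. The sketch below illustrates only that idea; the toy glucose model, candidate doses and cost function are assumptions for illustration, not the clinical algorithm.

```python
# Illustrative receding-horizon ("chess master") planner: simulate candidate
# dosing plans over the next half hour with a toy glucose model, keep the plan
# whose predicted trajectory stays closest to target, apply only its first
# dose, then re-plan at the next reading. The model, candidate doses and cost
# function are assumptions for illustration, not the clinical algorithm.
from itertools import product

TARGET = 110.0         # desired blood glucose (mg/dL), assumed setpoint
HORIZON_STEPS = 6      # plan six 5-minute steps (30 minutes) ahead
DRIFT = 2.0            # assumed rise per step with no insulin (mg/dL)
INSULIN_EFFECT = 15.0  # assumed drop per unit of insulin per step (mg/dL)

def predict(glucose: float, doses) -> list:
    """Toy model: glucose drifts upward and each unit of insulin pulls it down."""
    trajectory = []
    for dose in doses:
        glucose = glucose + DRIFT - INSULIN_EFFECT * dose
        trajectory.append(glucose)
    return trajectory

def plan_first_dose(current_glucose: float) -> float:
    """Search small candidate dose sequences; return the first dose of the best plan."""
    candidate_doses = (0.0, 0.5, 1.0)
    best_cost, best_first_dose = float("inf"), 0.0
    for plan in product(candidate_doses, repeat=HORIZON_STEPS):
        cost = sum((g - TARGET) ** 2 for g in predict(current_glucose, plan))
        if cost < best_cost:
            best_cost, best_first_dose = cost, plan[0]
    return best_first_dose

print(plan_first_dose(180.0))   # a high reading prompts a nonzero first dose
```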


By late morning Sorensen's blood sugar level has flatlined, which means she can finally eat. This "unannounced meal" is the second challenge for the system.

Howard Zisser (left) of Sansum Diabetes Research Institute and UCSB Engineering Professor Frank Doyle monitor the performance of the artificial pancreas system during a clinical trial in Santa Barbara.


Page 17: Convergence - Issue 15

As Sorensen works her way through a half-sandwich and salad, her blood sugar level begins to climb, but it's soon brought under control by the APS, which determines that a burst of insulin is needed and prompts the pump to deliver a dose.

A system like this won't be on the market for at least a few years—a commercially available version would include various safety protections—but as the Santa Barbara researchers put this setup to the test, they're figuring out ways of improving on it.

"If we could get faster and more reliable ways of measuring blood sugar, and faster, more efficacious ways of delivering insulin," Doyle says, "we could push the system much farther." To that end, the research team will soon begin two JDRF-funded trials, one using quick-acting inhaled insulin, and another using a delivery device that squirts insulin directly into the abdominal cavity. "That gets the insulin where it needs to be, much like the pancreas does," Doyle says.

"This isn't a cure," he adds, "but we want to ease the burden for people with diabetes."

Zisser sees the collaboration between UCSB and SDRI as key to the team's success and refers to the APS work as "medically inspired engineering—or engineering-inspired medicine. It's one of the great advantages we have here."

Dassau agrees, saying, "We don't believe engineers can work in a vacuum. We need a 'sanity check.' It's better, when you design something, to have the voice of the M.D.s."


Eyal Dassau (left), senior investigator at UCSB and lead scientist on the artificial pancreas project, shows off the controller (left) for the system. It receives data from a blood glucose monitor (right) that takes readings from a sensor on Sorensen's stomach. The pump on her upper arm delivers insulin.


Links:
Sansum Diabetes Research Institute: sansum.org
Frank Doyle's research group: thedoylegroup.org
Juvenile Diabetes Research Foundation: jdrf.org

Page 18: Convergence - Issue 15

Life. Preserved.

Page 19: Convergence - Issue 15


A small building tucked unassumingly behind UC Santa Barbara's Harder Stadium is home to a refrigerated menagerie of reptiles and amphibians. Inside, rows of glass jars hold the preserved remains of a multitude of species: colorfully banded king snakes, the coiled, dappled bodies of rattlesnakes, rotund toads, rare California tiger salamanders, as well as an assortment of related specimens: rodents retrieved from the stomachs of snakes, frogs' eggs, tadpoles.

Much of this collection is the work of Professor Sam Sweet (left), a herpetologist—he studies reptiles and amphibians—who came to UCSB in 1979. In the decades since, he's amassed an extensive collection of creatures, most of them from southwest California, although there are a few exotic outliers lurking on the shelves: a sea snake, a cobra, a chameleon.

These specimens are part of UCSB's 350,000-strong collection of plants, animals and fossils, initiated in 1945 and now housed at the Cheadle Center for Biodiversity and Ecological Restoration (CCBER), founded in 2005 and named for former UCSB chancellor and botanist Vernon Cheadle.

These collections, some of which reach back more than a hundred years, are a rich resource for teaching and for scientific studies, not just for researchers interested in identifying and classifying specimens, but for scientists working in a host of different areas, from climate change to restoration.

"A lot of universities have given away or discarded their collections," says CCBER director Jennifer Thorsch, "but we have quite a breadth and depth of collections that are used by students and researchers—people from around the world."

"They're valuable," Thorsch explains, "for studies on evolution, climate change, invasive species, diseases, extinctions"—and for who-knows-what research that might be important in the future.

"We don't know how these collections might be used in 10, 15, 20 or 100 years," Thorsch says. "One hundred years ago the scientists who were collecting had no idea what the value of these specimens might be. When a lot of these collections were made, for instance, they couldn't do DNA analysis."

Professor Emeritus David Chapman, showing off some of the older specimens in the extensive algae collection that he oversees, says, "Changes due to climate and environment just weren't thought of back when these were collected."

The specimens in CCBER's archives range from diatoms—microscopic, mostly single-celled algae that inhabit oceans, ponds and soils—to the hide of a bear, stretched out atop the steel cabinets that house the bird collection, its hefty paws poking over the cabinet doors. The vertebrate collection includes skins and skulls (more practical to store than taxidermied mammals), thousands of birds, stuffed and laid out in drawers, along with sundry feathers and wings, and various preserved fish.

The herbarium collection encompasses about 100,000 plant specimens, pressed, dried, mounted on archival sheets and stored in brand new cabinets, paid for out of a $270,000 National Science Foundation grant and a gift from the Cheadle family—funds that also covered "desperately needed" cataloguing and preservation work on the plant collection, Thorsch says. Seeds and cones are stored as well, and the herbarium also includes tens of thousands of microscope slides that make up the plant anatomy collection. The university also has a living collection, which CCBER has turned into the Campus Flora Project, with interactive maps as a guide to the several hundred plant species from six of the world's seven continents.

Although CCBER's collections include specimens from around the world—kelp from New Zealand, birds from Peru, Southern American snakes—the herpetology, bird and mammal collections, in particular, are regionally focused, and therefore unique.


by Anna Davison

Page 20: Convergence - Issue 15

"No one else has collected in this area like Sam has," Thorsch says of Sweet's specimens. Other important CCBER holdings are the 10,000 specimens of marine algae; the plant anatomy collection; the extensive archive of oaks built up by the late Professor of Botany Cornelius H. Muller; and the pine collection amassed by Robert Haller, an emeritus UCSB faculty member who visits the herbarium regularly to work on identifying and annotating his specimens.

Recently, UCSB Professor Emeritus James Kennett, a marine geologist, donated his collection of ocean sediment cores to CCBER. These sediments, which are filled with microscopic fossils of marine organisms, provide a record of climatic and oceanographic conditions stretching back millions of years. "It's more than a collection—it's an archive of material for future work," Kennett says. "It's going to be a useful baseline to have these collections for comparative purposes."

Insights into climate change can also be found in the herbarium, which includes specimens gathered over many decades and a few that date back to the 1800s. "If you look at enough specimens you can build up a story," says Carla D'Antonio, CCBER's faculty director, who delves into the herbarium as part of her studies on plant and ecosystem ecology.

The collection of hundreds of California poppy specimens is being used by a student in the Department of Ecology, Evolution and Marine Biology to investigate whether the plants are now blooming at a different time of the year than they once did. He found a significant shift in flowering time, which could reflect changes in climate. These days, poppies burst into bloom in late March, whereas 50 years ago, this glorious display didn't get underway until late April or May.
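A study like that comes down to comparing the collection dates written on old and recent flowering specimens. The sketch below illustrates the idea with made-up records; the data format and the numbers are hypothetical and are not taken from the student's study.

```python
# Illustrative sketch of how dates on herbarium sheets can reveal a shift in
# flowering time. The records below are hypothetical, not the actual dataset.
from datetime import date
from statistics import mean

# (collection_year, collection_date) for poppy specimens noted as "in flower"
specimens = [
    (1952, date(1952, 4, 28)),
    (1958, date(1958, 5, 3)),
    (2008, date(2008, 3, 25)),
    (2010, date(2010, 3, 30)),
]

def mean_flowering_day(records):
    """Average day-of-year on which flowering specimens were collected."""
    return mean(d.timetuple().tm_yday for _, d in records)

early = [r for r in specimens if r[0] < 1980]
recent = [r for r in specimens if r[0] >= 1980]
shift = mean_flowering_day(early) - mean_flowering_day(recent)
print(f"Flowering is about {shift:.0f} days earlier in the recent specimens")
```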

Carla D'Antonio, CCBER faculty director, shows off California poppy specimens from UCSB's herbarium. A study of hundreds of poppy specimens found that 50 years ago the plants bloomed in late April or May, whereas this glorious display now begins in late March—a shift that could reflect changes in climate.

Page 21: Convergence - Issue 15


"There's an enormous amount of information on these sheets," D'Antonio says, "from straight morphology to tracking these kinds of lifecycle changes."

"The beauty of a collection like this that goes back so far," Chapman says, "is that we can look at how things have changed in the last 150 years. You can then ask, 'Are we seeing changes in the flora because of changes in environmental conditions?'"

D'Antonio's focus is on how invasive species spread and how they alter the ecosystem structure and functioning of their new habitats. By going back through herbarium specimens she can see when weeds were first collected from a particular area and where they first appeared. "These specimens preserve a history of invasion and change," she says, and not just by weeds—diseases and insect attacks are evidenced by mottled and chewed leaves.

Animal specimens, too, can be a rich source of information on diseases, pollution and other crises. Researchers have analyzed feathers from bird collections to study the historical levels of the now-banned pesticide DDT in the Southern California environment, for instance.

The CCBER collections also preserve a record of what once grew, flew, hopped and slithered around areas that are now home to shopping malls, suburbs and vineyards. Sweet holds up a jar of legless lizards collected from parts of northern Santa Barbara County as an example. "Everywhere these things live is becoming a housing development," he says. When that happens, Sweet takes the opportunity to expand the CCBER collections, lodging a record of what's being lost. "We'll do what we can to get samples before the bulldozers go over," he says—or after. "We'll be out there walking behind the bulldozers. We pick up the animals that are in two pieces rather than six." That reflects today's more conservative approach to collecting, which takes advantage of opportunities presented by bulldozers and busy roads. Roadkilled gopher snakes, for example, are easy to pick off Highway 166 in northern Santa Barbara County in the spring, when young snakes leave their nascent burrows and slither off in search of new territory, Sweet says.


(From left) Professor Emeritus David Chapman oversees the extensive collection of algae; Heather Fox, a curatorial assistant in the vertebrate collections, among jars of preserved reptiles and amphibians; CCBER Director Jennifer Thorsch with a plant specimen from the herbarium.

Many recent additions to the jars of rare California tiger salamanders exhibit obvious squish marks. These salamanders might someday be the only record of the species from a particular area, or even the legacy of a lost species. The collections, D'Antonio says, "preserve a history."

To make the most of these rich archives, CCBER is taking advantage of 21st century technology. Exact locations where the specimens were gathered, digital photographs and other relevant information are being added to a digital database. Ultimately, it will be accessible to researchers around the world.

"We try to capture the information scientists want when they're looking at specimens," Thorsch says. "Preserving these collections for possible future uses we can't even dream of today is so important," she adds.

Links:
CCBER: ccber.lifesci.ucsb.edu
Carla D'Antonio: lifesci.ucsb.edu/eemb/faculty/dantonio
Sam Sweet: lifesci.ucsb.edu/eemb/faculty/sweet
Campus Flora Project: earth.geog.ucsb.edu/campusflora

Page 22: Convergence - Issue 15

Ahead in the Cloud

Before the word "cloud" had escaped the lips of those information technologists who would create it, the rainmakers at UCSB's Department of Computer Science were helping to shape it.


Page 23: Convergence - Issue 15


Whether migrating services to distant computer clusters, spinning out scholars to found startups like RightScale or Eucalyptus Systems (or bolstering nascent heavyweights like Google), working on grid servers, helping design the Alexandria Digital Library or theorizing about the next generation, UC Santa Barbara computer scientists have had an impact on the cloud. Now that the cloud metaphor has established itself in the popular mind—even as its exact definition bedevils the IT crowd and its promise bewitches investors—the university's impact is more recognized and celebrated.

In the field itself, UCSB computer scientists have a strong tradition in both distributed systems—parceling out connected resources among disparate machines connected in a network—and database management.

"Not many places have these two disciplines so close," says Amr El Abbadi, chair of the Department of Computer Science. "We actually have the same faculty working together on distributed systems and on databases, and together this is the magic that makes the cloud exciting—and makes it succeed."

"We think of the cloud as the next level of distributed systems," says Professor Chandra Krintz, whose own research makes programming on the cloud easier.

"From a research perspective," reflects Professor Divyakant Agrawal, "if you think about people working in the area of networking, people like us who have been working on databases and distributed systems, and such, this was the goal."

CLOUD METEOROLOGY

At its most basic, the cloud means moving computer hardware and software somewhere else, and being able to easily manage them remotely. Anyone with a Hotmail or Gmail e-mail account—which means logging into a server somewhere else to operate the account—has tasted the cloud. "Although the term cloud may be new," observes El Abbadi, "we have been living in this world for a while."

A second inroad came through software as a service, programs reached and run over the Web, say a Google application that replaces the latest iteration of a Microsoft product that would be downloaded onto the user's own computer. Pointing to local software-as-a-service companies like AppFolio and Citrix Online, El Abbadi explains, "You go to them to provide a service. I don't have it here."

So-called SaaS has been followed by "platform as a service" (exemplified by Krintz's Research on Adaptive Compilation Environments (RACE) Laboratory's work on AppScale, which creates Google applications in the cloud) and "infrastructure as a service" (as seen with Eucalyptus, which allows easy connection to the public cloud). System administrators worldwide—from small but data-intensive businesses to the behemoths of social networking—salivate over the cost savings and scalability of the cloud, hoping to move an entire nest of IT resources into a data center "somewhere else," accessing it on demand via the Internet or a local area network. Although it's easiest to think just of machines and data storage offsite, the "cloud" of resources—which is where the name hails from—can include a network, operating systems, application programs and even a place, like AppScale, to develop programs.

Where is that "somewhere else"? Anywhere you can drop a bunch of servers, as long as there's sufficient bandwidth and connectivity. As RightScale CEO Michael Crandell told business news network CNBC, "We're a cloud computing company, so we actually manage compute and storage resources that can be anywhere in the world, and it really doesn't matter where we're located." Amazon, Google and Microsoft, among others, rent out space in what's known as the public cloud; with no on-campus data center for a private cloud at present, UCSB's computer scientists usually frolic at Amazon. In the public cloud, one enterprise's data and code aren't alone; in one of the miracles of the cloud-seeding breakthrough known as virtualization, they jostle alongside data and services for lots of other enterprises residing in those same third-party data centers.

Private clouds, meanwhile, are common where security, better performance or sheer bulk make it sensible for a business or agency to run its own discrete cluster of machines. In between there are lots of hybrid models and tangoing between public and private clouds.


by Michael Todd

Page 24: Convergence - Issue 15

“We’ve really got a good group of people here to solve these really complex questions in distributed systems. We’re on the cutting edge of what industry wants to take forward.”


Such outsourcing of the heavy computing recalls the old mainframe model, where a (then) supercomputer labored away in an air-conditioned room somewhere in the bowels of the building as users at "dumb" terminals in the same building fed it requests for processing.

"We are evolving to where this 'mainframe' now is a data center, with huge numbers of machines, racks full of machines," says El Abbadi. The terminals are now desktops, laptops, maybe even mobile devices, "and whenever they need something, they go and do it 'over there.'"

When shoveling data to the cloud, it's not going to an even more spectacular next-generation mainframe, but to a bunch of smaller, albeit quite powerful, servers that may not even be in the same physical data center. The data is distributed over this cluster in a way that dynamically maximizes the efficiency of the cloud, creating "virtual computers" handling client requests. This virtualization of units of computing—remember the "distributed systems" that UCSB has excelled at—cleared the air for the cloud.

Virtualization and access to broadband, Agrawal says, were the critical technological advances that allowed the cloud to form in the last few years. "What has happened with virtualization," he explains, "is the concern about matching software to the underlying hardware—the type of machine you are running, the type of disk you have—those concerns have become a nonissue."

And while he sees those advances as necessary, they aren't sufficient for a paradigm change like the cloud, he believes, noting the hopes attached to grid computing a decade ago and the marketplace's yawn. "In my mind, whenever there is a technology transformation that has happened, it doesn't happen just because the technology is feasible. It also has to happen from a business model," and the business model here is to remove "the really non-essential parts of running a business" to the cloud professionals. "The convergence of these two aspects makes the cloud feasible, and I think at this point it will gain more and more momentum," Agrawal says.

GIVING THEM THE BUSINESS

Cloud computing's potential led Merrill Lynch to famously predict that the cloud computing marketplace will exceed $150 billion next year. Forrester Research's Rich Fichera, in a more recent—and measured—prediction for SearchCloudComputing.com, suggested he expects a majority of enterprises to use a private cloud in the next five years, while a "substantial minority" will be using the public cloud for serious business.

A survey by RightScale finds the biggest drivers are scalability and cost: rapidly expanding (or declining) enterprises can find all the computing resources they need as they need them, with someone else worrying about having enough total capacity or up-to-date infrastructure. No more buying much more than you need for a worst-case scenario, or not buying for a worst-case scenario and then facing one. Data centers, by the way, usually putt along using less than half their own prodigious capacity, with the excess capacity set aside for bursts of activity or failures of individual machines.

What's holding IT managers back? Security, for one. Professor Ben Y. Zhao recalls a survey last year in which four out of five executives considering the cloud—but remaining aloof—cited security concerns.

Zhao and Professor Christopher Kruegel are collaborating on a way to allow users to move their non-cloud applications to the cloud while protecting the privacy of their data through encryption. "Our tool," Zhao explains, "will tell you which pieces of logic cannot be encrypted and that you must take care to actually protect. Everything that's left can be fully encrypted and moved up. … Even if someone was to compromise the cloud—which is not difficult to believe at all—your data would still be safe at night."
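As a rough illustration of that partition-then-encrypt idea (not the UCSB tool itself), the sketch below encrypts the sensitive fields of a record and leaves harmless display metadata readable before it would be uploaded. The choice of fields and the use of the cryptography package's Fernet recipe are assumptions for illustration.

```python
# Illustrative partition-then-encrypt step before pushing a record to cloud
# storage. Which fields count as "sensitive" is an assumption for this example;
# this is not the UCSB tool described in the article.
import json
from cryptography.fernet import Fernet   # pip install cryptography

SENSITIVE_FIELDS = {"name", "account_number", "balance"}  # must never leave in the clear
key = Fernet.generate_key()   # in practice the key stays with the data owner, not the cloud
cipher = Fernet(key)

def prepare_for_upload(record: dict) -> dict:
    """Encrypt sensitive fields; leave harmless display metadata as-is."""
    prepared = {}
    for field, value in record.items():
        if field in SENSITIVE_FIELDS:
            prepared[field] = cipher.encrypt(json.dumps(value).encode()).decode()
        else:
            prepared[field] = value
    return prepared

record = {"name": "A. Customer", "account_number": "12-345", "balance": 99.5,
          "screen_color": "#3366cc"}   # a display hint poses no security concern
print(prepare_for_upload(record))      # only screen_color remains readable to the cloud
```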

Their research finds that 80 to 90 percent of the data can be fully encrypted, and much of what can’t—say, directions to your terminal’s display about what color to make the screen—doesn’t present security concerns.

“We have a very strong security group, and it is a very practical security group, and that’s one of the things that make this department kind of special,” says El Abbadi, citing scholars such as Zhao, Kruegel, Giovanni Vigna and Richard A. Kemmerer.

Because the department has always had, as El Abbadi says, “a practical bent,” its connection with Main Street has always been exceptionally strong. “We can be traditional, we can be entrepreneurial.” He cites such local lights as RightScale (which offers management of applications in the cloud), AppFolio (Web-based property management service) and Eucalyptus (cloud infrastructure) as cloud-specific success stories born at UCSB.

“UCSB has a great history of building great startups, especially in the cloud space,” says Zhao; private industry has shown it agrees.







Recently CNBC spotlighted the College of Engineering—and the Department of Computer Science in particular—in a segment dubbed “Welcome to Techtopia.” Guests Kevin O’Connor, founder of Internet advertising pioneer DoubleClick, and cloud CEOs Michael Crandell of RightScale and Brian Donahoo of AppFolio stressed the joys of running young tech companies in the shadow of UCSB.

That strong connection to private industry can be a garden of delight for scholars. “One of the challenges of the space,” says Zhao, “is that for academics like us it’s hard to find the interesting problems by ourselves, because we don’t have the large-scale data and the large-scale machines that companies do. And the problems are interesting when things are very large. …Industry has the first exposure to some of the interesting problems. We’re relying on them to tell us what the problems are.”

“We’ve really got a good group of people here to solve these really complex questions in distributed systems,” notes Krintz. “We’re on the cutting edge of what industry wants to take forward. I think it’s exciting for students as well, because we’re having practical impact immediately, important impact—these are hard problems to solve if we’re really going to get to the next level of technology.”

She and Zhao epitomize the joy-of-the-scientific-hunt mentality at UCSB. “I want to solve really hard computer science problems,” says Krintz. “I wake up every morning looking forward to that. And if I have to be driven by the bottom line, then I necessarily have to solve problems in a particular way. Staying away from that gives me the freedom to find the right solution, and then someone else can commercialize it.”

SHAPING THE CLOUD

While it currently has no specific Cloud 101 course to join the Cloud Expos and Cloud Journals that have blown in in the decade or so since “cloud” joined the IT glossary, huge swaths of UCSB’s computer science research sustain, improve, expand or leverage the cloud.

One exemplar of serving the cloud without focusing on it is the story of startup Eucalyptus Systems. In a purely academic project sponsored by the National Science Foundation, a team led by Professor Rich Wolski studied how NSF’s supercomputers could be combined with Amazon’s market-leading public cloud to perform complex computations in weather forecasting. Insights from that project, and the open-source software artifacts it generated, paved the way for the commercial enterprise that is Eucalyptus—an open-source software platform that enables companies, universities and other enterprises to turn their computing resources into their own Amazon-compatible clouds. Wolski has not been lost to UCSB; he’s slated to teach again this coming year.
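“Amazon-compatible” means that tools written against Amazon’s EC2 interface can be pointed at a private Eucalyptus cloud instead. The sketch below shows the idea using the boto library of that era; the endpoint, port, path and credentials are placeholders for a real installation, and parameter names may vary across boto versions.

```python
import boto
from boto.ec2.regioninfo import RegionInfo

# Point a standard EC2 client at a private, Amazon-compatible endpoint.
# Host, credentials, port and path below are placeholders, not a real deployment.
cloud = RegionInfo(name="my-eucalyptus", endpoint="cloud.example.edu")
conn = boto.connect_ec2(
    aws_access_key_id="YOUR-ACCESS-KEY",
    aws_secret_access_key="YOUR-SECRET-KEY",
    is_secure=False,
    region=cloud,
    port=8773,
    path="/services/Eucalyptus",
)

# The same calls a script would make against the public cloud now run in-house.
for reservation in conn.get_all_instances():
    for instance in reservation.instances:
        print(instance.id, instance.state)
```

The point of the compatibility is continuity: scripts and management tools written for the public cloud keep working when the computation moves onto an organization’s own machines.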

Fred Chong is leading an effort to develop an experimental miniature data center for research on increasing energy efficiency in data centers—essential to the well-being of the cloud.

Krintz’s RACE Lab, through the support of Google, IBM and the NSF, has also been working on making life easier for cloud dwellers—in this case, application developers. Her lab’s AppScale project, also open-source, gives them “a giant virtual machine” in the cloud where they can write, deploy and debug a program against a set of interfaces and have it work anywhere the platform exists, “from a laptop to the Google cloud.” While tempting to private industry, AppScale has not been commercialized. Although it was designed specifically for Google’s popular public cloud, it also works with other clouds and interfaces, so a wider range of computation problems can be solved.

“Scientists,” Krintz says, “should not be bogged down in how to program their algorithm; they should be able to express it in a very high level language and we should be able to make it very efficient and scale automatically.”

Nor should they worry about the hardiness of their data or fret about retrieving it. In the first-generation cloud, Agrawal says, “maintaining the consistency and the reliability and keeping data meaningful was all thrown in the hands of the person who is writing the application at a high level. You can think that as a user, you are responsible if things go wrong. And I think people are now recognizing that as a significant problem.” Noting that “technology over the last 20 years has worked very hard making these things very safe, very stable and very reliable,” he and El Abbadi are working on keeping that so. “When it’s your own data, things are very simple. But when it’s shared data, being accessed by multiple people at the same time, then you have to start worrying about making sure you are getting data that has been modified correctly.” (One classic safeguard is sketched below.)

Meanwhile, Krintz sees two big challenges to the cloud going forward: her forte, programming for the cloud, “and how do you do it in a way that’s energy efficient?” While the cloud generally reduces electricity consumed by the end user, data centers are voracious energy users. They are most energy efficient at minimum or maximum use, and both states are the exception, not the rule. At UCSB, the Institute for Energy Efficiency and its Greenscale Center for Energy-Efficient Computing are addressing energy consumption and computer cooling head on.
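The shared-data hazard Agrawal describes—two users updating the same record at the same time—has a classic guard: tag each record with a version number and reject any write based on a stale read. The toy store below is a generic illustration of that idea, not the specific protocols his group is developing; the class and key names are invented for the example.

```python
class ConflictError(Exception):
    """Raised when a writer's copy of a record has gone stale."""

class VersionedStore:
    """Toy key-value store with optimistic concurrency control."""

    def __init__(self):
        self._data = {}                              # key -> (version, value)

    def read(self, key):
        return self._data.get(key, (0, None))        # returns (version, value)

    def write(self, key, value, expected_version):
        current_version, _ = self._data.get(key, (0, None))
        if current_version != expected_version:
            raise ConflictError(f"{key}: expected v{expected_version}, found v{current_version}")
        self._data[key] = (current_version + 1, value)

store = VersionedStore()
store.write("balance", 100, expected_version=0)

# Two clients read the same version of the same record...
v1, bal1 = store.read("balance")
v2, bal2 = store.read("balance")

store.write("balance", bal1 + 50, expected_version=v1)      # first writer succeeds
try:
    store.write("balance", bal2 - 30, expected_version=v2)  # second is told to retry
except ConflictError as err:
    print("conflict detected:", err)
```

Without the version check, the second write would silently overwrite the first and money would vanish; with it, the conflict surfaces and the application can re-read and retry.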






Links:
UCSB Department of Computer Science: cs.ucsb.edu
UCSB Research on Adaptive Compilation Environments (RACE Lab): cs.ucsb.edu/~ckrintz/racelab.html
UCSB Distributed Systems Lab: www.cs.ucsb.edu/~dsl
Ben Zhao: www.cs.ucsb.edu/~ravenben
Eucalyptus Systems: www.eucalyptus.com
UCSB Greenscale Center for Energy-Efficient Computing: iee.ucsb.edu/greenscale


Greenscale has harnessed computer scientists like Fred Chong and Timothy Sherwood with computer engineers to essentially redesign the building blocks of the entire chain, so that energy use in the cloud is as scalable as its data-handling. Given the amount of energy consumed by data centers—by some estimates they use $30 billion of power a year—companies like Google are sponsoring research to cut the bill without harming the centers’ speed or elasticity. Chong, for instance, is leading an effort to develop an experimental miniature data center where researchers can conduct “radical experiments” impossible at a working center.

Those energy hogs meanwhile produce inordinate amounts of heat, which hurts performance and degrades electronics. Greenscale is tackling the issue on two tracks, aiming to reduce the heat generated in the first place and also to find innovative methods of cooling through things like better heat sinks and improved air flow.

Some of the biggest challenges, from the hardware point of view, are network bandwidth and the flow of data, which, as Agrawal notes, underlie the cloud’s raison d’être. Within the College of Engineering, experts in computer networking and photonics are beavering away at those fundamental issues of computer engineering, focusing on more, faster and cheaper.

On the software end, Professor Kevin Almeroth and Professor Elizabeth Belding, through their Networking and Multimedia Systems Lab, are examining other networking challenges, including those posed by mobile devices.

And as the virtual computers may span multiple sites, so too will the client resources they handle. That’s no problem if the data is pretty much independent of other data, notes Zhao. But when the data is “highly connected,” breaking it apart, reconstituting it and interpreting it on the fly gets commensurately hairier. And as luck would have it, two common data sets loaded with interdependencies are graphs and social networks like the uber-popular Facebook. “There are some solutions out there, but none of them are that good,” says Zhao, who is working on a better one; a toy illustration of why connected data resists clean splitting follows this story.

“That’s why our department has risen over the last 10 years, because we focus on hard problems but they also have practical significance,” Krintz says with conviction. “So they’re useful in the near term, but we are still taking risks and figuring out how to evolve to the next generation of hardware and software.

“Santa Barbara is unique: there is no other institution in the United States, in the world, that is working on cloud software at this level,” Krintz says. “It’s all the pieces of the cloud—databases, the theoretical foundations, the scheduling and the infrastructure and the platforms, the programming languages and the security. In a positive way, it’s been the perfect storm.”
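Why “highly connected” data is hard to split can be seen with a toy social graph: however the users are spread across servers, many friendships end up crossing machines, and every crossing edge is a potential network round trip. The graph, the naive placement rule and the resulting numbers below are purely illustrative, not drawn from any real system.

```python
# A toy social network: each pair of names is a friendship.
friendships = [
    ("ana", "bo"), ("ana", "cy"), ("ana", "dee"),
    ("bo", "cy"), ("bo", "dee"), ("cy", "dee"),
    ("dee", "eli"), ("eli", "fay"),
]

def place(user, n_servers=2):
    """Naive placement: hash each user onto a server, ignoring who knows whom."""
    return sum(user.encode()) % n_servers

# Every friendship whose two members land on different servers costs a network
# hop whenever one friend's page needs the other's data.
cut = sum(1 for a, b in friendships if place(a) != place(b))
print(f"{cut} of {len(friendships)} friendships cross server boundaries")
```

Smarter placement can shrink the cut, but in a tightly knit graph some edges must cross no matter how the users are divided, which is why partitioning social data well remains an open research problem.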



What is this?

Find the answer on the inside back cover


SHORTS… HAVE YOU HEARD?
News from Engineering and the Sciences at UC Santa Barbara

UCSB part of $122-million “Artificial Photosynthesis” project
A Department of Energy-funded effort to develop a cost-effective method of producing energy from sunlight will include contributions from UC Santa Barbara researchers. Eric McFarland, a professor in the Department of Chemical Engineering, is leading UCSB’s part in the ambitious project, which aims to mimic the process plants use to produce energy—photosynthesis. McFarland will help develop automated systems that will enable enormous numbers of chemical compounds to be rapidly synthesized and screened to identify those with the most potential for use in an artificial photosynthesis system. McFarland’s goal, once these ultra-high-throughput experimentation systems are up and running, is to synthesize and screen a million compounds in a day.

Computer scientists clean up
Christopher Kruegel, associate professor of computer science and a member of UCSB’s Computer Security Group, has come up with a way of “disinfecting” computers after they’ve been invaded by viruses or worms. He’s developed new security software that can identify and neutralize viruses after they’ve infected a user’s machine, even if the virus has no known signatures. Kruegel has formed a company—LastLine, Inc., as in “last line of defense”—with UCSB Computer Science Professor Giovanni Vigna and other colleagues to develop the software.

Scientists investigate “brain maps” involved in reaching
Scientists at UCSB have discovered that humans use different “brain maps” when they reach for an object versus reaching for a place on the body. Their findings could have applications in robotics and in the development of machine-brain interfaces that could help paraplegics. Psychology Professor Scott Grafton, director of UCSB’s Brain Imaging Center, and postdoctoral fellow Pierre-Michel Bernier studied the brains of 18 individuals who made 400 distinct arm reaches as they lay in an MRI scanner. The researchers found that reaching for visible targets involved a visual map in the brain, whereas when subjects reached for a place on their own bodies, they used a different kind of map—a body map. The work, published in the journal Neuron (64(4), pp. 776-788), challenges the prevailing scientific opinion that all reaching movements are planned using a visual map.

Why do birds sing? Scientists use webcams to investigate
Researchers at UCSB have used video recordings of birds to study how they use songs and physical displays to communicate with others. Stephen Rothstein, professor of zoology, and Adrian O’Loghlen, a research scientist in the Department of Ecology, Evolution and Marine Biology, used webcams installed in aviaries at UCSB to record male brown-headed cowbirds (like most songbirds, only male cowbirds sing) singing to both females and males. They found that the songs and physical displays the males used to communicate with other males—a form of aggression—were different from those directed at potential mates. The researchers, who published the work in the journals Animal Behaviour (79(6), pp. 1285-1292) and The Condor (112(3), pp. 615-621), say there’s huge potential for the use of video recordings in studying bird communication. The way in which young birds learn to sing has many parallels to language development in humans, O’Loghlen says—“Babies babble. Birds babble. Babies memorize a lot of sounds before they ever try to produce them. Birds do the same”—and many scientists believe birdsong may be the best animal model for the study of cultural evolution in humans.

Bacteria wield “toxic darts” against their opponents
Bacteria take down their competitors by impaling them with “toxic darts,” according to work by UCSB scientists that could help researchers develop new ways to control disease-causing pathogens. A team of researchers in the Department of Molecular, Cellular, and Developmental Biology studied many species of bacteria, and found toxic-tipped proteins on their surfaces. When bacteria bump into each other, these stick-like proteins deliver toxic darts to their competitors in a process called contact-dependent growth inhibition. The researchers identified dozens of distinct types of toxic darts that disable their bacterial targets in different ways, such as by decimating their DNA so the cells can’t replicate, or cutting up their RNA so they can’t synthesize proteins and grow. Some bacteria are protected from attack by an immunity protein that inactivates the toxic dart. The researchers published their work in the journal Nature (468, pp. 439-442).

Physicists study swirling to understand hurricanes
Hurricanes and cyclones involve vortices in the atmosphere; vortices in the earth’s outer core are essential for the formation of the planet’s magnetic field. In order to study these processes in the laboratory, UCSB Physics Professor Guenter Ahlers and postdoctoral fellows Stephan Weiss and Jin-Qiang Zhong, working with colleagues in the Netherlands, filled small cylinders—4 to 40 inches high—with water. The water was heated from below and cooled from above so that the warm water rose and the cooler water sank—the convection phenomenon that’s important in creating vortices—and the cylinder was rotated on its axis—another important factor in the development of vortices. The researchers discovered a new and unexpected phenomenon: the cylinder had to be rotated at a certain minimum rate for vortices to develop, and this critical speed depended on the dimensions of the cylinder—it was lower for wider cylinders. The researchers published their work in the journal Physical Review Letters (105, 224501).

CNSI launches new center for education, development

The California NanoSystems Institute is building on its success in education and outreach by launching a new center that will support campus-wide efforts to educate, inspire and nurture current and future scientists and engineers. The Center for Science and Engineering Partnerships aims to foster collaboration between faculty and educators and to provide resources for UCSB scientists and engineers wishing to develop, expand or evaluate programs such as undergraduate research internships, professional development efforts for graduate students and postdocs, and informal community science programs. The center’s focus on evaluation will help campus entities meet funding agency requirements for information on the impacts of these programs.

Collaboration will focus on macular degeneration
An international collaboration between UCSB, the Keck School of Medicine at the University of Southern California, and several other research institutions is bringing together leaders in the fields of stem cell biology, basic science, and ophthalmology to develop a treatment for blindness caused by age-related macular degeneration.

The California Project to Cure Blindness was formed with a $16 million “disease team” grant from the California Institute for Regenerative Medicine to fund development of a stem cell-based treatment for age-related macular degeneration.

Engineering alumnus Abo-Shaeer awarded MacArthur fellowship
Amir Abo-Shaeer, a UCSB engineering alumnus, recently became the first public school teacher to receive a prestigious MacArthur fellowship—a $500,000 no-strings-attached award, popularly known as a “genius grant.” Abo-Shaeer founded the Dos Pueblos Engineering Academy at his Goleta alma mater, and has had great success getting students involved in engineering and science. He teaches physics and engineering, including a robotics class that competes in the international FIRST competition. Abo-Shaeer and his Dos Pueblos Engineering Academy are the subjects of a book, “The New Cool,” by New York Times bestselling author Neal Bascomb, to be released in March 2011.

Awschalom honored by Materials Research Society
David D. Awschalom, professor of physics, electrical and computer engineering, and the Peter J. Clarke Director of the California NanoSystems Institute at UCSB, has received the Turnbull Lecturer Award from the Materials Research Society. The award recognizes the career of a scientist who has made outstanding contributions to understanding materials phenomena and properties through research, writing, and lecturing. Awschalom is cited “for pioneering achievements and leadership in establishing the field of semiconductor spintronics.”

UCSB EXCELS
The university’s reputation got another boost from rankings released recently by the National Research Council, U.S. News and World Report and Times Higher Education.

In its annual listing of the nation’s best universities, U.S. News and World Report ranked UCSB 9th among public universities and 39th overall—the highest ever rankings for UCSB in this list. The undergraduate program in UCSB’s College of Engineering was ranked number 36 (tied) on the U.S. News & World Report list of “Best Programs at Engineering Schools Whose Highest Degree is a Doctorate.”

Times Higher Education (THE) completely overhauled its methodology for this year’s World University Rankings, giving more weight to objective measures than subjective reputation. The results were good news for UCSB, which was ranked number 29 in the world overall, and 17th for engineering and technology.

The university also fared well in an assessment of doctoral programs by the National Research Council, which is part of the National Academies. Rather than giving each program an absolute rank, the NRC came up with ranking ranges to reflect how rankings can vary depending on how much weight is given to the different kinds of data used in the assessment. Ten UCSB programs achieved ranking ranges that extended into the top five, including the Materials Department, which was ranked first in the country, Chemical Engineering, Computer Science, Electrical and Computer Engineering, Geography, Marine Science, Mechanical Engineering and Physics. UCSB’s program in Ecology, Evolution and Marine Biology received a ranking reaching into the top 10, and Chemistry and Biochemistry had a ranking range that extended into the top 20.






Engineer receives NIH New Innovator Award
Luke Theogarajan, an assistant professor in the Department of Electrical and Computer Engineering, received the prestigious 2010 New Innovator Award from the National Institutes of Health. Theogarajan is working on a retinal prosthesis—an electronic implant to restore vision to the blind. He is one of the few engineers designated as direct recipients of the $1.5 million research award.

Nguyen achieves NSF honor
Thuc-Quyen Nguyen, an assistant professor in UCSB’s Department of Chemistry and Biochemistry, has been honored by the National Science Foundation with an American Competitiveness and Innovation Fellowship, awarded in recognition of outstanding research accomplishments and her contributions toward mentoring and broadening participation of women, underrepresented minorities and/or people with disabilities.

Packard Foundation fellowship for chemical engineer
Michael Gordon, assistant professor in UCSB’s Department of Chemical Engineering, received a Packard Foundation Fellowship for Science and Engineering—one of the largest nongovernmental fellowships in the country. The $875,000 award, given out over five years, will support Gordon’s research on advanced microscopy techniques for examining structures and surfaces at a minute scale. Gordon intends to develop a scanning chemical microscope for detection, identification and high-resolution imaging of biomolecules and surfaces.

Three computer scientists honored by ACM
Three UCSB computer scientists have been honored by the Association for Computing Machinery for their outstanding contributions to the field. Computer Science Chair Amr El Abbadi and Professor Subhash Suri were named fellows of the society in recognition of their achievements in computer science and information technology, and Professor Divyakant Agrawal was named an ACM distinguished scientist.

Hawker named ACS fellow
Craig Hawker, director of UCSB’s Materials Research Laboratory, has been made a fellow of the American Chemical Society (ACS), the world’s largest scientific society. Hawker’s research focuses on the interface between organic and polymer chemistry. He is a pioneer in the design and synthesis of macromolecules with applications in microelectronics and biotechnology, including nanoparticles that could be deployed in the body to detect and treat cardiovascular disease. ACS fellows “are scientific leaders, improving our lives through the transforming power of chemistry,” ACS President Joseph S. Francisco said in a statement announcing the honors.

New species of sea slug, ribbon worm named for scientists
Two UCSB scientists have been given an uncommon honor by having new species named after them. A colorful sea slug discovered in tide pools in Carpinteria by Jeff Goddard of the Marine Science Institute has been named Flabellina goddardi by the California Academy of Sciences sea slug expert who determined it was a new species. A new species of ribbon worm bears the name Carcinonemertes kurisi, in honor of Armand Kuris, a professor of zoology. The worm, which lives as a parasite in crabs found from Northern California to Baja California, was named for Kuris by his former student, Patricia S. Sadeghian. “When a species is named after you, that is forever,” Kuris said of the honor.

By Convergence staff, and George Foulsham and Gail Gallessich from the Office of Public Affairs.


Shown on page 25 is an artist’s rendition of a defect found in silicon carbide (SiC) that may prove useful in the budding field of quantum information science. The defect is known as the nitrogen vacancy (NV) center because it is composed of a vacant silicon site next to a nitrogen atom replacing a carbon atom. The defect has recently been identified by researchers at UCSB as a potential candidate for a qubit, which is the basic building block that is needed to build a quantum computer. Quantum mechanics allows many non-intuitive phenomena to occur, such as the superposition of the “on” and “off” of conventional bits used in modern-day computers. A quantum computer would exploit these features to solve certain computational problems with speeds that would be unobtainable with current supercomputers. Many previous experiments have shown that a similar defect in diamond (also known as the NV center) can function as a qubit—even at room temperature. As diamond is presently difficult to both grow and process, the identification of another system with similar properties is a promising discovery. The research on SiC did not stop there; it also provided an outline to systematically predict which defects in other systems might have similar quantum features.

CREDIT: “Quantum computing with defects,” J. R. Weber, W. F. Koehl, J. B. Varley, A. Janotti, B. B. Buckley, C. G. Van de Walle, and D. D. Awschalom, Proc. Natl. Acad. Sci. USA 107, 8513 (2010).

FIFTEEN, SPRING 2011
Editor-in-Chief: Anna Davison
Creative Director: Peter Allen
Writers: Anna Davison and Michael Todd

Editorial Board:
Larry Coldren, Acting Dean, College of Engineering
Pierre Wiltzius, Dean of Mathematical, Life and Physical Sciences, College of Letters & Science
Richard Church, Associate Dean of Mathematical, Life & Physical Sciences, College of Letters & Science
David Awschalom, Scientific Director, California NanoSystems Institute
Frank Doyle, Associate Dean for Research, College of Engineering
Glenn Beltz, Associate Dean for Undergraduate Studies, College of Engineering
Peter Allen, Marketing Director, Engineering and the Sciences
Anna Davison, Acting Communications Manager, College of Engineering
Andrea Huebner, Publications Director, UCSB Alumni Association
Deirdre O’Shea, Director of Communications, College of Letters & Science

Convergence is a publication of Engineering and the Sciences at the University of California, Santa Barbara, CA 93106-5130.

If you have comments or questions about this publication, please contact Anna Davison: [email protected]

To change your address, contact Alison McElwee: [email protected]

If you need Convergence in another format because of a disability, please contact Alison McElwee: 805.893.2568 or [email protected]

For permission to reproduce material in Convergence and at Convergence Online please contact Anna Davison: [email protected]

The University of California, in accordance with applicable federal and State law and University policy, does not discriminate on the basis of race, color, national origin, religion, sex, gender identity, pregnancy (including childbirth and medical conditions related to pregnancy or childbirth), disability, age, medical condition (cancer-related), ancestry, marital status, citizenship, sexual orientation or status as a Vietnam-era veteran or special disabled veteran. The University also prohibits sexual harassment. This nondiscrimination policy covers admission, access and treatment in University programs and activities. Inquiries regarding the University’s student-related non-discrimination policies may be directed to: Office of the Affirmative Action Coordinator, University of California, Santa Barbara, 805.893.3105.

CONVERGENCE
The Magazine of Engineering and the Sciences at UC Santa Barbara

What is this? Answer from page 25

Support Engineering and the Sciences
When you support students through programs, scholarships, and fellowships you’re investing in brilliant young minds who will be tomorrow’s leaders and make a lasting impact on society.

Your support of our faculty and their research creates knowledge with applications in health, energy, computing, communications, the environment and much more.

engineering.ucsb.edu/giving science.ucsb.edu

Stay Current—Visit us Online: convergence.ucsb.edu

News: New species of sea slug named for scientists

Hawker named ACS fellow

Collaboration will focus on macular degeneration

Packard Foundation Fellowship for chemical engineer

Light Fantastic
Oil Shock
Freedom
Life. Preserved.
Ahead in the Cloud


Non Profit Org.
U.S. Postage
PAID
Santa Barbara, CA

Permit No. 104

CONVERGENCE
University of California Santa Barbara

Santa Barbara, CA 93106-5130