Introduction to Rehabilitation Engineering


Big data

Dr. Ervin Sejdić, Ph.D.

Assistant Professor, Department of Electrical and Computer Engineering

Assistant Professor, Department of Bioengineering

Assistant Professor, Department of Biomedical Informatics

Assistant Professor, Intelligent Systems Program

University of Pittsburgh

www.imedlab.org

Characteristics of Big Data: 1-Scale (Volume)

Data Volume

44x increase from 2009 to 2020: from 0.8 zettabytes (ZB) to 35 ZB

Data volume is increasing exponentially: more and more data is being collected and generated (a quick check of these figures follows below)
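As a quick sanity check, the growth factor and the implied annual growth rate can be computed directly from the two data points quoted above; the short Python sketch below uses only the slide's own numbers.

```python
# Quick check of the volume figures on this slide: global data volume grows
# from 0.8 ZB in 2009 to 35 ZB in 2020 (both numbers taken from the slide).

start_zb, end_zb = 0.8, 35.0   # zettabytes
years = 2020 - 2009            # 11-year span

growth_factor = end_zb / start_zb        # ~43.8, i.e. the "44x" quoted above
cagr = growth_factor ** (1 / years) - 1  # implied compound annual growth rate

print(f"total growth: {growth_factor:.1f}x")   # -> 43.8x
print(f"implied CAGR: {cagr:.1%} per year")    # -> about 41% per year
```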

Characteristics of Big Data: 2-Complexity (Variety)

Various formats, types, and structures

Text, numerical, images, audio, video, sequences, time series, social media data, multi-dim arrays, etc…

Static data vs. streaming data

A single application can be generating/collecting many types of data

To extract knowledge, all of these types of data need to be linked together (a minimal linking sketch follows below)
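As an illustration of what "linking" heterogeneous data can look like in practice, here is a minimal sketch that joins numerical sensor summaries with free-text notes on a shared key. The patient IDs, field names, and values are hypothetical, and pandas is just one common tool for this kind of join.

```python
import pandas as pd

# Hypothetical illustration: link structured numerical data (sensor summaries)
# with unstructured text (clinical notes) for the same patients via a shared key.
vitals = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "mean_heart_rate_bpm": [72.0, 88.5, 61.2],
})
notes = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "clinical_note": ["no complaints", "reports dizziness", "routine follow-up"],
})

# Joining on the shared key is the simplest form of "linking" heterogeneous data
# so that downstream analysis sees both modalities at once.
linked = vitals.merge(notes, on="patient_id", how="inner")
print(linked)
```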

Characteristics of Big Data: 3-Speed (Velocity)

Data is being generated fast and needs to be processed fast

Online Data Analytics

Late decisions mean missed opportunities

Examples:

E-Promotions: based on your current location, your purchase history, and what you like, send promotions right now for the store next to you

Healthcare monitoring: sensors monitor your activities and your body; any abnormal measurement requires an immediate reaction (see the sketch below)
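The healthcare-monitoring example boils down to online (streaming) processing: act on each sample as it arrives instead of storing everything for later batch analysis. The sketch below illustrates the idea with a simulated heart-rate stream; the thresholds and values are made up for illustration.

```python
from typing import Iterable

# Minimal sketch of online (streaming) processing: handle each sample as it
# arrives instead of storing everything and analysing it later in a batch.
def monitor(heart_rates: Iterable[float], low: float = 40.0, high: float = 140.0) -> None:
    for bpm in heart_rates:  # in a real system this would be a live sensor feed
        if bpm < low or bpm > high:
            # "Immediate reaction": raise an alert the moment an abnormal value appears.
            print(f"ALERT: abnormal heart rate {bpm:.0f} bpm")
        else:
            print(f"ok: {bpm:.0f} bpm")

# Short simulated stream (values are made up for illustration).
monitor([72, 75, 150, 68, 38])
```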

Big Data: the 3 V's (Volume, Variety, Velocity)

Some make it 4 V's

The Model Has Changed…

The Model of Generating/Consuming Data has Changed

Old Model: a few companies are generating data; all others are consuming data

New Model: all of us are generating data, and all of us are consuming data

The annual growth rate of big data is 60%.

Many Business Schools call it Business Analytics rather than Big Data.

How much data?

Google processes 20 PB a day (2008)

Wayback Machine has 3 PB + 100 TB/month (3/2009)

Facebook has 2.5 PB of user data + 15 TB/day (4/2009)

eBay has 6.5 PB of user data + 50 TB/day (5/2009)

CERN’s Large Hadron Collider (LHC) generates 15 PB a year
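The per-day and per-month figures above are easier to compare once they are put on a common yearly scale; the small conversion below uses only the numbers quoted on this slide (decimal units, 1 PB = 1000 TB, are assumed).

```python
# Put the per-day/per-month ingest figures from this slide on a common yearly
# scale (decimal units assumed: 1 PB = 1000 TB).
TB_PER_PB = 1000

tb_per_day = {
    "Wayback Machine": 100 / 30,  # ~100 TB/month
    "Facebook": 15,
    "eBay": 50,
}

for name, rate in tb_per_day.items():
    pb_per_year = rate * 365 / TB_PER_PB
    print(f"{name}: ~{pb_per_year:.1f} PB/year of new data")
# For comparison, Google alone was processing about 20 PB per *day* in 2008,
# and the LHC generates about 15 PB per year.
```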

640K ought to be enough for anybody.

Computing resources are becoming much cheaper and more powerful.

In 1980, a terabyte of disk storage cost $14 million; now it costs about $30 (the size of that drop is worked out below).

Amazon or Google will “rent” a cloud-based supercomputer cluster for only a few hundred dollars an hour.
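To put the $14 million vs. $30 comparison above in perspective, the one-liner below computes the implied drop in cost per terabyte; only the slide's two figures are used.

```python
# Size of the price drop quoted above: 1 TB of disk was ~$14,000,000 in 1980
# and is ~$30 today (both figures from the slide).
cost_per_tb_1980 = 14_000_000
cost_per_tb_now = 30

drop = cost_per_tb_1980 / cost_per_tb_now
print(f"Disk storage is roughly {drop:,.0f}x cheaper per terabyte than in 1980")
# -> roughly 466,667x cheaper
```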

IBM IBV / MIT Sloan Management Review Study 2011 (copyright Massachusetts Institute of Technology 2011)

Studies show that organizations competing on analytics substantially outperform their peers:

1.6x revenue growth

2.0x EBITDA growth

2.5x stock price appreciation

Aspirometer

GOAL: To develop a computational biomarker indicative of aspiration

Gait analysis

GOAL: Using computational biomarkers to understand the effects of aging, diseases, and the environment on human gait

Transcranial Doppler (TCD)

GOAL: TCD as a novel access modality

Handwriting

GOAL: Handwriting as an early assessment tool for various diseases

Brain networks and functional outcomes

GOAL: To establish brain networks associated with functional outcomes

Swallowing

Gait

Handwriting

Thank you. Any questions?

www.imedlab.org