Big Data Challenges

Big Data Challenges Overview

  1. Big Data Business Challenges
  • Evaluating volumes of structured, unstructured, and semi-structured content.
  • Effective management of data storage and retrieval.
  • Analyzing old stored historical data and predicting future trends based on the analysis results.
  • Storage and management of content generated through social media networks such as Facebook, Twitter, YouTube, review websites, consumer forums, etc.
  • Understanding customer sentiment through social media channels.
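The last point, gauging customer sentiment at scale, can be illustrated with a deliberately simple sketch. A real pipeline would use a trained model over millions of posts; this keyword-counting version (the word lists and sample posts are invented) only shows the shape of the task:

```python
# Minimal keyword-based sentiment sketch (illustrative only; production
# systems use trained classifiers). Word lists and posts are made up.
POSITIVE = {"love", "great", "excellent", "happy"}
NEGATIVE = {"hate", "terrible", "broken", "slow"}

def sentiment(post: str) -> str:
    words = post.lower().split()
    # Score = positive hits minus negative hits.
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

posts = [
    "I love this product, the support is great",
    "The app is slow and the update is broken",
]
for p in posts:
    print(sentiment(p))
```

At big-data scale the same per-post function would simply be applied in parallel across the incoming social-media feed.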


2. Big Data Technical Challenges

  • Processing different kinds of high-volume content and data in real time from different data sources.
  • Classifying data based on its context.
  • Evaluating data and analysis results in real time.
  • Better visualization of content and data-analysis results.
  • Options for customers to create domain-specific applications.
  • Easy integration with existing database infrastructure.

 

Big Data 2012 – Big Data Analytics


Every day, corporations generate roughly fifteen petabytes of data about their business and financial operations, as well as about customers and suppliers. An enormous volume of information also circulates through social media and mobile devices. Another volume, just as impressive, is generated by the growing number of sensors and other embedded devices in the physical world: roads, vehicles, aircraft, robotic machines, and more. One second of high-definition video generates 2,000 times more bytes than a page of text. Capturing, handling, and analyzing this huge volume of information is a massive challenge.

So we arrive at an issue that is beginning to attract attention: so-called Big Data. The term refers to databases significantly larger than those usually encountered; moreover, current technologies are not well suited to handling them. Of course, that is a subjective and moving definition: a size considered big today can become small in a few years. Our home backup disks already work at the terabyte scale, and large databases are already in the petabyte range.

Treating this data analytically can generate large benefits for society and for business. Recently, the McKinsey Global Institute published a very interesting report on the economic potential of using Big Data, called “Big Data: The next frontier for innovation, competition and productivity”, which can be accessed here.

Big Data already spans all sectors of the economy. One study showed that in 2009, American companies with over a thousand employees stored, on average, over two hundred terabytes of data. And in some sectors, the average volume reached a petabyte.

Using Big Data is starting to show itself as a differentiating factor in the business landscape. Some cases cited in the McKinsey report show that firms gained substantial competitive advantages by exploring, analytically and in a timely manner, an enormous volume of data. Big Data rests on two keywords: one is volume (databases of enormous size) and the other is speed (handling and analytical treatment must be done very quickly, in some cases even in real time). This follows from the scope of data that can be handled. A conventional data warehouse collects data from transactional systems such as ERP. These systems record the operations carried out by the business, such as a sale, for instance. But they do not record information about transactions that did not occur, even though those may be reflected in discussions about the company and its products on social media. The company can also record information from the digitization of conversations between customers and call centers, and from the motion pictures recorded on video in stores. This information, typically unstructured, is already available, and what the Big Data concept does is integrate it to build a more comprehensive body of data, allowing the company to increasingly make decisions based on facts rather than just sampling and intuition.

Of course, there are still major challenges ahead. One is the technology needed to handle this huge volume of data quickly. There are technologies aimed at treating large volumes, such as Hadoop, and specific database systems such as Cassandra, an open-source system used today by Facebook, Twitter, and Reddit, which must deal with huge volumes of data at speed and in a distributed manner. Its largest operational environment spans over one hundred terabytes in a cluster of one hundred and fifty servers. Another interesting technology is an appliance geared to handling large databases, such as Netezza, recently acquired by IBM.
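Hadoop's core programming model, MapReduce, can be sketched in a few lines. This in-memory version is illustrative only; an actual Hadoop job distributes the map, shuffle, and reduce phases across a cluster of machines:

```python
# In-memory sketch of the MapReduce model behind Hadoop (illustrative only;
# real jobs run the three phases distributed across many servers).
from collections import defaultdict

def map_phase(document: str):
    # Map: emit a (word, 1) pair for every word in the document.
    for word in document.lower().split():
        yield word, 1

def shuffle(pairs):
    # Shuffle: group all emitted values by key, as the framework
    # does between the map and reduce phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data is big", "data is the future"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
print(reduce_phase(shuffle(pairs)))
```

The same three-phase structure is what lets Hadoop scale a simple counting job to the hundred-terabyte clusters mentioned above.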

Another technology with a lot of room to grow in the Big Data space is called stream computing. IBM recently announced InfoSphere Streams, based on a research project at IBM Research called System S. The paper on System S can be seen at this link (pdf).

The idea of stream computing brings a new paradigm, and it's fascinating. In the traditional data-mining model, an enterprise filters data from its various systems and, after building a data warehouse, fires off queries. In practice, it is mining static data that reflects not the present moment but the context of hours, days, or even weeks past. With stream computing, this mining is done in real time. Rather than firing queries at a static database, a continuous stream of data (streaming data) flows across a set of standing queries. We can think of applications in finance, health, and even manufacturing. Consider this last example: a project with a semiconductor company monitors the fault detection and classification process in real time. With stream computing, failures in manufactured chips are detected in minutes, not hours or even weeks. Defective wafers can be reprocessed and, more importantly, real-time adjustments can be made to the manufacturing processes themselves.
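The contrast with static queries can be sketched as a standing query that fires as data arrives. The window size, fault threshold, and readings below are invented for illustration; they are not taken from System S or InfoSphere Streams:

```python
# Sketch of the stream-computing idea: a standing query evaluated
# continuously over arriving data, instead of a one-off query over a
# static warehouse. Threshold, window, and readings are invented.
from collections import deque

def fault_alerts(readings, window=5, threshold=3):
    """Yield an alert as soon as `threshold` faults appear within the
    last `window` readings; the query runs as each reading arrives."""
    recent = deque(maxlen=window)
    for i, is_fault in enumerate(readings):
        recent.append(is_fault)
        if sum(recent) >= threshold:
            yield f"alert at reading {i}: {sum(recent)} faults in last {len(recent)}"

stream = [0, 1, 0, 1, 1, 0, 0, 0]   # 1 = defective wafer reading
for alert in fault_alerts(stream):
    print(alert)
```

Because the condition is checked on every arriving reading, a fault cluster raises an alert within the window itself, rather than being discovered by a batch query hours later.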

 

What Is Big Data – How Really Big Is “Big Data”?


Digital data is growing day by day, and present technology is not sufficient to manage its rapid growth. Data is no longer simple data; it is now “big data”.

Computers are processing huge amounts of data, and thousands of new applications are created daily to extract information and link it with other data on the web. Every application is producing more and more data.

What is BIG DATA?

Scientific data.
Atomic and physics experiments that generate 40 TB/s.
Climate information.
Posts and updates on social media sites.
Digital pictures and videos.
Daily purchase transaction records.
Cell phone GPS signals.
Rapid adoption of GPS devices.

All of the above can be loosely classified as big data. These applications will grow very rapidly by the end of this decade and will generate more than 50% of all of today's data. Now you can imagine why we need the big data concept. We need proper cloud technology for storage, and new software tools to capture, manage, and process dozens of terabytes to many petabytes of data, covering sentiment analysis, other analytics, and diverse data structures.

Critical Application Areas Of Big Data

Marketing optimization.
Risk analytics.
Hardware optimization.
Customer transactions to customer interaction.
Fraud detection.
Visualizations.
Human patterns.
Real-time data integration.
Comparative intelligence.
Smart decision making.

How really big is “Big Data” ?

Today
1 TB of health-care images cannot be displayed in real time.
A 1 PB 3D movie cannot be edited in real time.
A 40 MB PPT cannot be sent via email.
Deep sophistication, high volume, low latency.

Tomorrow: BIG DATA, “LESS ABOUT THE SIZE AND MORE ABOUT THE FREEDOM”
Size, speed, volume, source, low cost.
More precision, no ACID, lower cost, high productivity.

Big Data Domains

1. The health care sector, along with the public sector in the EU and retail in the USA.
2. Industry and business functions.
3. Making better management decisions.
4. Customer segmentation.
5. Fraud detection and prevention.
6. Deep real-time information processing.
7. Productivity growth and effective consumer care.
8. Effective information management in customer service, finance, insurance, and government.
9. Effective policy management related to privacy, security, intellectual property, and even liability.
10. Effective management of critical, complex technologies.
11. Effective data mining and storage of complex data without ACID guarantees.
12. Effective IT management with deep analytical skills.

Conclusion – “BIG DATA” is the future :-)