INTEL’S BIG DATA STRATEGY
The number of connected devices is equal to the current global population, and is expected to double by 2015. The increase is spurred largely by billions of networked sensors and intelligent systems – also known as the Internet of Things (IoT). From the proliferation of mobile devices such as smartphones and tablets to RFID readers and sensors, people and machines are producing data at exponential rates.
In fact, Intel estimates that the world generates one petabyte of data every 11 seconds – the equivalent of 13 years of consecutive high-definition video. The term “big data” commonly refers to this explosion of data – characterized by its volume, variety and velocity – that offers the potential to enrich our lives through new scientific discoveries, business models and consumer experiences.
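As a rough sanity check on that equivalence, a few lines of arithmetic show that 13 years of continuous HD video does land near one petabyte. The ~20 Mbit/s HD bitrate used here is an assumption for illustration, not a figure from the source:

```python
# Rough arithmetic behind the "1 PB ≈ 13 years of HD video" equivalence.
# The ~20 Mbit/s HD bitrate is an assumed value, not from the source.
HD_BITRATE_BPS = 20_000_000            # assumed bits per second of HD video
SECONDS_PER_YEAR = 365.25 * 24 * 3600
YEARS = 13

total_bits = HD_BITRATE_BPS * SECONDS_PER_YEAR * YEARS
total_bytes = total_bits / 8
petabytes = total_bytes / 1e15

print(f"{petabytes:.2f} PB")           # prints 1.03 PB
```

At lower bitrates the figure shrinks proportionally, but the order of magnitude holds.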
Intel’s position is that every individual and organization in the world should be able to unlock the intelligence available in big data. The company aims to address the cost, complexity and confidentiality concerns associated with managing, storing and securing massive amounts of data.
Intel is addressing these concerns by delivering open data management and analytics software platforms including the Intel Distribution of Apache Hadoop* software (Intel Distribution) and the Intel Enterprise Edition for Lustre* software.
• Intel Distribution of Apache Hadoop software - Hadoop is an open source framework for storing and processing large volumes of diverse data on a scalable cluster of servers. The Intel Distribution is the first to provide complete encryption with support for Intel® AES New Instructions in the Intel® Xeon® processor. By incorporating silicon-based encryption support into the Hadoop Distributed File System*, organizations can now analyze their data sets more securely without compromising performance. Optimizations for the networking and IO technologies in the Intel Xeon processor platform also enable new levels of performance: analyzing one terabyte of data, which previously took more than four hours to fully process, can now be done in seven minutes thanks to the data-crunching combination of Intel’s hardware and the Intel Distribution.
• Intel Enterprise Edition for Lustre software - Lustre is an open source parallel distributed file system and key storage technology that ties together data and enables extremely fast access. The Intel Enterprise Edition for Lustre software is a validated and supported distribution of Lustre featuring management tools as well as a new adaptor for the Intel Distribution. When paired with the Intel Distribution, the Intel Enterprise Edition for Lustre software allows Hadoop to run on top of Lustre, significantly improving the speed at which data can be accessed and analyzed. This allows users to access data files directly from the global file system at faster rates and speeds up analytics, providing more productive use of storage assets as well as simpler storage management.
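The split/map/shuffle/reduce model that Hadoop applies across a cluster of servers (first bullet above) can be sketched in miniature on a single machine. The function names below are illustrative only, not Hadoop APIs:

```python
from collections import defaultdict

def map_phase(chunk):
    # Emit (word, 1) pairs, as a Hadoop mapper would for a word count.
    return [(word, 1) for word in chunk.split()]

def shuffle(pairs):
    # Group values by key; Hadoop performs this step across the cluster network.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts per word, one reducer call per key.
    return {key: sum(values) for key, values in groups.items()}

# Each chunk stands in for a block of a file stored in HDFS.
chunks = ["big data big", "data big cluster"]
pairs = [pair for chunk in chunks for pair in map_phase(chunk)]
print(reduce_phase(shuffle(pairs)))  # {'big': 3, 'data': 2, 'cluster': 1}
```

In a real deployment the chunks live on different nodes and the map and reduce phases run in parallel; the framework's value is in scheduling, shuffling and recovering from failures, not in the per-record logic.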
Intel is committed to contributing its enhancements to both the Apache Hadoop and Lustre code bases back to the open source community. The aim is to provide the industry with a better foundation from which it can push the limits of innovation and realize the transformational opportunity of big data.
Investments, led by Intel Labs, are driving academic research in data-intensive computing platforms, machine learning, parallel algorithms, visualization and computer architecture. Intel has established an Intel Science and Technology Center (ISTC) for Big Data. It is hosted by the Massachusetts Institute of Technology’s (MIT) Computer Science and Artificial Intelligence Laboratory (CSAIL). The objective of the research center is to encourage new data-intensive user experiences that accelerate the pace of discoveries across science, medicine and industry.
Intel has also released Intel® Graph Builder for Apache Hadoop* software (GraphBuilder), a beta open source library that helps scientists in industry and academia rapidly develop new applications by constructing graphs from large data sets to visualize the relationships within the data. Developed by Intel Labs, which remains at the forefront of advanced analytics research, GraphBuilder is one example of Intel’s continued investment of research and capital to advance the big data ecosystem.
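The core idea behind GraphBuilder (deriving a graph of relationships from raw records) can be sketched as follows. The record layout and helper code here are purely illustrative and are not the GraphBuilder API:

```python
from collections import defaultdict

# Toy records: (paper, author) rows standing in for raw tabular data.
records = [
    ("p1", "alice"), ("p1", "bob"),
    ("p2", "bob"), ("p2", "carol"),
]

# Build a co-authorship graph: authors are vertices,
# and two authors share an edge when they share a paper.
by_paper = defaultdict(set)
for paper, author in records:
    by_paper[paper].add(author)

edges = set()
for authors in by_paper.values():
    for a in authors:
        for b in authors:
            if a < b:                # keep each undirected edge once
                edges.add((a, b))

print(sorted(edges))  # [('alice', 'bob'), ('bob', 'carol')]
```

GraphBuilder performs this kind of extraction at scale as MapReduce jobs over Hadoop, so the graph can be built from data sets far too large for one machine.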
Intel Capital continues to make major investments in disruptive big data analytics technologies, including MongoDB company 10gen and big data analytics solution provider Guavus Analytics.
DRIVING BIG DATA ADOPTION IN THE REGION
Intel has identified multiple routes to market for its big data solutions: system integrators (SIs), independent software vendors (ISVs), original equipment manufacturers (OEMs) and training partners form the basis of this go-to-market plan.
By partnering with all key segments of the technology ecosystem (SIs, ISVs and OEMs), Intel aims to equip these partners with the capability to address their customers’ big data and cloud challenges and to create new revenue streams. By working with training partners, Intel plans to engage and educate customers about the use-case scenarios and opportunities big data holds for a range of businesses.
The recently launched Big Data Innovation Center in Singapore is further evidence of Intel’s commitment to driving big data adoption in Asia. The center, a collaborative project between Intel, Dell, and Revolution Analytics, is the first major effort targeted at unlocking the full value of the growing market for big data analytics in Asia Pacific. The establishment brings together expertise from all three organizations to provide customers with extensive training programs, proof-of-concept capabilities and solution development support, thereby equipping participants with the skills required to improve the quality of data mining across a wider range of platforms and a larger pool of sources.
Speaking about the collaborative project at its launch, Boyd Davis, VP & GM of Intel’s Datacenter Software Division, said:
“Intel is committed to providing the industry with a better foundation from which it can push the limits of innovation and realize the transformational opportunity of big data. The collaboration with Dell and Revolution Analytics is a realization of our vision for helping customers, organizations, governments and ISVs in the Southeast Asia region to maximize the growing market opportunities and drive new projects forward.”
DRIVING CLOUD ADOPTION IN THE REGION
CASE STUDY: Quanta Computer
Earlier this year, as part of their efforts to define the next generation of rack technologies used to power the world's largest data centers, Intel and Facebook unveiled a rack architecture from Quanta Computer to show the total cost, design and reliability improvement potential of disaggregation.
The disaggregated rack architecture includes Intel's new photonic architecture, based on high-bandwidth, 100Gbps Intel® Silicon Photonics Technology, which enables fewer cables, increased bandwidth, farther reach and extreme power efficiency compared to today's copper-based interconnects. Silicon photonics is a new approach that uses light (photons) to move huge amounts of data at very high speeds with extremely low power over a thin optical fiber, rather than electrical signals over a copper cable. Intel has spent the past two years proving its silicon photonics technology was production-worthy, and has now produced engineering samples.
By separating critical components from one another, each computing resource can be upgraded on its own cadence without being coupled to the others. This increases the lifespan of each resource and enables IT managers to replace just that resource instead of the entire system. The increased serviceability and flexibility drives improved total cost for infrastructure investments as well as higher levels of resiliency, and allowing more optimal component placement within a rack also creates thermal efficiency opportunities. The mechanical prototype demonstrates Intel's photonic rack architecture for interconnecting the various resources, showing one of the ways compute, network and storage resources can be disaggregated within a rack.
Mike Yang, general manager of Quanta QCT, said:
“Data center and network infrastructure providers are under constant pressure to support new, revenue-generating services in the public and private cloud, yet the costs of building the infrastructure are often too high to do so. Working with Intel, we’ve successfully shown the business value that disaggregated rack architectures deliver, specifically in response to cost, design, and reliability. With data volumes continuing to grow in the coming years, the architecture will be a key component enabling users to manage the expansion and subsequent requirements from their cloud deployments.”
COLLABORATING WITH INDUSTRY PARTNERS
Intel and SAP® have extended their decade-long collaboration to bring their combined hardware and software expertise to help businesses meet challenges ahead. The results of their joint engineering efforts are helping businesses accelerate innovation and deliver better value to their customers.
With a large proportion of SAP implementations, such as the SAP® Business Suite, now running on Intel platforms, businesses are transforming. The increasing adoption of SAP HANA® and SAP Business Suite powered by SAP HANA, coupled with the Bring Your Own Device (BYOD) phenomenon in enterprise mobility, is accelerating that transformation. These developments are made possible by efficient scalability to cope with demand fluctuations, robust capabilities to reduce downtime, and real-time, unprecedented productivity and cost efficiency from a wide variety of solutions across client devices, data centers and the cloud.
• SAP HANA offers real-time business intelligence by accelerating analytics, simplifying the underlying infrastructure and operational framework, and providing consistency across private and public clouds to deliver highly optimized platforms. Because it is data-agnostic, it can be implemented without affecting existing applications or systems – a flexible, cost-effective, real-time approach to managing large data volumes that allows organizations to dramatically reduce the hardware and maintenance costs associated with running multiple data warehouses and analytical systems.
• Intel-SAP solutions deliver industry-leading performance: (1) Intelligent performance, delivering superior analytical capabilities on a smaller server footprint, with processor operation adjusting dynamically to the needs of the workload; (2) Automated energy efficiency, automatically regulating power consumption and intelligently adjusting performance according to application needs; (3) Advanced reliability, making the data center secure and reliable through advanced encryption and virtualization; and (4) Unparalleled mobility, enabling large enterprises to connect mobile devices to the company network, simplifying IT tasks while reducing the TCO associated with buying, deploying, securing and maintaining such devices.
Intel and SAP focus on ensuring that tomorrow’s solutions interoperate smoothly with existing customer implementations, and with current and future SAP applications. With Intel and SAP, businesses achieve greater performance, improved efficiency and cost management, and enhance their position in the SaaS market.
Intel and Red Hat have been supporting mission-critical computing environments for years, helping companies optimize their cost models while maintaining the highest levels of performance and reliability.
With recent breakthroughs at both the hardware and software levels, Intel and Red Hat are delivering exceptional new support for large-scale data center consolidation and for hosting today’s most demanding and mission-critical database and transactional workloads on powerful, resilient enterprise-class servers.
These advances mark a true tipping point in the computing industry. Leading enterprise software vendors are certifying their entire solution stacks on Red Hat Enterprise Linux running on Intel Xeon processor-based servers.
Enterprise IT organizations can now address their most demanding computing requirements on an affordable, standards-based computing infrastructure, and they can do so at a fraction of the cost of UNIX/RISC architectures. With these new solutions, the reasons for continuing to invest in costly, proprietary computing solutions have all but disappeared, even for the largest and most mission-critical enterprise applications.
Speaking about the partnership, Frank Feldmann, director of marketing, Red Hat, said:
“Intel continues to be a phenomenal partner to Red Hat and our collaboration has led to many industry advances that empower our customers to operate large-scale, standards-based infrastructure with the reliability and stability their businesses require. Through extensive collaboration, Intel and Red Hat are delivering breakthrough capabilities for mission-critical computing, extending the exceptional value of Red Hat Enterprise Linux and Intel Xeon processor family combined into the core of the enterprise data center.
Together, Intel and Red Hat are driving innovation into the Linux ecosystem to establish a powerful, open and unified foundation for the computing stack, including virtualization and storage.”
The Taiwan Stock Exchange
The Taiwan Stock Exchange Corporation (TWSE) is a financial institution operating as a stock exchange that provides trading for 758 listed companies in Taiwan. Its primary business drivers include developing new financial products and boosting the number of services it offers. To address its changing business needs and increase the overall trust and security of its cloud infrastructure, it undertook a joint proof of concept (PoC) using Intel® Trusted Execution Technology (Intel® TXT), Cisco Unified Computing System* (UCS*) servers, and software solutions from HyTrust, McAfee, and VMware.
From this initial proof-of-concept deployment, TWSE expects many other business units to be able to use cloud infrastructures more effectively to increase business agility, reduce costs, and improve asset utilization without compromising security.
Monday, August 26, 2013