Intel, which has been increasingly customising server chips for individual customers, is now tuning its chips for Big Data workloads.
Software is becoming an important building block in chip design, and customisation will help applications gather, manage and analyse data much more quickly, said Ron Kasabian, general manager of Big Data solutions at Intel.
Through hardware and software improvements, the company is trying to work out how its chips can perform better in areas such as predictive analytics, cloud data collection and the processing of specific tasks. Intel has already released its own distribution of Hadoop, the open-source framework for distributed storage and processing of large data sets, and chip improvements are now in the pipeline.
Kasabian said Intel is starting with the software. "It takes a while to get silicon to market," he said. "We understand where we can optimise for silicon, and there are certain things to improve for performance and optimisation."
The company is taking lessons from software implementations and then looking to enhance the silicon to fill any software gap, Kasabian said, adding that the chip-design process takes about two years.
Server makers have been customising servers specifically to carry out Big Data workloads, and improvements at the chip and instruction-set level could speed up task execution.
The plan includes developing accelerators or cores tailored to Big Data workloads. For example, Intel is working with the Chinese company Bocom on the Smart City project, which aims to tackle counterfeit license plates in China by recognising plates as well as car makes and models. The project involves sending images through server gateways, and Intel is looking to fill software gaps by enhancing the silicon; one improvement could be adding accelerators to decode video, Kasabian said.
Intel has a big software organisation, and the appointment earlier this year of Renee James, formerly head of its software unit, as the company's president was a sign of the chip maker's intent to dig deeper into software. The company does not want to become a packaged software distributor, but it does want software to work better on Intel architecture hardware. Intel has long backed open source software and has hundreds of coders contributing to the development of Linux.
Different industries have different implementations of Big Data, Kasabian said. For example, a Big Data problem in genomics could differ from one in telecommunications.
Intel is also entering the Internet of Things space, an emerging field in which networked devices with embedded processors and sensors serve as data-gathering instruments. The company has assets to draw on, such as McAfee's software and hardware platform and Wind River's real-time operating system for embedded chips, which can help devices collect data securely and process it quickly.
Outside of the silicon, Intel is focusing on providing the right software tools for data centres. Hadoop was the starting point, and now Intel is looking closely at analytics, Kasabian said.
Attaching Intel's name to Hadoop will "kind of ease the mind of folks in enterprises," Kasabian said, adding that this will make the platform easier to deploy in data centres.
A lot of research is also taking place at Intel labs on stream processing and graph analytics as the company designs chips and tweaks software.
"We're looking at all the big industry categories," Kasabian said.