M2M not facing big data challenge, yet

The big data deluge has yet to hit the machine-to-machine (M2M) industry as connected devices are still low in number, note market observers, who anticipate the challenge will come soon and affect M2M at the network, application and storage layers.

By Liau Yun Qing, ZDNet Asia

Sydney-based Telsyte analyst Alvin Lee explained that in Australia, for instance, there are currently under 1 million M2M connections, so the traffic generated by such connections is still “very manageable” when companies know the requirements of each connection.

Rodney Gedda, senior analyst at Telsyte, added that the scale of big data management arising from M2M will depend heavily on the type of application. For example, if an M2M application collects vast volumes of data from different points or locations, data collection and management on a big data scale can be a challenge, Gedda said in an e-mail.

According to Ian Koh, practice head of industry verticals at Ericsson Southeast Asia, big data challenges can occur through the growing number of data sources from sensors, smart devices and other Internet-connected devices, as well as through the increase in the size of M2M data being transmitted.

Koh noted that the size of data relayed from M2M connections is still relatively small today, with some exceptions for security-related applications such as alarm systems and vehicle-tracking systems for passenger cars and commercial vehicles.

Alcatel-Lucent, for one, is expecting challenges ahead in M2M and big data, as studies by the company’s research arm, Bell Labs, determined that global data traffic would grow 30-fold over the next five years.

Said Philippe Gerard, CTO for Alcatel-Lucent’s Singapore and Brunei operations:

“We’re clearly seeing new requirements for applications such as the need to correlate data from multiple sensors together with data from smart phones and smart devices.”

He added that there will be a snowball effect as the number of smart devices and sensors increases, leading to more applications and an “impending data traffic tsunami”.
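To illustrate the kind of correlation Gerard describes, here is a minimal Python sketch that pairs hypothetical sensor readings with smartphone reports arriving around the same time. The field names, sample values and time window are assumptions for illustration only, not part of any vendor’s platform.

```python
from datetime import datetime, timedelta

# Hypothetical readings; field names are illustrative, not from any specific M2M platform.
sensor_readings = [
    {"sensor_id": "temp-01", "ts": datetime(2012, 6, 1, 9, 0, 5), "value": 31.2},
    {"sensor_id": "temp-02", "ts": datetime(2012, 6, 1, 9, 0, 7), "value": 29.8},
]
phone_locations = [
    {"device_id": "phone-42", "ts": datetime(2012, 6, 1, 9, 0, 6), "lat": -33.87, "lon": 151.21},
]

def correlate(sensors, phones, window=timedelta(seconds=5)):
    """Pair each sensor reading with phone reports seen within a small time window."""
    pairs = []
    for s in sensors:
        for p in phones:
            if abs(s["ts"] - p["ts"]) <= window:
                pairs.append((s["sensor_id"], p["device_id"], s["value"]))
    return pairs

print(correlate(sensor_readings, phone_locations))
```

Even this toy pairing grows quickly as devices multiply, which is the snowball effect Gerard points to.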

M2M bottlenecks in big data challenge

When the big data challenge arrives, bottlenecks can appear at the network, application and storage layers, according to Gedda.

For M2M applications running on mobile networks, the amount of data transferred will be limited by the availability, cost and capacity of network bandwidth, the Telsyte analyst explained.

Lee added that as data volumes grow within a carrier’s network, traffic prioritization will become a challenge for operators, since more sophisticated logic will need to be applied to better manage traffic generated by users and M2M applications.
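As a simplified picture of what such prioritization logic might look like, the Python sketch below queues packets by an assumed set of traffic classes so that, for example, an alarm trigger is sent ahead of routine meter telemetry. The classes and weights are illustrative assumptions, not any operator’s actual policy.

```python
import heapq

# Illustrative priority weights; real operator policies would be far more granular.
PRIORITY = {"emergency_m2m": 0, "user_voice": 1, "user_data": 2, "bulk_m2m_telemetry": 3}

class TrafficScheduler:
    """Toy priority queue: lower number = sent first."""
    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker keeps FIFO order within a class

    def enqueue(self, packet, traffic_class):
        heapq.heappush(self._queue, (PRIORITY[traffic_class], self._seq, packet))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._queue)[2]

sched = TrafficScheduler()
sched.enqueue("meter reading #1042", "bulk_m2m_telemetry")
sched.enqueue("car alarm trigger", "emergency_m2m")
print(sched.dequeue())  # the alarm goes out before the meter reading
```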

Gerard added that current network infrastructures were not designed to support challenges introduced by big data. “There is a need for more flexibility and control to be made available [to support] the new applications and new devices capabilities,” he said.

He added that there is a need for flexible network APIs (application programming interfaces), which, he noted, can provide a good business model for operators, since third-party app developers will then be able to drive innovation based on market demand in a fast and flexible way.
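As a rough illustration of how such a network API might be consumed by a third-party developer, the Python sketch below posts a bandwidth request to a hypothetical operator endpoint. The URL, payload fields and authentication scheme are all assumptions; no specific carrier API is described in the article.

```python
import requests  # assumes the requests library is installed

# Hypothetical operator API endpoint; no specific carrier's interface is implied.
OPERATOR_API = "https://api.example-operator.com/v1/qos-sessions"

def request_priority_bandwidth(device_id: str, kbps: int, api_key: str):
    """Ask the network, via an exposed API, for a temporary guaranteed-bandwidth session."""
    payload = {"deviceId": device_id, "guaranteedKbps": kbps, "durationSeconds": 300}
    resp = requests.post(
        OPERATOR_API,
        json=payload,
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. a session ID the application can later release
```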

For organizations leveraging M2M, Gedda noted that challenges associated with big data will more likely be at the application and storage layers, especially for businesses that are not used to dealing with “massive amounts” of data.

“Existing relational database systems may not scale to the level required of big data; therefore, investments in distributed server infrastructure or cloud services to handle the storage and processing may be required,” he added.
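As a rough sketch of the horizontal scaling Gedda alludes to, the Python snippet below spreads incoming readings across several storage nodes by hashing the device ID, rather than writing everything into a single relational table. The node count and hashing scheme are illustrative assumptions, not a specific product’s design.

```python
import hashlib
from collections import defaultdict

# Toy illustration of horizontal partitioning: readings are spread across
# N storage nodes by device ID instead of landing in one relational table.
NUM_NODES = 4
nodes = defaultdict(list)  # stand-ins for separate storage servers or cloud buckets

def node_for(device_id: str) -> int:
    """Hash-based shard selection; a real system would also handle rebalancing."""
    digest = hashlib.md5(device_id.encode()).hexdigest()
    return int(digest, 16) % NUM_NODES

def store_reading(device_id: str, timestamp: str, value: float):
    nodes[node_for(device_id)].append((device_id, timestamp, value))

for i in range(1000):
    store_reading(f"meter-{i % 50}", f"2012-06-01T09:{i % 60:02d}:00", 230.0 + i % 5)

print({node: len(readings) for node, readings in nodes.items()})
```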

To prepare for big data challenges, Gedda noted that the M2M industry needs to highlight data processing requirements of various applications so that businesses are not caught ill-prepared by big data demands after they have commenced M2M projects.

Ericsson’s Koh agreed, adding that standardization and industry collaboration are critical for the M2M industry to be better prepared to address big data. On this front, Ericsson is involved with various standardization bodies, including 3GPP (Third-Generation Partnership Project) for efforts associated with network infrastructure and optimization for connected devices, the IPSO Alliance for a unified interface to gateways and modules, and the European Telecommunications Standards Institute (ETSI) architecture for service brokering, discovery and composition.

Gedda said M2M vendors will also need to integrate their technology with suitable storage infrastructure that can scale as data capture scales.

Source: ZDNetAsia
