Backing Up Big Data? Chances Are You’re Doing It Wrong

By Peter Smails

The increasing pervasiveness of social networking, multi-cloud applications, and Internet of Things (IoT) devices and services continues to drive exponential growth in big data solutions. As businesses become more data-driven, larger and more current data sets become essential to supporting online business processes, analytics, intelligence, and decision-making. Data availability and integrity also grow increasingly critical as more businesses and their partners rely on (near) real-time analytics and insights to run their operations. These big data solutions are typically built on a new class of hyper-scale, distributed, multi-cloud, data-centric applications.

While these NoSQL, semi-structured, highly distributed data stores are well suited to handling vast amounts of big data across large numbers of systems, they can no longer be effectively supported by legacy data management and protection models. A different approach to backup and recovery is needed, not only because of the sheer data size and the vast number of storage and compute nodes, but also because of built-in data replication, distribution, and versioning capabilities. Even though these next-generation data stores integrate high availability and disaster recovery capabilities, events such as logical data corruption, application defects, and simple user error still require another level of recoverability.

To meet the requirements of these high-volume, real-time applications in scale-out, cloud-centric environments, a wave of new data stores and persistence models has emerged. Gone are the days of just files, objects, and relational databases. The next-generation key-value stores, XML/JSON document stores, wide-column stores, and graph databases (often characterized as NoSQL stores) share several fundamental characteristics that enable big-data-driven IT. Almost without exception, big data repositories are based on a cloud-enabled, scale-out, distributed data persistence model that leverages commodity infrastructure while providing some form of integrated data replication, multi-cloud distribution, and high availability. The big data challenges aren't limited to data ingest, storage, processing, queries, result-set capture, and visualization; they also pose increasing difficulties around data integrity, availability, recoverability, accessibility, and mobility. Let's see how this plays out in a couple of case studies.


The first case study revolves around an Identity and Access Management service provider that uses Cassandra as its core persistence technology. The IDaaS (Identity as a Service) offering is a multi-tenant service with a mixture of large enterprise, SMB, and developer customers and partners. The Cassandra database provides a highly scalable, distributed, highly available data store that supports per-tenant custom user and group profiles (i.e., dynamically extensible schemas). While the data set may not be very large in absolute storage size, the number of records will easily run into the tens, if not hundreds, of millions.

What drives the unique recoverability requirements are the multi-tenancy and the 100% availability targets of the service. Whether due to user error, data integration defects and changes, or simply tenant migrations, it may be necessary to recover a single tenant's data set without restoring the whole Cassandra cluster (or a replica thereof) just to restore one tenant instance. Similarly, the likelihood that the complete Cassandra cluster is corrupt is slim, and to maintain (close to) 100% availability for most tenant service instances, partial recovery is required. This drives the need for application-aware protection and recovery: the solution must establish and persist semantic knowledge of the application data in order to recover specific, consistent Cassandra table instances or points in time.
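As a toy illustration of the idea (names and data structures invented here, not Cassandra's actual API), keeping per-tenant, point-in-time protection copies makes it possible to roll one tenant back while every other tenant stays online and untouched:

```python
from copy import deepcopy

class TenantBackupStore:
    """Keeps per-tenant, point-in-time copies of a multi-tenant data set,
    so a single tenant can be restored without touching the others."""

    def __init__(self):
        self._versions = {}  # tenant_id -> list of snapshots

    def backup(self, tenant_id, tenant_rows):
        # Deep-copy so later mutations cannot corrupt the protection copy.
        self._versions.setdefault(tenant_id, []).append(deepcopy(tenant_rows))

    def restore(self, tenant_id, version=-1):
        return deepcopy(self._versions[tenant_id][version])

# Simulated multi-tenant table: tenant_id -> rows
db = {"acme": [{"user": "alice"}], "globex": [{"user": "bob"}]}

store = TenantBackupStore()
for tenant, rows in db.items():
    store.backup(tenant, rows)

# A defect corrupts one tenant's data ...
db["acme"] = []

# ... and only that tenant is rolled back; "globex" is never taken offline.
db["acme"] = store.restore("acme")
print(db["acme"])  # [{'user': 'alice'}]
```

A real implementation would of course read Cassandra SSTables or query via CQL rather than hold rows in memory, but the recovery granularity is the point: tenant by tenant, not cluster by cluster.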

The second case study centers on a Hadoop clustered storage solution, in which the enterprise application set persists time-series data from devices and their end users' activities in the Hadoop filesystem. The Hadoop storage acts as a de facto "data lake" fed from multiple diverse data sources in different formats, on which the enterprise can apply various forms of data processing and analysis through MapReduce batch processing, real-time analytics, streaming, and in-memory queries and transformations. Even though a MapReduce job creates ephemeral intermediate and end results that could in principle be recreated by re-running the job after a failure or corruption, the data set can be too large (and therefore too expensive to reprocess) and is under constant update.

Even though Hadoop provides replication and erasure coding (for high availability and scale-out), there is no real data versioning, or snapshots for that matter (given the originally ephemeral model of MapReduce processing). Any logical error, application or service failure, or plain user error could result in data corruption or data loss, whether to the originally ingested data, any intermediate ephemeral data or data streams, or any resulting data sets, database instances, and tables. Rather than creating a full copy of the Hadoop filesystem for backup and recovery of intermediate files and database tables (which would be cost prohibitive and/or too time consuming), a different approach is needed, one that requires a better understanding of the application data sets and their schemas, semantics, dependencies, and versioning.
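One way to avoid repeatedly copying the full data lake is incremental, content-based protection: only files whose content has changed since the last backup are copied. A minimal sketch, using in-memory dictionaries as stand-ins for the Hadoop filesystem and the backup repository:

```python
import hashlib

def file_digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def incremental_backup(files, repo):
    """Copy into `repo` only the files whose content changed since the
    previous backup, instead of re-copying the entire data set."""
    copied = []
    for path, data in files.items():
        digest = file_digest(data)
        if repo.get(path, (None, None))[0] != digest:
            repo[path] = (digest, data)
            copied.append(path)
    return copied

repo = {}
files = {"/lake/raw/day1.csv": b"a,b\n1,2\n", "/lake/raw/day2.csv": b"a,b\n3,4\n"}
print(incremental_backup(files, repo))  # first run: both files are copied

files["/lake/raw/day2.csv"] = b"a,b\n3,5\n"  # one file changes
print(incremental_backup(files, repo))  # second run: only the changed file
```

This captures the cost argument, not the semantics: the application-aware layer the article calls for would additionally track which tables, schemas, and job outputs those files belong to, so they can be restored as consistent units.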

Looking at both case studies, there is a common thread among them driving the need for a different approach to data management, and specifically to backup and recovery:

  • Both Cassandra and Hadoop provide integrated replication and high-availability support. Neither capability, however, provides sufficient protection, if any, against full or partial data corruption or data loss (whether human, software, or system initiated). A truly application-data-aware backup is needed to support recovery of specific files, tables, tenant data, intermediate results, and/or versions thereof.
  • A storage-centric (file or object infrastructure) backup solution, however, is not really feasible. The data set is either too large to be repeatedly copied in full, or a full data set requires too large an infrastructure to recover in its entirety just to extract specific granular application data items. In addition, storage-centric backups (filesystem snapshots, object copies, volume image clones, or otherwise) provide no insight into the actual data sets or data objects the application depends on. On top of the fully recovered storage repository, an additional layer of reverse-engineered application knowledge would be required as well.
  • Application downtime is more critical now than ever. In both case studies, multiple consuming services or clients depend on the scale-out service and its persistence. Whether it is a true multi-tenant usage pattern or a multitude of diverse data processing and analytics applications, the data set needs to be available close to 100% of the time. Moreover, a full data set recovery would simply take too long, and end users or clients would incur too much downtime. Only specific, partial data recovery can support the required SLAs.

The requirement for an alternative data management and recovery solution is not limited to the Cassandra and Hadoop case studies described above. Most big data production deployments ultimately require a data protection and recovery solution that supports incremental backup and specific partial or granular recovery. More importantly, the data copy and recovery processes must acquire semantic knowledge of the application data in order to capture consistent copies with proper integrity and recoverable granularity. This allows big data DevOps and/or production operations teams to recover just the data items they need, without performing a full big data set recovery on alternate infrastructure. For example, the data recovery service must be able to expose data items in the appropriate format (e.g., Cassandra tables, Hadoop files, Hive tables) and within a specific application context. At the same time, the protection copies must be distributable across on-premises infrastructure as well as public cloud storage, both to leverage cost-effective protection storage tiering and scaling and to support recovery onto alternate cloud infrastructure.

A solution that provides big data protection and recovery in a granular, semantically aware way not only addresses "big data backup" properly, but also creates opportunities to extract and use data copies for other purposes. For example, the ability to extract application-specific copies of critical parts of the big data set lets other users efficiently obtain downstream data sets for test and development, data integrity tests, in-house analytics, third-party analytics, or potential data market offerings. Combined with multi-cloud data distribution, this brings us closer to a multi-cloud data management solution that addresses today's and tomorrow's needs for application and data mobility, as well as their full monetization potential.

About the author: Peter Smails is vice president of marketing and business development at Datos IO, provider of a cloud-scale, application-centric data management platform that enables organizations to protect, mobilize, and monetize their application data across private cloud, hybrid cloud, and public cloud environments. A Dell EMC veteran, Peter brings a wealth of experience in data storage, data protection and data management.

Related Items:

Big Data Begets Big Storage

Data Recovery Gets Speed, Security Boost

The post Backing Up Big Data? Chances Are You’re Doing It Wrong appeared first on Datanami.

Read more here:: www.datanami.com/feed/

IoT needs to be secured by the network

By Jon Gold

Everyone who has a stake in the internet of things, from device manufacturers to network service providers to implementers to customers themselves, makes important contributions to the security or lack thereof in enterprise IoT, attendees at Security of Things World were told.

“The key to all [IoT devices] is that they are networked,” Jamison Utter, senior business development manager at Palo Alto Networks, told a group at the conference. “It’s not just a single thing sitting on the counter like my toaster; it participates with the network because it provides value back to business.”

“I think the media focuses a lot on consumer, because people reading their articles and watching the news … think about it, but they’re not thinking about the impact of the factory that built that consumer device, that has 10,000 or 20,000 robots and sensors that are all IoT and made this happen.”

Read more here:: www.networkworld.com/category/lan-wan/index.rss

The post IoT needs to be secured by the network appeared on IPv6.net.

Read more here:: IPv6 News Aggregator

China Telecom and China Unicom certify u-blox NB-IoT modules

By Zenobia Hegde

u-blox, a global provider of wireless and positioning modules and chips, announced that its SARA-N2 NB-IoT modules have successfully completed AVL certification with China Telecom and validation with China Unicom.

This process includes the SARA-N200 (900 MHz) and SARA-N201 (850 MHz) product variants, which were both designed to meet China market requirements.

The certifications offer greater flexibility to manufacturers of end devices integrating u-blox NB-IoT modules, guaranteeing optimal performance of the devices under China Telecom and China Unicom networks and full compliance to all operator requirements. Both modules are now commercially available.

“We are glad to announce that our SARA-N2 NB-IoT modules have been approved for use in China,” says Perry Zhang, principal, strategic partnerships at u-blox. “We can now offer full NB-IoT connectivity to our customers in this market.”

The u-blox SARA-N2 series NB-IoT modules feature extremely low power consumption, delivering 10+ years of battery life, and have been designed explicitly for applications that need to communicate for long periods in challenging radio propagation conditions, such as inside buildings and underground, achieving a maximum coupling loss (MCL) of 164 dB. Measuring only 16 × 26 mm, the modules are industrial grade, offering an extended temperature range of −40 to +85 °C and ISO/TS 16949 manufacturing.
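The 164 dB figure is a maximum coupling loss (MCL): the largest end-to-end loss between transmitter and receiver at which the link still closes, i.e. transmit power minus receiver sensitivity. A quick check with illustrative numbers (assumed for the sketch, not taken from the article):

```python
def max_coupling_loss(tx_power_dbm, rx_sensitivity_dbm):
    """Maximum coupling loss: the largest path loss the link tolerates,
    i.e. transmit power minus receiver sensitivity (both in dBm)."""
    return tx_power_dbm - rx_sensitivity_dbm

# Illustrative NB-IoT uplink numbers (assumptions): a 23 dBm device
# transmit power and a -141 dBm base-station receiver sensitivity.
print(max_coupling_loss(23, -141))  # 164 dB
```

The extra ~20 dB over typical GSM link budgets is what lets these modules reach deep indoor and underground locations, at the cost of very low data rates.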

The post China Telecom and China Unicom certify u-blox NB-IoT modules appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/

Gemalto and Huawei claim a ‘cost-effective’ narrowband solution for mass market IoT

By Zenobia Hegde

To help device manufacturers meet a growing demand for long-lasting, low-power NarrowBand IoT (NB-IoT) modules, Gemalto and Huawei – via its semiconductor arm, HiSilicon – are working together to develop the next generation of modules, which combine an extra level of security with very low power consumption. By combining the expertise of both companies, these NB-IoT modules will help manufacturers reduce the cost and size of their devices and lengthen battery life to up to ten years.

NB-IoT has been developed to address lower bit rates and lower-cost segments, and works virtually anywhere. It offers ultra-low power consumption, enabling devices to be battery operated for up to 10 years. Applications include smart parking sensors, intruder and fire alarms, personal healthcare appliances, tracking devices, and street lamps, to name a few. According to ABI Research, NB-IoT modules connecting objects to networks are forecast to represent over 20% of all cellular shipments by 2021.
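Ten-year battery-life claims like this rest on a simple budget: cell capacity divided by average current draw, which in turn depends on how long the device sleeps between transmissions. A back-of-envelope sketch with assumed, illustrative figures:

```python
def battery_life_years(capacity_mah, avg_current_ua):
    """Rough battery-life estimate: capacity divided by average draw.
    capacity_mah is in milliamp-hours, avg_current_ua in microamps."""
    hours = capacity_mah * 1000 / avg_current_ua
    return hours / (24 * 365)

# Illustrative figures (assumptions): a 5000 mAh primary cell and an
# average draw of 50 uA for a device that mostly sleeps between
# short, infrequent transmissions.
print(round(battery_life_years(5000, 50), 1))  # ~11.4 years
```

The average draw is the hard part in practice: a device that transmits for a few seconds at tens of milliamps but sleeps at single-digit microamps the rest of the time can average well under 100 µA.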

“2017 is the year of commercial NB IoT rollouts for us, and we will be building 30 such networks in 20 countries worldwide by the end of the year. Huawei has been a major player in this arena, and we continue to capitalise on this vast opportunity,” said XiongWei, president of LTE solution, Huawei. “We look to supply the market with solutions that provide stable connectivity, low energy consumption, and cost efficiency. The network roll-out will now come with enhanced integration and flexibility thanks to this collaboration with Gemalto.”

“The combination of our expertise in IoT cellular connectivity and digital security, and Huawei’s high-performance NB IoT chipsets, will help device manufacturers and service providers take the plunge into cellular IoT mass deployment thanks to a standardised solution,” said Suzanne Tong-Li, SVP Greater China and Korea for Mobile Services and IoT and China president, Gemalto. “Our collaboration simplifies the implementation of NB IoT projects, combining solid security and flexibility.”

The post Gemalto and Huawei claim a ‘cost-effective’ narrowband solution for mass market IoT appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/

Libelium releases a cloud programming software service for the IoT

By Zenobia Hegde

At Smart City Expo in Barcelona, Spain, Libelium has unveiled a cloud software service that enhances its wireless sensor platform with intelligence-based solutions. The Spanish company has released the first Programming Cloud Service, which will reduce development costs, increase security, and speed up time to market for Waspmote and Waspmote Plug & Sense! users by removing both the code training and the “try and test” stages from the deployment process.

On the one hand, the new service makes it easier for technology companies that do not have a dedicated IoT engineering team to develop new IoT projects. On the other hand, the Programming Cloud Service is also intended for companies that want to save time and resources in getting IoT technology ready for their applications.

The new tool allows users to create binary files for Libelium’s Plug & Sense! sensor nodes in minutes, just by filling in an online form with all the working options: sleeping cycle, data to include in the sensor frame, cloud destination URL, networking options, and so on. “After more than 10 years of experience with one of the most complete IoT programming APIs in the world, we have realised that our clients look for simplicity: they just want to use the full programming potential in one click. This is how we came up with the idea of delegating all this complexity to a cloud service that can create code and compile it for them in seconds,” David Gascón, Libelium’s CTO, points out.

Given the importance of security in the IoT, it is crucial for companies managing real deployments to be able to configure nodes correctly and easily, with the appropriate encryption options.

The Programming Cloud Service creates binary files based on solid, tested source code built up over years by the Libelium engineering team. It then generates the algorithm specified by the user and compiles it in the cloud using the latest version of the API and libraries. This way, users can be confident that every generated binary contains all the improvements of the latest API versions.
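The form-to-binary workflow described here can be pictured as template-driven code generation: the options a user fills in online are substituted into vetted source templates, which are then compiled server-side. A toy sketch of that idea (all names invented for illustration; this is not Libelium's actual API or template):

```python
# Hypothetical template standing in for the vetted node source code.
# Doubled braces {{ }} are literal C braces; single braces are form fields.
TEMPLATE = """\
void loop() {{
  readSensors();        // frame fields: {fields}
  sendTo("{url}");      // cloud destination taken from the form
  sleep({sleep_s});     // sleeping cycle taken from the form
}}
"""

def generate_node_source(form: dict) -> str:
    """Turn the options a user fills in online into node source code,
    ready to be compiled in the cloud."""
    return TEMPLATE.format(
        fields=", ".join(form["frame_fields"]),
        url=form["cloud_url"],
        sleep_s=form["sleep_seconds"],
    )

src = generate_node_source({
    "frame_fields": ["temperature", "humidity"],
    "cloud_url": "https://example.com/ingest",
    "sleep_seconds": 300,
})
print(src)
```

Because the templates are maintained centrally, every generated program automatically picks up the latest library fixes, which is the consistency guarantee the service advertises.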

“Smart city projects based on the IoT need lower costs and more simplicity in the development stages to see the light of day, and Libelium is working on both objectives to help the IoT really take off,” says Alicia Asín, Libelium’s CEO.

With this launch, Libelium offers different types of licenses for small, medium, and large IoT deployments. The “Basic” and “PRO” licenses, which cover the management of 5 to 20 nodes, are ideal for creating binary files one at a time, while “Elite” licenses allow up to 100 binary files to be created in a single batch with one click. With the new service, no SDKs, APIs, or compilers are needed any more. “Now you can program the sensor nodes using a mobile phone or a tablet, as just a web browser is required to fill in the programming options form,” Libelium’s CTO says.

Meanwhile, the libraries and compiler remain accessible to experienced developers who want to keep coding, using all the API options and the flexibility of programming their own binaries.

In the near future, Libelium will offer new licenses on its cloud services platform for the management of MySignals […]

The post Libelium releases a cloud programming software service for the IoT appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/

New research shows industrial organisations increasingly focused on IoT adoption, but most are still in early stages

By Zenobia Hegde

Bsquare, a provider of Industrial Internet of Things (IIoT) solutions, has released the findings of its first annual IIoT Maturity Study, which explores the current IoT adoption progress of business buyers in manufacturing, transportation, and oil and gas (O&G).

According to the 2017 study, 86% of industrial organisations are currently adopting IoT solutions, and 84% believe those solutions are very or extremely effective. In addition, 95% believe that IoT has a significant or tremendous impact on their industry. However, the study also shows that most IIoT investments focus on connectivity (78%) and data visualisation (83%); only 48% are performing advanced analytics on that data, and only a small number (28%) are automating the application of insights derived from analytics.

“Our study shows that while industrial organisations have enthusiastically adopted IIoT, a majority have not yet moved to more advanced analytics-driven orchestration of data insights,” said Kevin Walsh, vice president of marketing at Bsquare.

“These later stages of IIoT maturity—analytics, orchestration and true edge computing—tend to be where most of the ROI is realised. This is especially important because, according to our study, the number one reason cited for IIoT adoption is cost reduction.”

Bsquare’s 2017 Annual IIoT Maturity Study was conducted in the United States in August 2017, and reached more than 300 respondents at companies with annual revenues in excess of $250 million (€214.53 million). Participants were evenly divided among three industry groups (Manufacturing, Transportation, and O&G) and titles covered a wide spectrum of senior-level personnel with operational responsibilities, most of whom had spent an average of six years in their organisations.

Key highlights from the report include:

  • The vast majority (86%) of organisations are deploying IIoT solutions, led by construction/transportation (93%), followed by O&G (89%) and manufacturing (77%).
  • Nearly three-quarters (73%) of all businesses plan to increase their IoT investments over the next 12 months, despite almost every respondent acknowledging that IoT deployments are complex.
  • Nine out of 10 decision-makers feel it is very or somewhat important for their organisation to adopt IoT solutions, and 95% perceive IoT as having either a significant or tremendous impact on their industry at a global level.
  • Industrial organisations are using IoT most frequently for device connectivity and data forwarding (78%), real-time monitoring (56%), and advanced data analytics (48%). More mature uses of IoT, such as automation and enhanced on-board intelligence, are also prevalent in industrial settings.

  • More than 90% of IIoT adopters cite device health as the primary reason for IoT adoption, followed by logistics (67%), reducing operating costs (24%), and increasing production volume (18%).
  • More than half of organisations use annual subscription models for their IIoT solutions, and 77% use a cloud-based model. Amazon and Microsoft were tied (14%) as the preferred cloud service providers.

The IoT Maturity Index outlines the stages commonly associated with Industrial IoT technology adoption. Each phase typically builds on the previous one, allowing organisations to drive maximum value as they progress through the index.

The stages include:

  • Device connectivity – on-board logic to collect data and transmit it to cloud databases
  • Data monitoring – dashboard and […]

The post New research shows industrial organisations increasingly focused on IoT adoption, but most are still in early stages appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/