
AI Investment Up, ROI Remains Iffy

By George Leopold

Real-world applications for artificial intelligence are emerging in areas such as boosting the productivity of dispersed workforces. However, early adopters are still struggling to determine the return on initial AI investments, according to a pair of new vendor reports.

Red Hat released research this week indicating that AI deployments have yielded some tangible results in areas such as transportation and utilities that rely heavily on field workers. A separate forecast released Wednesday (Jan. 17) by Narrative Science found growing enterprise adoption of AI technologies but little in the way of investment returns.

Chicago-based Narrative Science, which sells natural language generation technology, found that 61 percent of those companies it surveyed deployed AI technologies in 2017. Early deployments focused on business intelligence, finance and product management. “In 2018, the focus will be on ensuring enterprises get value from their AI investments,” company CEO Stuart Frankel noted in releasing the survey.

Early adopters are also encountering many of the hurdles associated with a “first mover” advantage. “More and more organizations are deploying AI-powered technologies, with goals such as improving worker productivity and enhancing the customer experience that are not only laudable, but achievable,” Narrative Science concluded. “A focus on realistic deployment timeframes and accurately measuring the effectiveness and [return on investment] of AI is critical to keeping the current momentum around the technology moving forward.”

Meanwhile, the Red Hat (NYSE: RHT) survey also found an uptick in AI deployments, with 30 percent of respondents planning to implement AI for “field service workers” this year. Other applications include predictive analytics, machine learning and robotics.

While issues such as securing data access and a lack of standards persist, Red Hat found that field workers are “now at the forefront of digital transformation where artificial intelligence, smart mobile devices, the Internet of Things (IoT) and business process management technologies have created new opportunities to better streamline and transform traditional workflows and workforce management practices.”

A predicted 25 percent increase in AI investment through November 2018 is expected to transform field service operations, Red Hat noted in a blog posted on Thursday (Jan. 18). Early movers cited increasing field worker productivity (46 percent), streamlining field operations (40 percent) and improving customer service (37 percent) as the top business factors for investing in AI.

Along with a lack of standards, respondents said deployment challenges include keeping pace with technological change and integrating AI deployments with legacy systems. The survey notes that industry groups are focusing on standards and interoperability among IoT devices along with data security while improving integration technologies.

Earlier vendor surveys also have identified barriers to implementation, ranging from a lack of IT infrastructure suited to AI applications to a lack of AI expertise. For instance, a survey released last fall by data analytics vendor Teradata Corp. (NYSE: TDC) found that 30 percent of those it polled said greater investments would be required to expand AI deployments.

Despite the promise and pitfalls of AI—ranging from freeing workers from drudgery to displacing those same workers—early AI deployments appear to underscore the reality that the technology remains a solution in search of a problem.

Recent items:

AI Seen Better Suited to IoT Than Big Data

AI Adopters Confront Barriers to ROI

The post AI Investment Up, ROI Remains Iffy appeared first on Datanami.

Read more here: www.datanami.com/feed/

BlackBerry launches new cybersecurity services to safeguard people, privacy and assets

By Zenobia Hedge

BlackBerry Ltd has introduced new cybersecurity consulting services aimed at enabling enterprise GDPR compliance and mitigating security risks in connected automobiles that threaten personal and public safety.

General Data Protection Regulation (GDPR)

Set to go into effect May 2018, and applicable to any enterprise controlling or processing Personally Identifiable Information (PII) of European Union residents, GDPR demands major changes to the ways organisations may collect, use, and store PII about customers and employees.
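Those collection and storage changes often come down to concrete engineering controls such as pseudonymisation, which the regulation itself names. As a hedged illustration only (the field names, record shape and key handling below are hypothetical, not drawn from BlackBerry's services), a minimal Python sketch of replacing direct identifiers with keyed hashes might look like:

```python
import hmac
import hashlib


def pseudonymize(value: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym).

    A keyed hash (HMAC) rather than a plain hash means the pseudonym
    cannot be recomputed or reversed without the key, which can be
    stored separately from the data.
    """
    return hmac.new(secret_key, value.encode("utf-8"), hashlib.sha256).hexdigest()


def pseudonymize_record(record: dict, pii_fields: set, secret_key: bytes) -> dict:
    """Return a copy of the record with the listed PII fields pseudonymised."""
    return {
        k: pseudonymize(v, secret_key) if k in pii_fields else v
        for k, v in record.items()
    }


# Hypothetical customer record; only "email" is treated as PII here.
record = {"email": "jane@example.eu", "country": "DE", "plan": "pro"}
safe = pseudonymize_record(record, {"email"}, secret_key=b"keep-me-in-a-vault")
```

Keeping the key separate from the pseudonymised data is what distinguishes this from plain hashing: joins on the pseudonym still work, but re-identification requires access to the key.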

BlackBerry Cybersecurity Consulting will guide organisations through the process of understanding how to manage company data, how GDPR applies to the organisation, and how to achieve a competitive readiness posture.

“Having been engaged with the EU Justice Directorate-General since 2012, we understand the GDPR requirements and have developed expertise to help address the full range of GDPR implications for enterprises, from situational assessment to offering DPO (Data Protection Officer)-as-a-service,” said Carl Wiese, global head of sales, BlackBerry. “In addition to consulting services, we provide many necessary software solutions, making BlackBerry a one-stop shop for GDPR compliance.”

Article 37 of the GDPR requires organisations to have a dedicated DPO to oversee the company’s data protection strategy. The IAPP estimates over 27,000 DPOs will be needed to address that requirement.

Information about BlackBerry’s new GDPR consulting services is available here.

Automotive cybersecurity consulting services

Cybersecurity is top of mind for automakers as new technology and more connectivity are introduced to modern cars. According to the Automotive Cybersecurity and Connected Car Report from IHS Automotive, there are nearly 112 million connected vehicles around the world, and the global market for automotive cybersecurity is expected to grow to $759 million (€644.45 million) in 2023.


BlackBerry will now offer new services directly and through a new partner program aimed at helping to eliminate security vulnerabilities within connected and autonomous vehicles.

“When it comes to connected cars, there is no safety without security,” continued Wiese. “BlackBerry’s cybersecurity consulting practice builds on decades of experience in information security, data protection and cyber-resilience to support our clients in protecting their most valuable assets. As hacking evolves and new threats arise, our new cybersecurity consulting services will help play a critical role in the development of secure connected and autonomous vehicles.”

Spring Cloud, a supplier of autonomous driving AI platforms in South Korea, will be the first partner to work with BlackBerry to provide the new cybersecurity consulting services to a range of automotive technology providers.

BlackBerry has provided cybersecurity solutions to clients in government and banking since the mid-90s. With the acquisition of QNX in 2010, the company’s expertise expanded into safety-critical systems in automotive, healthcare, energy and manufacturing. Today, BlackBerry Cybersecurity Consulting provides a holistic cybersecurity approach to help all enterprises manage complex security structures and mission-critical systems around the world.

For more information on BlackBerry Cybersecurity Consulting and the company’s services, please click here.

Comment on this article below or via Twitter: @IoTNow_OR @jcIoTnow

The post BlackBerry launches new cybersecurity services to safeguard people, privacy and assets appeared first on IoT Now – How to run an IoT enabled business.

Read more here: www.m2mnow.biz/feed/

Analytics Remains Top Data Tech, Survey Says

By George Leopold

Artificial intelligence is all the rage, but as companies continue to struggle with data complexity driven by the growing variety and number of data sources, analytics remains the most important technology for squeezing value from far-flung data.

Analytics was cited by an overwhelming majority (96 percent) of respondents to a recent poll sponsored by SAP when asked to name the most important technology for a data-driven enterprise. AI and machine learning were cited by 81 percent, while 85 percent picked the Internet of Things.

IoT was ranked as the most important data source, followed by machine learning and AI, in that order.

Issues related to data complexity and the growing requirements for data management stem in part from the shift to data in the cloud. Nearly half of respondents to the SAP (NYSE: SAP) survey identified public and private clouds as the “most challenging data sources,” followed by data warehouses, data management and visualization tools, and data lakes. Hadoop databases were seen as least challenging, cited by only 26 percent of those polled.

In an attempt to reduce data complexity, 37 percent said data is stored on premises while 26 percent rely on private or public cloud storage. While enterprise data infrastructure remains siloed, adding to complexity, “enterprises are taking the necessary first steps to improve data discovery and governance,” the survey notes.

With analytics remaining a key technology for most businesses, the SAP survey echoes the need for more data scientists. Even as tools emerge designed to make data more accessible across organizations, the study found that 79 percent believe data scientists remain important, and an equal number worry about a data skills shortage.

“Big data has been coined the new gold, and companies believe that it’s time to make data scientists the new gold miners,” survey authors asserted.

As enterprises turn inward to squeeze value from the growing number of data sources, a majority of survey respondents said data scientists should focus on “data inside the enterprise” while only 38 percent said their focus should be on external data.

Moreover, IT operations are seen as the most likely enterprise user of data analytics, thereby making them the “key data stakeholders,” SAP reported.

SAP’s “State of Big Data” survey was conducted in August 2017.

The findings illustrate “a data management landscape ripe with opportunity,” the company further asserted. Among those opportunities are emerging data management approaches that go beyond relational databases. Citing MongoDB’s (Nasdaq: MDB) stock offering on Thursday (Oct. 19), Datastax CEO Billy Bosworth noted in a statement: “There is a critical need for a new era of operational data management.”

Declining cloud storage and processing costs coupled with the growing data gravity of the cloud means companies are moving more operational and analytical workloads to the cloud.

Recent items:

Iron Mountain Adds Cloud Data Management

MongoDB Takes Another Big Step Into Clouds

The post Analytics Remains Top Data Tech, Survey Says appeared first on Datanami.


New BlackBerry-commissioned research confirms cybersecurity is top concern in corporate IoT deployments

By Sheetal Kumbhar

BlackBerry Limited announced findings from a new global research whitepaper, which surveyed IT decision makers on corporate IoT deployments. Conducted by 451 Research, the whitepaper titled, “Securing the Enterprise of Things: Opportunity for securing IoT with a unified platform emerging as IoT popularity grows,” reveals that huge opportunities are balanced against significant cybersecurity concerns.

“The proliferation of IoT is being led by enterprises, and they continue to require a unified endpoint management strategy that is capable of scaling to handle billions of connected devices,” said Marty Beard, chief operating officer, BlackBerry. “We are focused on securing the EoT because for all its promise, the expanding adoption of connected things means that companies are only as secure as their most vulnerable endpoint.”


Survey respondents represent a wide range of vertical industries, including financial services, government and healthcare. Below are some key themes from the research:

78% of respondents indicated interest in a solution that allows them to manage all their endpoints in one place.
63% noted that security is the “top” concern regarding digital technologies and processes. However, only a little over one-third (37%) actually have a formal digital transformation strategy in place.
Organisations are least prepared against external threats, with nearly two-thirds (61%) citing hackers and cyberwarfare as top concerns.
39% of respondents from very large organisations (more than 10,000 employees) revealed that a lack of collaboration among internal departments is a potential barrier to unified endpoint management, while 51% of mid-sized organisations felt the same way.

The new whitepaper is available for download.

For more information about BlackBerry’s EOT solutions, please click here.


The post New BlackBerry-commissioned research confirms cybersecurity is top concern in corporate IoT deployments appeared first on IoT Now – How to run an IoT enabled business.


Flexera issues warning about cyberattacks like the Equifax Breach: they’re probably just the first known victim

By Sheetal Kumbhar

As 143 million Equifax consumers continue to pick up the pieces from stolen Social Security numbers, birth dates, drivers’ licenses, addresses and credit card numbers, Flexera has another warning – expect a long tail of incidents and breaches in the months and years to come.

Flexera, the company that’s reimagining how software is bought, sold, managed and secured, surveyed over 400 software suppliers, Internet of Things (IoT) manufacturers and in-house development teams to publish its Open Source Risk – Fact or Fiction? report. Though open source software (OSS) helps software suppliers be nimble and build products faster, the report reveals hidden software supply chain risks that all software suppliers and IoT manufacturers should know about.

For instance, the criminals who potentially gained access to the personal data of Equifax customers exploited the Apache Struts vulnerability CVE-2017-5638. Apache Struts is a widely used open source component – a framework for building Web applications – used by companies in commercial and in-house systems to take in and serve up data. That widespread use makes the component a prime target for cyberattacks.

Case in point? While as much as 50% of all code found in commercial and IoT software products is open source, according to the Flexera report:

No OSS Policy is Bad Policy: Only 37% of respondents have an open source acquisition or usage policy. 63% say either their companies don’t have an open source acquisition or usage policy, or they don’t know if one exists.
No One’s in Charge of OSS: 39% of respondents said that either no one within their company is responsible for open source compliance – or that they don’t know who is.
OSS Contributors Aren’t Following Best Practices: 33% of respondents say their companies contribute to open source projects. But, of the 63% who say their companies don’t have an open source acquisition or usage policy, 43% said they contribute to open source projects.
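The tracking gap the report describes can be narrowed with even a rudimentary audit step: compare what you ship against a list of known-vulnerable components. As a rough sketch (the manifest format and the advisory table below are invented for illustration, not a real vulnerability feed), a scan might look like:

```python
# Map of known-vulnerable (package, version) pairs to an advisory ID.
# Illustrative entries only; a real tool would pull from a maintained feed.
KNOWN_ADVISORIES = {
    ("struts2-core", "2.3.31"): "CVE-2017-5638",
    ("struts2-core", "2.5.10"): "CVE-2017-5638",
}


def scan(manifest: list) -> list:
    """Return findings for any dependency that matches a known advisory.

    `manifest` is a list of (package, version) tuples, e.g. parsed from
    a build file or a software bill of materials.
    """
    findings = []
    for package, version in manifest:
        advisory = KNOWN_ADVISORIES.get((package, version))
        if advisory:
            findings.append(f"{package} {version}: {advisory}")
    return findings


manifest = [("struts2-core", "2.3.31"), ("commons-lang3", "3.5")]
print(scan(manifest))  # -> ['struts2-core 2.3.31: CVE-2017-5638']
```

Even this naive exact-match check presumes the one thing the survey says most companies lack: an accurate inventory of the open source they actually use.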

“We can’t lose sight that open source is indeed a clear win. Ready-to-go code gets products out the door faster, which is important given the lightning pace of the software space,” said Jeff Luszcz, vice president of product management at Flexera. “However, most software engineers don’t track open source use, and most software executives don’t realise there’s a gap and a security/compliance risk.”

Report takeaway for software and IoT companies? Your processes for managing open source security and licensing haven’t kept pace with open source’s rapid adoption – and it’s putting you and your customers at risk.

“Open source processes protect products and brand reputation. But, most software and IoT vendors don’t realise there is a problem, so they’re not protecting themselves and their customers,” said Luszcz. “This endangers the entire software supply chain – for the vendors whose products are exposed to compliance and vulnerability risk. And also for their customers who most likely don’t even know they’re running open source and other third-party software, or that it may contain software vulnerabilities.”


The post Flexera issues warning about cyberattacks like the Equifax Breach: they’re probably just the first known victim appeared first on IoT Now – How to run an IoT enabled business.


How will M2M services make more billions in a few years?

By Sheetal Kumbhar

On the back of developments in vehicle connectivity will come other business opportunities that will add to the demand for traffic on these networks. The research agency Technavio has prepared a report on the machine-to-machine (M2M) services market, defining them as value-added services for operators.

The scope of the research included all world markets except those in European countries. According to Technavio, by 2021 M2M services will earn $38 billion (€32.37 billion), and the average annual growth of the market will be 41%, says Suren Arustamyan, COO at JeraSoft.

The main factor behind this growth will be demand from businesses and city authorities for new mobile services, which will seriously replenish the revenue streams of players in the mobile market. Mobile operators will be the winners here, since they will own the road and traffic networks, while satellite operators are assigned the sector of Internetisation of aircraft and provision of communications in the sky.

Another important direction for the market will be monitoring of transport, where all cars will include sensors to allow tracking of movement via satellite.

The researchers see the penetration of M2M services into vehicle retail as a new direction in mobile network markets. First of all, we are talking about marketing tools to inform consumers around or inside different locations and involve them in the buying process. In due course everything else will follow – from advertising to special marketing offers.

In addition to the more traditional outdoor advertising (stands, leaflets, posters), mobile M2M services will allow advertisers to instantly change the message content according to the tracked location of a vehicle and personalise it for the driver’s known preferences.

The market research divides the industry into several segments: providers of M2M applications and solutions, communication operators, equipment manufacturers, system integrators and manufacturers of sensors. Some mobile operators are already working to occupy specific niches in these new markets, which increases the level of competition. Companies are forming strategic alliances and jointly developing solutions based on the convergence of their competencies.

In addition, the researchers note that the market is entering a stage of mergers and acquisitions, which in turn will allow large players to increase the geographical coverage of their solutions and services. The largest players in the M2M market, according to these researchers, are AT&T, Sprint, Verizon and Vodafone. In addition to these, Amdocs, China Mobile, China Telecom, Digi International, Gemalto, KDDI, Numerex, Orange Business Services, Sierra Wireless, Rogers Communications, Tech Mahindra, Telefónica, Telenor, Telit and T-Mobile USA are noted.

The author of this blog is Suren Arustamyan, COO at JeraSoft


The post How will M2M services make more billions in a few years? appeared first on IoT Now – How to run an IoT enabled business.


RS Components and Legal & General join forces to tackle £900m water leakage problem

By Sheetal Kumbhar

RS Components (RS), a global distributor for engineers and the trading brand of Electrocomponents plc, and Legal & General (L&G), the FTSE 100 financial services group, have joined forces in an open call to the world’s engineering and maker community with a new IoT design competition, the ‘LeakKiller Challenge’. RS and Legal & General are offering a £15,000 (€16,857.37) prize for […]

The post RS Components and Legal & General join forces to tackle £900m water leakage problem appeared first on IoT Now – How to run an IoT enabled business.


Plume Design Gets Funding

By Katie Nale

Plume Design received $63 million in equity funding from investors to accelerate its adaptive WiFi deployments. $37.5 million of total funding came from new investors Comcast Cable, Samsung Venture Investment Corp and Presidio Ventures. Existing investors Liberty Global Ventures, Shaw Ventures and Jackson Square Ventures also participated.

In addition to accelerating Plume’s deployment and integration of Adaptive WiFi in ISP networks and third-party software, the funding will be used to introduce new products and to fund the company’s Silicon Valley and European expansion.

Plume launched its Adaptive WiFi at the end of 2016 as a way to improve the “reliability, consistency, and coverage” of in-home WiFi. The Plume Cloud makes decisions on the WiFi channel, frequency, client connections, and whole-home coverage topology both in real time and preemptively. It is also built to manage and optimize performance across multiple apartments in a multi-dwelling unit simultaneously.

Comcast is currently working alongside Plume to combine its Xfinity xFi platform with Plume technologies. Later this year, Comcast will launch no-configuration, adaptive xFi Pods that can be auto-paired with either the xFi Wireless Gateway or the xFi Advanced Gateway.

The Plume Cloud service is accessed via Plume Pods or Plume’s software Agent embedded into third party hardware such as ISP wireless gateways and modems, wireless set-top-boxes, WiFi access points, OTT media players, IoT devices, and consumer electronics.

The post Plume Design Gets Funding appeared first on Cablefax.

Read more here: feeds.feedburner.com/cable360/ct/operations?format=xml

Speed of the Essence for ‘Machine Data’ Analytics

By George Leopold

Automating the analysis of “machine data” is most widely leveraged to speed IT operations, along with use cases such as data security, a new industry survey finds, while accelerating the analysis of ephemeral machine data is emerging as a priority.

The market survey on the impact of machine data analytics conducted by 451 Research was commissioned by data-in-motion specialist Logtrust. The report authors defined machine data analytics as a set of technologies specifically “designed to help with the analysis of data created by machines, web servers, mobile devices, sensors and other smart devices.”

While a hefty 94 percent of the 200 IT managers surveyed said they use such data analytics to manage their IT operations, the technology is also gaining traction for analyzing the data firehose created by the Internet of Things. Meanwhile, 51 percent said they use machine data analytics to crunch big data, while 60 percent cited security applications as IT managers look to boost real-time threat detection and response.

Nevertheless, IT managers remain frustrated by performance gaps in current analytics platforms as they tackle more real-time data and attempt to blend it with batch and historical data analysis. The imperative, the report’s authors note, is straightforward: “The faster you can run some analytics on data, and subsequently respond to the findings, the greater the chance of having achieved something that adds business value…”

So how fast is “fast”? More than two-thirds of survey respondents said they require a “machine real time” capability, that is, response times in the milliseconds. Just over half said they would settle for “human” real time, in the range of five seconds to five minutes latency.
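The survey’s latency buckets are easy to make concrete in code. In this sketch the exact cutoffs (sub-second for “machine real time,” up to five minutes for “human” real time) are one reading of the thresholds quoted above, not a standard definition:

```python
def latency_class(latency_seconds: float) -> str:
    """Bucket an analytics response time using the survey's rough tiers.

    "Machine real time" = millisecond-scale responses (here: sub-second).
    "Human real time"   = roughly five seconds to five minutes of latency.
    Anything slower is effectively batch processing.
    """
    if latency_seconds < 1.0:
        return "machine real time"
    if latency_seconds <= 300.0:  # five minutes
        return "human real time"
    return "batch"


print(latency_class(0.005))  # machine real time
print(latency_class(30))     # human real time
```

By that yardstick, the 53 percent whose tools cannot hit the “human” threshold are effectively stuck in the batch tier.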

While machine real-time capability is seen as essential for survival, 53 percent of those polled said their current data tools were incapable of achieving the “human” real time threshold. “It couldn’t be clearer… that faster data analytics really is better,” the report emphasized.

The survey also confirms the growing trend toward cloud-based data analytics, with 37 percent of those polled saying they run machine data analytics in the cloud. “Perceived” security risks have prevented 43 percent from moving analytics to the cloud, keeping them on-premise for now. But about two-thirds of respondents said they would move data analytics to the public cloud in the future.

The survey findings also challenged the notion that open source approaches to data analytics cost less. IT managers were evenly divided on the merits of open source versus proprietary approaches. Data platform vendors such as Logtrust, Sunnyvale, Calif., predictably argue in favor of proprietary solutions, stressing that the 451 Research survey found that 67 percent expect to adopt proprietary machine data analytics tools in the future.

As real-time data analytics becomes a necessity, the crunching of unstructured data, including images and video, also is emerging as a key application. Still, 89 percent of respondents said their main focus was analyzing and visualizing structured data.

Recent items:

Text Analytics and Machine Learning: A Virtuous Combination

Inside IBM ML: Real-Time Analytics on the Mainframe

The post Speed of the Essence for ‘Machine Data’ Analytics appeared first on Datanami.


Investors Bullish on GPU-Based Database Startup

By George Leopold

MapD Technologies, the big data analytics platform startup developing a parallel SQL database that runs on GPUs, has more than doubled its venture-funding total with the close of its latest investment round led by New Enterprise Associates (NEA).

San Francisco-based MapD, which leverages GPU technology to speed its SQL query engine, said Wednesday (March 29) its latest funding round garnered $25 million from NEA and existing investors Nvidia (NASDAQ: NVDA), Vanedge Capital and Verizon Ventures. The Series B round brings MapD’s total venture funding to just over $37 million.

The analytics startup said it would use the funding to accelerate analytics platform development as it seeks to make inroads in the enterprise big data market. The company uses Nvidia’s GPUs to run enterprise applications that include machine learning and numerical computations. An upgraded platform will expand data analytics capabilities.

The startup’s GPU-based SQL query engine platform combines with data visualization designed to allow analysts and data scientists to crunch multi-billion-row data sets. “GPU-powered analytics is going to radically change the data analytics market,” predicted Greg Papadopoulos, a venture partner at NEA.

Since announcing a Series A funding round in March 2016, MapD said it has unveiled new query engine features, including the second version of its Core database and Immerse visual analytics platforms. Meanwhile, Amazon Web Services, Google, Microsoft Azure and IBM SoftLayer have all launched GPU cloud instances that have increased enterprise access to the MapD platform.

Amazon (NASDAQ: AMZN) Web Services announced last fall it would offer public cloud services based on the Tesla K80 GPUs from MapD investor Nvidia.

Founded in 2013, MapD Technologies originated from research at the MIT Computer Science and Artificial Intelligence Laboratory. Other seed investors include In-Q-Tel, the CIA’s venture capital arm.

The startup places heavy emphasis on leveraging fast hardware to maximize speed. “The first thing is we try to cache the hot data across multiple GPUs,” MapD CEO Todd Mostak stressed in an interview last fall. “We’re a column store. We’re compressing the data, so you can have many, many billions of rows in that GPU memory.”
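Mostak’s description (a compressed column store whose hot data is cached in GPU memory) can be illustrated in miniature with dictionary encoding, a common columnar compression technique. The sketch below runs on the CPU and is a generic illustration of the idea, not MapD’s actual implementation:

```python
def dictionary_encode(column: list):
    """Compress a column into (dictionary, codes).

    Each distinct value is stored once; rows hold small integer codes.
    This is one way columnar engines shrink data enough to cache
    billions of rows in fast (e.g. GPU) memory.
    """
    dictionary = []   # distinct values, in first-seen order
    index = {}        # value -> code
    codes = []        # one small integer per row
    for value in column:
        if value not in index:
            index[value] = len(dictionary)
            dictionary.append(value)
        codes.append(index[value])
    return dictionary, codes


def filter_eq(dictionary: list, codes: list, value) -> list:
    """Row IDs where column == value.

    The predicate is resolved against the dictionary once, then the scan
    compares integer codes: a uniform, branch-light loop that is the kind
    of work parallel hardware handles well.
    """
    try:
        code = dictionary.index(value)
    except ValueError:
        return []  # value never occurs in this column
    return [row for row, c in enumerate(codes) if c == code]


col = ["NY", "SF", "NY", "LA", "SF", "NY"]
d, codes = dictionary_encode(col)
print(filter_eq(d, codes, "NY"))  # -> [0, 2, 5]
```

The six strings compress to three dictionary entries plus six small integers; at billions of rows, that ratio is what makes the “hot data across multiple GPUs” caching Mostak describes feasible.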

MapD investor and GPU leader Nvidia has been pitching its latest GPU technology as a way to speed SQL workloads. In one example, MapD’s platform fuses its GPU-based database with a collection of visualization tools to enable users to work with huge geospatial data sets.

Another area ripe for GPU-backed databases is Internet of Things deployments that continue to generate huge troves of data. Mostak has argued that current CPU-based approaches won’t scale as new requirements emerge for merging streamed data and historical analysis extending out to 90 days.

Mostak predicted recently that infrastructure-heavy industries such as telecommunications would be among the early adopters of the GPU-based analytics. That prediction is supported by early investments by Verizon Ventures.

Recent items:

Why America’s Spy Agencies Are Investing in MapD

GPUs Seen Lifting SQL Analytic Constraints

The post Investors Bullish on GPU-Based Database Startup appeared first on Datanami.
