
If you’re looking to ‘do’ digital transformation, read this first

By Jon Collins

Barely a day goes by in the tech press without some mention of the importance of digital transformation to businesses, each mention accompanied by a caveat that nobody really knows what it is. Without engaging further in that debate, what are the absolutes, and what really matters?

1. That it’s all about the data. Everything.

However we phrase things, the singular, most significant change that technology has brought over the past 100 years is the ability to generate, store, process and transmit inordinate quantities of data. Whatever ‘revolution’ or ‘wave’ we might want to say we are in right now, be it digital, industrial or whatever, there is only really one — the information revolution.

Despite exponential appearances (and the resulting perceived impetus for dramatic change), this trend continues with a certain linearity: even as we double the number of pixels on a sensor, for example, or transistors on a processor, our abilities increase at a steadier pace. In business terms, the challenges of integration, capacity planning or service level management are much the same now as they were a decade ago; we are simply working at a higher level of resolution.

2. That technology is enabling us to do new things

This still leaves room for breakthroughs, when technology passes certain thresholds. We saw, for example, the quite sudden demise of the cathode-ray television in favour of LCD screens, or indeed that of film versus digital cameras. What we see as waves are quite often technologies passing these thresholds — so, for example, the Internet of Things is a consequence of having sufficient connectivity, with low-cost sensors and ‘edge’ processing. We may be seeing another approaching with machine learning and AI.

It’s useful to set these moments of “release of innovation” against the previous point, that many consequences are evolutionary, not revolutionary. This dichotomy drives much technology-related marketing: a new advance can have significant specific impacts even if it does not change the world, yet it will be presented as enabling the latter even when it will only really achieve the former. Case in point — digital cameras have not made us better photographers, and nor has CRM made organisations better at customer service.

3. That we tend to do the easy or cheap stuff, as consumers and businesses

Many innovations happen through ‘pull’ rather than ‘push’. We can spend our lives putting together complex business cases that demonstrate clear ROI, but even as we do, we know they are really lip service to due diligence. At work as at home, a great deal of technology adoption happens because it makes our lives easier — those explaining the extraordinary rise of Amazon, Facebook and so on emphasise ecosystems, platforms and networks, and treat our own laziness and desire for a simple life as an afterthought. In business, meanwhile, money saving is a far greater enabler of technology adoption than the potential for business growth.

The CBA factor is of inordinate importance, and yet gets little mention: it’s as if we are embarrassed to admit our own weaknesses. Interestingly, its corollary (that of “resistance to change”) does get a mention when looking to explain large project failures. But here’s the fact: many of the great technology advances occur because they are easier, and they stumble when they are not. The fact that people still like books or printed reports can be explained as much through ease of use as through the need to hold something physical. The perceived need for ‘transformation’ comes from the idea that, against such inertia, some big, aspirational change is necessary.

4. That nobody knows what the next big thing will be

As my old boss and mentor once said, innovations are like route markers — it’s important to see them as points on a journey rather than a destination. However, doing so goes against two major schools of thought. The first comes from technology vendors, who want you to believe that their latest box of tricks will indeed bring nirvana. And the second, from consulting firms, whose thought leadership role diminishes significantly if their advice is framed in terms of observational mentoring (a good thing) as opposed to somehow holding the keys to the kingdom.

There is no promised land, and neither is there a crevasse we will all fall into, but we still persist in looking through the wrong end of the telescope, framing business needs in terms of solutions rather than putting the needs themselves first. Sometimes this is done so subtly by marketers that it can be difficult to spot: back in the days of “service oriented architecture”, for example, it took me a while to realise that its main proponents happened to have a specific product in mind (an “enterprise service bus”). Doing so isn’t necessarily wrong, but it’s worth following the money.

5. That we are not yet “there”, nor will we ever be

As a species, particularly in times of great change, we need a level of certainty at a very deep, psychological level. And it is messing with our ability to act. It’s too easy to pander to the need for a clear answer, buying into current rhetoric in the hope that the latest advance might really work this time. All sides are at fault — those purveying solutions, those buying them and those acting as trusted third parties — but who wants to hear anyone say “it’s not going to work”?

Each time round the cycle, we come up with new terms and subtly change their definitions — industry 4.0 and smart manufacturing might mean the same, or very different, things depending on who you ask, a symptom of our desperation to understand, and adapt to, what is going on (after all, haven’t we been told to ‘adapt or die’?).

Interestingly, the companies that we applaud, or fear the most, may well be those that care the least. Amazon, Uber, Tesla and the rest of them don’t know what’s around the corner, and what is more they don’t see this as a priority — they simply want to still be in the game this time next year. Rightly so, as they were born into uncertainty, forged through some indecipherable and unrepeatable combination of factors. Why did Facebook succeed when Myspace, Bebo or any other high-valuation predecessor did not, for example? Above all, these organisations have an attitude to change, a mindset that sees uncertainty, and therefore responsiveness, as a norm. Jeff Bezos’ articulation of Amazon’s “Day One” approach to business strategy offers a fantastically simple, yet utterly profound illustration.

6. Responsiveness is the answer, however you package it

Where does this leave us? The bottom line is that “digital transformation” is the latest attempt to provide a solid response to uncertain times, and as such remains empty words for many. What it is doesn’t actually matter, other than as a touchstone term which will soon be replaced (you can thank the marketers for that). So debate it by all means, just as you might once have debated business process management, or social networking, or hybrid cloud, or whatever tickles your fancy. As you do so, however, recognise such procrastination for what it is.

And then, once you are done, take action, over and over again. Transformation doesn’t matter, unless we are talking about the transformation of mindsets and attitudes, from build-to-last to do-it-fast. That’s why agile methodologies such as DevOps are so important, not in themselves (yes, that would be putting the cart before the horse again) but because they give businesses an approach to innovate at high speed. As we continue on this data-driven journey, as complexity becomes the norm, traditional attitudes to change at the top of business, or indeed our institutions, become less and less tenable. The bets we make on the future become less and less important; what matters more is our ability to make new ones.

Terminology matters not a jot. But are you and your colleagues, at whatever level in your organisation, prepared to change? If the answer is anything other than yes, you have a bigger challenge on your hands than understanding the latest set of buzzwords.


NEC highlights 5G deployment for creating a future beyond imagination at MWC 2018

By Zenobia Hegde

NEC Corporation announced that it will present technologies and solutions for working with telecom carriers on the co-creation of new business models and the implementation of 5G solutions at Mobile World Congress (MWC) 2018, at the Fira Gran Via, Barcelona, from February 26 to March 1, in Hall 3, stand #3M30.

In today’s business and social climate, telecom carriers are constantly required to process greater volumes of data at ever faster speeds, while also ensuring that transmissions are secure. At the same time, the rapid growth of the Internet of Things (IoT), Artificial Intelligence (AI) and robotics is placing even greater demands on carrier resources.

At MWC 2018, NEC is demonstrating solutions and technologies that help address the needs of telecom carriers and businesses alike through its “5G. A Future Beyond Imagination” concept, which positions NEC and telecom carriers as service enablers for the co-creation of new business models for a wide variety of vertical industries, including the security, agriculture and transportation fields, that maximise resources and reinforce earnings.

At the NEC booth, the company’s cutting-edge portfolio of AI technologies, “NEC the WISE,” will be introduced, as well as NEC’s series of biometric authentication solutions, “Bio-IDiom,” which includes some of the world’s fastest and most accurate facial and fingerprint authentication technologies. This is in addition to highlighting NEC’s participation in the FIDO Alliance, which aims to standardise Fast IDentity Online (FIDO).

Moreover, NEC will demonstrate the advanced solutions that make it a leader in mobile backhaul, network optimisation through traffic management solutions (TMS) and edge computing, as well as software-defined networking (SDN) / network functions virtualisation (NFV), all of which contribute to the growth of telecom carriers.

For more detail on NEC’s participation in Mobile World Congress 2018, please click here.


Antenova to show two new high performing 4G/LTE diversity antennas for small PCBs

By Zenobia Hegde

Antenova Ltd, a manufacturer of antennas and RF antenna modules, is showing a brand-new pair of high-performing 4G/LTE antennas, suitable for PCBs as small as 60 mm, at the consumer electronics show CES. The two antennas can also be used in 3G and MIMO applications.

The two antennas are similar – the difference being that Inversa is built for the USA market while Integra is for European and Asian markets.

Both antennas are available in left and right versions to provide more options for placement on the PCB, and can be used singly or in pairs for MIMO. Both use beam steering to ensure good isolation and low cross-correlation, and so achieve high performance.

Inversa, part numbers SR4L034-L/SR4L034-R, measures 28.0 x 8.0 x 3.3 mm and covers the USA bands 698-798 MHz, 824-960 MHz, 1710-2170 MHz, 2300-2400 MHz and 2500-2690 MHz.

Integra, part numbers SR4L049-L/SR4L049-R, measures 23.0 x 8.0 x 3.3 mm and covers the bands 791-960 MHz, 1710-2170 MHz, 2300-2400 MHz and 2500-2600 MHz, used in Europe and Asia.

Antenova has designed these antennas for use in small trackers, OBDs and other similar devices where space is limited. For more details, antenna samples and evaluation boards, please click here.


Fueled by Kafka, Stream Processing Poised for Growth

By Alex Woodie

Once a niche technique used only by the largest organizations, stream processing is emerging as a legitimate technique for dealing with the massive amounts of data generated every day. While it’s not needed for every data challenge, organizations are increasingly finding ways to incorporate stream processing into their plans — particularly with the rise of Kafka.

Stream processing is just that – processing data as soon as it arrives, as opposed to processing it after it lands. The amount of processing applied to the data as it flows can vary greatly. At the lower end, users may do very little besides a simple transformation, such as converting temperatures from Celsius into Fahrenheit or combining one stream with another, while at the upper end, stream processors may apply real-time analytics or machine learning algorithms.
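As an illustration of the simpler end of that spectrum, here is a minimal sketch of the Celsius-to-Fahrenheit case using the Kafka Streams API (which the article touches on further below). The topic names, broker address and plain-string encoding of readings are assumptions made for the example, not details from the article:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class CelsiusToFahrenheit {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "temp-converter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Convert each reading the moment it arrives on the input topic and
        // write it straight to the output topic: no batch window, no landing zone.
        KStream<String, String> celsius = builder.stream("temps-celsius");
        celsius.mapValues(c -> String.valueOf(Double.parseDouble(c) * 9.0 / 5.0 + 32.0))
               .to("temps-fahrenheit");

        new KafkaStreams(builder.build(), props).start();
    }
}
```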

Almost any type of data can be used in stream processing. Sources can include database events from an RDBMS or NoSQL store, sensor data from the IoT, comments made on social media, or a credit card swipe. The data’s destination can be similarly diverse – it could be headed to a traditional file system, a relational or NoSQL database, a Hadoop data lake, or a cloud-based object store.

What happens in between that initial data creation event and the moment the data is written to some type of permanent repository is collectively referred to as stream processing. Initially, proprietary products from the likes of TIBCO, Software AG, IBM, and others were developed to handle streaming data. But more recently, distributed, open source frameworks have emerged to deal with the massive surge in data generation.

Apache Kafka – a distributed publish and subscribe message queue that’s open source and relatively easy to use – is by far the most popular of these open source frameworks, and Kafka is seen today by industry insiders as helping to fuel the ongoing surge in demand for tools for working with streaming data.
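To make the publish-and-subscribe model concrete, the sketch below uses Kafka's standard Java consumer client; the topic name, group id and broker address are illustrative assumptions:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SensorSubscriber {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "sensor-readers"); // consumers in one group share the partitions
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("sensor-events")); // the "subscribe" half of publish/subscribe
            while (true) {
                // Records are delivered as they are published, not after they land somewhere.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> r : records) {
                    System.out.printf("offset=%d key=%s value=%s%n", r.offset(), r.key(), r.value());
                }
            }
        }
    }
}
```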

Steve Wilkes, the CTO and founder of Striim, says Kafka’s popularity is helping to push stream processing to center stage. “Kafka is driving a lot of our market,” he says. “A good majority of our customers are utilizing Kafka in one way, shape, or form.”

The underlying trend driving investment in stream processing is that customers need access to the latest data, Wilkes says. “It’s the recognition that, no matter how you’re doing analytics — whether you’re doing them in streaming fashion or whether you’re doing them after the fact through some sort of Hadoop jobs or big data analytics — you need that up-to-date data,” he tells Datanami.

Striim this week unveiled a new release of its stream data processing solution, Striim version 3.8, that features better support for Kafka. This includes the capability to automatically scale Striim to more efficiently read from and write to Kafka as users scale up their real-time streaming architecture.

Many Kafka users are using the core Kafka product, along with the open source Kafka Connect software, to rapidly move data from its source to another destination, such as Hadoop or a data lake hosted on the cloud. Fewer shops are using the Kafka Streams API to write application logic on top of the message bus, a niche that third-party vendors are moving to fill.

According to a recent report from Confluent, the company behind open source Kafka and developer of the Confluent Platform, 81% of Kafka customers are using it to build data pipelines. Other common use cases include real-time monitoring, ETL, microservices, and building Internet of Things (IoT) products.

Keeping the data lake updated with fresh data is an increasingly difficult task – and one that stream processing is being asked to fill as a sort of modern ETL role. According to Syncsort‘s recent 2018 Big Data Trends survey, 75% of respondents say that keeping their data lake updated with changing data sources is either “somewhat” or “very difficult.”

Another vendor that’s seeing the Kafka impact is Streamsets, a software vendor that bills itself as the “air traffic control” for data in motion. Streamsets’ initial product was a data collector that automated some of the nitty-gritty work involved in capturing and moving data, often atop the Kafka message queue. The vendor recently debuted a low-footprint data collector that works in CPU- and network-constrained environments, and a cloud-based console for managing the entire flow of a customer’s data.

Streamsets Vice President of Marketing Rick Bilodeau says Kafka is driving a lot of the company’s business. “We do a lot of work with customers for Kafka, for real-time event streaming,” he tells Datanami. “We see fairly broad Kafka adoption as a message queue, where people are using [Streamsets software] primarily to broker data in and out of the Kafka bus.”

Some of Streamsets’ customers have a million data pipelines running at the same time, which can lead to serious management challenges. “Companies will say, ‘We built a bunch of pipelines with Kafka, but now have a scalability problem. We can’t keep throwing people at it. It’s just taking us too long to put these things together,’” Bilodeau says. “So they use data collector to accelerate that process.”

Today, Streamsets sees lots of customers implementing real-time stream processing for Customer 360, cybersecurity, fraud detection, and industrial IoT use cases. Stream processing is still relatively new, but it’s beginning to grow in maturity rapidly, Bilodeau says.

“It’s not the first inning, for sure. It’s maybe the third inning,” he says. “On the Gartner Hype Cycle, it’s approaching early maturity. Every company seems to have something they want to do with streaming data.”

Striim’s Wilkes agrees. Fewer than half of enterprises are working with streaming data pipelines, he estimates, but it’s growing solidly. “Streaming data wasn’t even really being talked about a few years ago,” he says. “But it’s really starting to get up to speed. There is a steady progression.”

We’re still mostly in the pipeline-building phase, where identifying data sources and creating data integrations dominates real-time discussions, Wilkes says. That period will give way to more advanced use cases as people become comfortable with the technology.

“We’re seeing that a lot of customers are still at the point of obtaining streaming sources. They understand the need to get a real-time data infrastructure,” he says. “The integration piece always comes first. The next stage after you have access to the streaming data is starting to think about the analytics.”

Related Items:

Spark Streaming: What Is It and Who’s Using It?

How Kafka Redefined Data Processing for the Streaming Age


Why is the connection piece so hard?

By IoT Now Magazine

Increased usage of digital technologies by the manufacturing industry is inevitable but, while the shift is gradual, the pressure to go faster is great. When Hewlett-Packard Enterprise (HPE) and the Industry of Things World Conference conducted a survey to find out how successful industrial IoT (IIoT) projects have been in the last 12 months, the responses uncovered that only 53% of respondents thought their IIoT projects had met or exceeded goals. The remaining 47% said their goals had not been reached.

IIoT isn’t a case of companies buying a technology and suddenly becoming digital. It’s an entire architecture encompassing an ecosystem, with careful communication across various touchpoints within an organisation, all of which requires common standards as well as new technology architectures to create convergence of information technology (IT) and operational technology (OT).

One critical and common chokepoint is a lack of understanding about device connection. Even if devices are connected, there are often no simple tools to manage them, extract their data and translate it from one language into another; for example, the transmission and translation of programmable logic controller (PLC) data into enterprise resource planning (ERP) systems.
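To illustrate the kind of translation being described, here is a hypothetical sketch that maps a terse PLC register reading into the richer record an ERP system might expect. The register number, scaling factor and field names are all invented for the example; a real deployment would drive them from site configuration:

```java
import java.time.Instant;

public class PlcToErpBridge {
    // Hypothetical raw PLC reading: a register address and an unscaled integer value.
    record PlcReading(int register, int rawValue) {}

    // Hypothetical ERP-side record: a named measurement in engineering units with a timestamp.
    record ErpMeasurement(String materialId, double value, String unit, Instant takenAt) {}

    // Translate the PLC's terse register/value pair into the vocabulary the ERP
    // system expects -- the "one language into another" step described above.
    static ErpMeasurement translate(PlcReading r) {
        double scaled = r.rawValue() / 10.0; // PLC stores tenths of a litre (assumed)
        String materialId = (r.register() == 40001) ? "TANK-7-LEVEL" : "UNKNOWN";
        return new ErpMeasurement(materialId, scaled, "litres", Instant.now());
    }

    public static void main(String[] args) {
        System.out.println(translate(new PlcReading(40001, 5231)));
    }
}
```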

Let’s look at the top five things to consider about the connection puzzle and how to weave them into your overall plan.

1. Getting things connected is easier said than done

During the discovery phase, many IIoT vendors gloss over this. Once manufacturers decide to take the plunge, they suddenly realise that connecting to all these different devices – legacy and modern, proprietary and open source – is really difficult, and results in significant delays which blow up the original projected timeframe.

If you’ve ever engineered systems on the plant floor, you know there are devices down there that are a nightmare to connect to and integrate with a variety of other applications. A simple data collection task can end up taking weeks of custom coding.

2. Embrace complexity

There is no single standard way of connecting everything together. Over time, the industrial plant floor has evolved as technology has changed. For better or for worse, this advancement also means more complexity and it is not going away; in fact, it will increase.

As a result, plant floors have a mixture of device brands, different protocols, and different proprietary data sets. Embracing complexity means accepting that there are a lot of moving parts in IIoT solutions that we need to link together for success, and that requires a level of expertise better addressed as a holistic solution than as a complicated patchwork.

3. Prepare for latency

One piece that addresses complexity is Open Platform Communications (OPC). OPC was designed to provide industrial automation with a standard networking protocol, one that requires polling to receive data from devices. Polling means the system must ask the device for data at a preset rate, such as once every second or once every half hour.
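A minimal sketch of that polling pattern follows; the Device interface stands in for a real OPC client library, and the tag name and one-second rate are illustrative only:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class PollingReader {
    // Hypothetical device interface standing in for an OPC client library.
    interface Device {
        double readTag(String tag);
    }

    public static void main(String[] args) {
        Device plc = tag -> Math.random() * 100; // stub device for the sketch

        // Poll at a preset rate -- here once per second, as described above.
        // The device is asked for data whether or not the value has changed.
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(
            () -> System.out.println("line1.temperature = " + plc.readTag("line1.temperature")),
            0, 1, TimeUnit.SECONDS);
    }
}
```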

OPC requires multiple steps to send data; it is not point A to point B. A typical path looks like this: PLC […]


Spirent adds BeiDou Phase-3 signals to its GNSS RF constellation simulators

By Zenobia Hegde

Spirent Communications plc, a provider of BeiDou, GPS and other global navigation satellite system (GNSS) testing solutions, announced that BeiDou Phase 3 signals have been added to its GNSS RF constellation simulators. The addition of these new signals to the GSS7000 and GSS9000 simulators follows the launch of the first two BeiDou-3 satellites in November 2017.

Phase 3 of the Chinese BeiDou system will extend its coverage from Asia to the whole world, and will provide receiver developers and integrators with additional GNSS signals to make positioning, navigation and timing systems more accurate, and help to support new applications, such as autonomous vehicles.

The new signals will use the same carrier frequencies as the GPS and Galileo systems, so chipset manufacturers and device developers will need to test integrated designs to avoid problems caused by confusing data from different GNSS.

“By offering the BeiDou Phase 3 signals, our customers can test their designs well before the system is fully operational, which is expected in 2020,” said Stuart Smith, lead product manager. “With signals already starting to appear, it is important for developers to have test tools that can ensure devices will successfully make use of all GNSS signals.”

BeiDou Phase 3 signals are available immediately on the GSS7000 and GSS9000 simulators, and existing users can obtain the software upgrade by contacting Spirent.


What’s Behind Qualcomm’s Huge New Profit Promise

By Aaron Pressman

It seemed like quite the stunner: besieged mobile chipmaker Qualcomm promising this week that it could earn as much as $7.50 per share next year, almost twice what Wall Street expects. But dig into the numbers and the assumptions and the promises, and there may be less there than meets the eye.

Qualcomm CEO Steve Mollenkopf is already dancing as fast as he can. He has to revive the company’s core mobile business, which has seen revenue shrink for three consecutive years. He’s also fighting off an unwanted $105 billion takeover from Broadcom and its hard-charging CEO Hock Tan. Oh, and Mollenkopf still needs to close Qualcomm’s own $47 billion acquisition of NXP Semiconductors, all while engaging in a fierce legal battle with Apple, once the company’s best customer.

So there was Mollenkopf, looking sharp in a navy suit and blue tie, perched on a high stool alongside his top lieutenants on Tuesday, appearing in a surprise half-hour video presentation to investors. The CEO promised Qualcomm would bring in $6.75 to $7.50 per share in adjusted profit in 2019 (versus the current average analyst forecast of just $3.77, according to FactSet), largely due to the coming adoption of the next generation of wireless technology known as 5G.

“Qualcomm today is at an important inflection point. Think about a world in which everything is connected…This is the world of 5G, which will impact almost every facet of people’s lives,” Mollenkopf explained. “Only a small handful of companies invest in the R&D that enables each generation of mobile technology. As we did with 3G and 4G, Qualcomm has been leading the development of 5G.”

Get Data Sheet, Fortune’s technology newsletter.

The aggressive stance perked up investors, with Qualcomm's shares gaining almost 5% in mid-day trading on Wednesday. But at $68.57, the stock remains slightly below Broadcom's current $70 offer price. It's hard to tease out just what the price level means: some investors could be discounting the takeover bid because it would take a long time to get antitrust approvals, while others may believe Qualcomm will successfully fend off the suitors and bring in the larger profits it is promising.

Broadcom’s stock price has risen about 1% since the presentation.

The unwanted suitor also issued its own response to the presentation, saying that Qualcomm management had “repeatedly overpromised and under-delivered” on past financial forecasts. “Qualcomm’s approach is a transparent attempt to sell a quick fix by the Qualcomm Board of Directors and management team and an obvious tactic to deny its own stockholders the opportunity to receive a compelling premium for their shares and significant upside potential in the combined company,” the company said in a statement.

When analysts dug into Qualcomm’s 2019 profit promise, however, they discovered that it relied on several key assumptions that had nothing to do with 5G.

First, about $1.50 per share relied on completing the acquisition of NXP, which sells chips to automakers and makers of connected gadgets in the Internet of Things market. Though European regulators where NXP is based are reportedly on the verge of approving the transaction, some NXP stockholders have complained that Qualcomm’s $110 per share offer is too low and may have to be raised. If the deal doesn’t go through, Qualcomm said it could boost earnings per share via a massive stock buyback instead.

Analysts typically don’t include the profit contribution from a company being acquired until the deal is closed, so that component of Mollenkopf’s profit promise wasn’t really much of an upside surprise.

Another almost 60 cents would come from a new $1 billion cost-cutting program Mollenkopf unveiled on Tuesday without giving many details.

And, most uncertain of all, a final $1.50 to $2.25 would come from settling outstanding legal disputes with Apple and another unidentified phonemaker. There has been no evidence, however, that Apple is much interested in coming to the table without huge concessions from Qualcomm that could undermine its entire business of collecting royalties from licensing its technology to phonemakers.

Deduct those three less-than-definite components, and Qualcomm’s promise is equal to only about $3.18 per share in 2019, analyst Stacy Rasgon of Bernstein Research calculated, considerably less than Wall Street currently expects. That’s because Qualcomm may have to pay higher taxes under the new corporate tax rules adopted last year, which limit the use of some kinds of international tax avoidance strategies. And Apple is likely to cut its purchases from Qualcomm further as it looks for alternative chip suppliers for the iPhone, Rasgon noted.

Even Qualcomm’s projected gains from 5G may be overly optimistic, wrote Tim Long, an analyst at BMO Capital. “Management likely believes that the core mobile business will grow faster, but we are more cautious on 5G,” Long said in a report after the presentation. “Management expects (earnings per share) to grow twice as fast as revenues, though the company has struggled in the past to grow EPS faster than sales.”

Qualcomm’s recent history doesn’t give investors much confidence, either, the analysts said. With the burgeoning legal battles over royalties paid by phonemakers and a slowing of sales growth of mobile phones, Qualcomm’s revenue has dropped from $26.5 billion and adjusted earnings per share of $5.27 at its peak in 2014 to $23.2 billion and $4.28 per share last year. Wall Street expects revenue to bottom out at $22.8 billion this year and then rebound to $23.4 billion next year, according to FactSet (not including the NXP deal, which would add another $9 billion or more of annual revenue). Adjusted earnings are forecast to hit a low of $3.48 this year and then rise to $3.77 in 2019.

But the fight with Broadcom, which has nominated its own slate of candidates for Qualcomm’s upcoming board election, seems to have lit a fire under underperforming Qualcomm, Nomura Instinet analyst Romit Shah noted.

“Qualcomm leadership is very smart, but over the last several years, the San Diego-based management team at times has been unassertive and complacent,” Shah wrote in a report on Tuesday. “Though now with Broadcom’s hostile takeover attempt analogous to a ‘gun to the head,’ we expect the company to more aggressively focus on driving shareholder value in order to remain a standalone franchise.”


Driving the transition to Smart Transport Networks

By Zenobia Hegde

Global surveillance solutions provider Synectics has published a white paper to help transport operators gear up for an increasingly urbanised future. With estimates suggesting that 70% of the world’s population will be living in towns and cities in just three years’ time, the free resource aims to help operators handle and secure ever-increasing urban flows, and implement significant safety improvements, towards the goal of a Smart Transport Network.

The Synectics white paper, entitled ‘Smart Transport Networks: Integration, Interoperability and IoT’, looks at how evolving surveillance, data management, and edge-device technologies can be used to unify disparate technologies and systems, to create Smart Transport Networks, meet Smart City objectives and deliver connected services to customers.

The paper helps operators make the most of current data, surveillance and safety assets by providing practical advice about integrating both IP and analogue technologies, particularly those responsible for the operation of bus, rail, and light rail transport networks. It also illustrates potential customer improvements by taking the reader on a fully connected passenger journey, highlighting where converged technology can play an important role, such as sending alerts to an individual’s phone if their luggage is unexpectedly moved.

Iain Stringer, divisional director – Mobile Systems at Synectics said: “Transport is perhaps the most critical of all urban services given the imperative need to maintain the flow of people and goods. As our transport systems get busier, technology frameworks that unify systems and technologies are providing live, 360-degree oversight of journeys, as well as a platform to communicate more effectively with passengers and third-party operators.

“Not only can this streamline operations by delivering all relevant information at a glance, such as during an incident, but it can also help operators to reduce costs and more efficiently handle information requests from Police and other authorities.

“This white paper explains the practical steps towards systems convergence for those charged with the management of transport or surveillance data.”

Synectics designs and deploys field-proven systems for both infrastructure (stations, stops, control rooms) and on-vehicle (trams, trains, buses, coaches), making the company one of only a handful of suppliers able to deliver end-to-end, surveillance and security solutions spanning all aspects of transport.

Every year its solutions protect over 1 billion passengers travelling on one of Europe’s busiest metro networks, and over 3 billion worldwide, providing Synectics with a frontline view of changing industry requirements and expectations.

For more information on Synectics’ surveillance solutions for transport operators, please click here.


Machfu Announces Release of MACH-3 Industrial Internet of Things Platform and Gateway

By IoT – Internet of Things

Machfu, an Industrial Internet of Things (IIoT) technology company, announced today the release of its MACH-3 IIoT Gateway, a device allowing companies in energy, water, oil and gas as well as other industrial segments to seamlessly connect legacy infrastructure to cloud based IoT and existing SCADA systems. The product was designed to both simplify the […]


Preventing ‘Techlash’ in 2018: Regulatory Threats

By Megan L. Brown

U.S. Chamber of Commerce President Thomas J. Donohue on January 10, 2018, warned that “techlash” is a threat to prosperity in 2018. What was he getting at? A “backlash against major tech companies is gaining strength — both at home and abroad, and among consumers and governments alike.” “Techlash” is a shorthand reference to a variety of impulses by government and others to shape markets, services, and products; protect local interests; and step in early to prevent potential harm to competition or consumers.

These impulses lead to a variety of actions and legal standards that can slow or change the trajectory of innovations from artificial intelligence to the Internet of Things (IoT) to business process improvements. According to Mr. Donohue, “[w]e must be careful that this ‘techlash’ doesn’t result in broad regulatory overreach that stifles innovation and stops positive advancements in their tracks.” Here are a few examples of the challenges ahead:

  • Global privacy and security regulations impose compliance obligations and erect barriers to the free flow of data, products, and services. Examples include the European Union’s General Data Protection Regulation (GDPR), its Network and Information Security (NIS) Directive, e-Privacy initiative, and a nascent effort on IoT certifications. “A growing number of countries are making it more expensive and time consuming, if not illegal, to transfer data overseas.” [1] China’s new cyber law “requires local and overseas firms to submit to security checks and store user data within the country.” [2] Such efforts may be intended to level the playing field with large U.S. technology companies, but whatever their impetus, they create enormous compliance costs and impediments to multinational operations. [3] Emerging regulation around the world may do more harm than good, particularly to U.S.-based organizations.
  • Premature regulation and oversight drives up the costs of doing business, particularly for new entrants or disruptors. Government should act only when it has evidence of actual harms to consumers or competition and the benefits outweigh the costs. When government rushes in with a technical mandate, innovation suffers. Likewise, if the government demands business changes without evidence of anti-competitive effects, it distorts the marketplace. Premature regulations impose unnecessary compliance burdens, so governments should exercise “regulatory humility” and wait for experience and evidence.
  • Unjustified class action litigation over technology strikes fear in the hearts of innovators. The growth of “no injury” lawsuits targeting the technology sector is likewise a concern. Class action plaintiffs were quick to sue GM and Toyota after news reports of a vulnerability in Jeeps, and dozens of plaintiffs immediately sued Intel after chip processor vulnerabilities named Meltdown and Spectre were reported. [4] While courts have generally rejected suits based on “risk of hacking,” [5] plaintiffs continue to push these theories, along with novel “economic loss” claims from “overpaying for” [6] vulnerable devices. Legal uncertainty about such claims, and the rush to obtain damages awards and attorneys’ fees, threatens to increase costs and chill companies’ willingness to engage.
  • State laws, such as those attempting to impose “net neutrality” and online privacy obligations at the state level, threaten to balkanize regulation of technology. “Lawmakers in at least six states, including California and New York, have introduced bills in recent weeks that would forbid internet providers to block or slow down sites or online services.” [7] State-by-state regulation of global ISP and carrier network practices is likely to create major inefficiencies. Likewise, state privacy laws create complexity for organizations whose operations, products, and customers cross state lines. Industry has decried “balkanized privacy regulation at the state level” which creates “a hazardous web of conflicting state-by-state laws for any company operating in the online space.” [8]
  • Local barriers, like restrictive zoning regimes, stunt technology deployment and innovation. Tomorrow’s innovations in health care, transportation, conservation, entertainment, and more depend on a robust technology infrastructure, including telecommunications facilities. [9] But many local jurisdictions are hesitant to allow deployment in public rights-of-way, and others see the explosion of small cell telecommunications facilities as a revenue stream. [10] Local barriers to deployment will slow innovation in communications technology, which may make many communities, and the United States at large, less competitive in the global economy. This is particularly troubling as other countries, like Japan and South Korea, welcome the next generation of communications technology.

2018 will be an important year for global regulation of technology, as issues from privacy to cybersecurity to competition percolate in legislatures around the world. As we enter what some call the Fourth Industrial Revolution, governments have to consider their role in supporting innovation. Hopefully the United States continues to lead by example, resisting “techlash” with a light regulatory touch and a lot of humility. The United States likewise should urge other countries not to punish success, and instead let innovators — not regulators — create the future.

[1] Cross-Border Data Flows: Where Are the Barriers, and What Do They Cost? https://itif.org/publications/2017/05/01/cross-border-data-flows-where-are-barriers-and-what-do-they-cost

[2] T. Miles, U.S. asks China not to enforce cyber security law, Reuters (Sept. 26, 2017) https://www.reuters.com/article/us-usa-china-cyber-trade/u-s-asks-china-not-to-enforce-cyber-security-law-idUSKCN1C11D1

[3] Ann M. Beauchesne, Megan Brown, Sean Heather, Principles for IoT Security; The IoT Revolution and Our Digital Security (Sept. 2017), https://www.uschamber.com/IoT-security

[4] See S. Czarnecki, Intel faces dozen class action lawsuits over chip flaws, https://www.prweek.com/article/1454201/intel-faces-dozen-class-action-lawsuits-chip-flaws (Jan. 10, 2018).

[5] Cahen v. Toyota Motor Corp., No. 16-15496 (9th Cir. Dec. 21, 2017) https://scholar.google.com/scholar_case?case=7591856924921942948&hl=en&as_sdt=6&as_vis=1&oi=scholarr

[6] Id. While the court in Cahen found that the “economic loss theory is not credible, as the allegations that the vehicles are worth less are conclusory and unsupported by any facts,” a future Plaintiff may survive a motion to dismiss with stronger allegations.

[7] C. Kang, States Push Back After Net Neutrality Repeal, N.Y. Times (Jan. 11, 2018) https://www.nytimes.com/2018/01/11/technology/net-neutrality-states.html

[8] Et tu, California? ISP Privacy Bill Moving through the Legislature (June 21, 2017) https://www.ana.net/blogs/show/id/rr-blog-2017-06-et-tu-california

[9] Thomas K. Sawanobori & Paul V. Anuszkiewicz, CTIA, High Band Spectrum: The Key to Unlocking the Next Generation of Wireless, 1, (June 13, 2016), https://www.ctia.org/docs/default-source/default-document-library/5g-high-band-white-paper.pdf

[10] See Jonathan Babcock, Joshua Turner, and Anna Gomez, 5G Deployment Faces Unique Challenges Across The US, Law360 (Aug. 1, 2017) https://www.law360.com/articles/950330/5g-deployment-faces-unique-challenges-across-the-us

Written by Megan L. Brown, Partner at Wiley Rein LLP
