If you’re looking to ‘do’ digital transformation, read this first

By Jon Collins

Barely a day goes by in the tech press without some mention of the importance of digital transformation to businesses, each accompanied by the caveat that nobody really knows what it is. Without engaging further in that debate, what are the absolutes, and what really matters?

1. That it’s all about the data. Everything.

However we phrase things, the singular, most significant change that technology has brought over the past 100 years is the ability to generate, store, process and transmit inordinate quantities of data. Whatever ‘revolution’ or ‘wave’ we might want to say we are in right now, be it digital, industrial or whatever, there is only really one — the information revolution.

Despite exponential appearances (and the resulting perceived impetus for dramatic change), this trend continues with a certain linearity: even as we double the number of pixels on a sensor, for example, or transistors on a processor, our abilities increase at a steadier pace. In business terms, the challenges of integration, capacity planning or service level management are much the same now as they were a decade ago; we are simply working at a higher level of resolution.

2. That technology is enabling us to do new things

This still leaves room for breakthroughs, when technology passes certain thresholds. We saw, for example, the quite sudden demise of the cathode-ray television in favour of LCD screens, or indeed that of film in favour of digital cameras. What we see as waves are quite often technologies passing these thresholds — so, for example, the Internet of Things is a consequence of having sufficient connectivity, with low-cost sensors and ‘edge’ processing. We may be seeing another approaching with machine learning and AI.

It’s useful to compare these moments of “release of innovation” with the previous point: that many consequences are evolutionary, not revolutionary, in their impact. This dichotomy drives much technology-related marketing: a new advance can have significant specific impacts even if it does not change the world; however, it will be presented as enabling the latter even if it will only really achieve the former. Case in point — digital cameras have not made us better photographers, nor has CRM made organisations better at customer service.

3. That we tend to do the easy or cheap stuff, as consumers and businesses

Many innovations happen through ‘pull’ rather than ‘push’. We can spend our lives putting together complex business cases that demonstrate clear ROI, but even as we do, we know they are really lip service to due diligence. At work, as at home, a great deal of technology adoption happens because it makes our lives easier — those explaining the extraordinary rise of Amazon, Facebook and so on emphasise ecosystems, platforms and networks, and treat our own laziness and desire for a simple life as an afterthought. In business, meanwhile, money saving is a far greater enabler of technology adoption than the potential for business growth.

The CBA factor is of inordinate importance, and yet it gets little mention: it is as though we are embarrassed to admit our own weaknesses. Interestingly, its corollary (“resistance to change”) does get a mention when looking to explain large project failures. But here is the fact: many of the great technology advances occur because they are easier, and they stumble when they are not. That people still like books or printed reports can be explained as much through ease of use as through the need to hold something physical. The perceived need for ‘transformation’ comes from the idea that, against such inertia, some big, aspirational change is necessary.

4. That nobody knows what the next big thing will be

As my old boss and mentor once said, innovations are like route markers — it’s important to see them as points on a journey rather than as a destination. Doing so, however, goes against two major schools of thought. The first comes from technology vendors, who want you to believe that their latest box of tricks will indeed bring nirvana. The second comes from consulting firms, whose thought-leadership role diminishes significantly if their advice is framed in terms of observational mentoring (a good thing) as opposed to somehow holding the keys to the kingdom.

There is no promised land, and neither is there a crevasse we will all fall into, yet we persist in looking through the wrong end of the telescope, framing business needs in terms of solutions rather than putting the needs themselves first. Sometimes this is done so subtly by marketers that it can be difficult to spot: back in the days of “service oriented architecture”, for example, it took me a while to realise that its main proponents happened to have a specific product in mind (an “enterprise service bus”). That isn’t necessarily wrong, but it is worth following the money.

5. That we are not yet “there”, nor will we ever be

As a species, particularly in times of great change, we need a level of certainty at a very deep, psychological level, and it is messing with our ability to act. It’s too easy to pander to the need for a clear answer, buying into the current rhetoric in the hope that the latest advance might really work this time. All sides are at fault — those purveying solutions, those buying them and those acting as trusted third parties — but who wants to hear anyone say “it’s not going to work”?

Each time round the cycle, we come up with new terms and subtly change their definitions — Industry 4.0 or smart manufacturing might mean the same, or very different, things depending on who you ask, a symptom of our desperation to understand, and adapt to, what is going on (after all, haven’t we been told to ‘adapt or die’?).

Interestingly, the companies that we applaud, or fear, the most may well be those that care the least. Amazon, Uber, Tesla and the rest of them don’t know what’s around the corner, and what is more they don’t see this as a priority — they simply want to still be in the game this time next year. Rightly so, as they were born into uncertainty, forged through some indecipherable and unrepeatable combination of factors. Why did Facebook succeed when Myspace, Bebo or any other high-valuation predecessor did not, for example? Above all, these organisations have an attitude to change, a mindset that sees uncertainty, and therefore responsiveness, as the norm. Jeff Bezos’ articulation of Amazon’s “Day One” approach to business strategy offers a fantastically simple, yet utterly profound illustration.

6. Responsiveness is the answer, however you package it

Where does this leave us? The bottom line is that “digital transformation” is the latest attempt to provide a solid response to uncertain times, and as such remains empty words for many. It isn’t actually relevant what it is, other than a touchstone term which will soon be replaced (you can thank the marketers for that). So debate it by all means, just as you might once have debated business process management, or social networking, or hybrid cloud, or whatever tickles your fancy. As you do so however, recognise such procrastination for what it is.

And then, once you are done, take action, over and over again. Transformation doesn’t matter, unless we are talking about the transformation of mindsets and attitudes, from build-to-last to do-it-fast. That’s why agile approaches such as DevOps are so important, not in themselves (yes, that would be putting the cart before the horse again) but because they give businesses a way to innovate at speed. As we continue on this data-driven journey and complexity becomes the norm, traditional attitudes to change at the top of business, or indeed of our institutions, become less and less tenable. The bets we make on the future matter less and less; what matters more is our ability to keep making new ones.

Terminology matters not a jot. But are you and your colleagues, at whatever level in your organisation, prepared to change? If the answer is anything other than yes, you have a bigger challenge on your hands than understanding the latest set of buzzwords.

NEC highlights 5G deployment for creating a future beyond imagination at MWC 2018

By Zenobia Hegde

NEC Corporation announced that it will present technologies and solutions for working with telecom carriers on the co-creation of new business models and the implementation of 5G solutions at Mobile World Congress (MWC) 2018, held at the Fira Gran Via, Barcelona, from February 26 to March 1 (Hall 3, stand #3M30).

In today’s business and social climate, telecom carriers are constantly being required to process greater volumes of data at increasingly faster speeds, while also ensuring that transmissions are secure. At the same time, the rapid growth of the Internet of Things (IoT), Artificial Intelligence (AI) and robotics is placing even greater demands on carrier resources.

At MWC 2018, NEC is demonstrating solutions and technologies that help address the needs of both telecom carriers and businesses through its “5G. A Future Beyond Imagination” concept, which positions NEC and telecom carriers as service enablers for the co-creation of new business models that maximise resources and reinforce earnings across a wide variety of vertical industries, including the security, agriculture and transportation fields.

At the NEC booth, the company’s cutting-edge portfolio of AI technologies, “NEC the WISE,” will be introduced, as well as NEC’s series of biometric authentication solutions, “Bio-IDiom,” which includes some of the world’s fastest and most accurate facial and fingerprint authentication technologies. This is in addition to highlighting NEC’s participation in the FIDO Alliance, which aims to standardise Fast IDentity Online (FIDO).

Moreover, NEC will demonstrate the advanced solutions that make it a leader in mobile backhaul, network optimisation through traffic management solutions (TMS) and edge computing, as well as software-defined networking (SDN) and network functions virtualisation (NFV), all of which contribute to the growth of telecom carriers.

For more detail on NEC’s participation in Mobile World Congress 2018, please click here.

Antenova to show two new high performing 4G/LTE diversity antennas for small PCBs

By Zenobia Hegde

Antenova Ltd, manufacturer of antennas and RF antenna modules, is showing a brand new pair of high-performing 4G/LTE antennas, suitable for PCBs as small as 60 mm, at the consumer electronics show CES. The two antennas can also be used in 3G and MIMO applications.

The two antennas are similar – the difference being that Inversa is built for the USA market while Integra is for European and Asian markets.

Both antennas are available in left and right versions to provide more options for placement on the PCB, and can be used singly or in pairs for MIMO. Both use beam steering to ensure good isolation and cross correlation, and achieve high performance.

Inversa, part numbers SR4L034-L/SR4L034-R, measures 28.0 x 8.0 x 3.3 mm and covers the USA bands 698-798 MHz, 824-960 MHz, 1710-2170 MHz, 2300-2400 MHz and 2500-2690 MHz.

Integra, part numbers SR4L049-L/SR4L049-R, measures 23.0 x 8.0 x 3.3 mm and covers the bands 791-960 MHz, 1710-2170 MHz, 2300-2400 MHz and 2500-2600 MHz, used in Europe and Asia.

Antenova has designed these antennas for use in small trackers, OBDs and other similar devices where space is limited. For more details, antenna samples and evaluation boards, please click here.

Fueled by Kafka, Stream Processing Poised for Growth

By Alex Woodie

Once a niche technique used only by the largest organizations, stream processing is emerging as a legitimate technique for dealing with the massive amounts of data generated every day. While it’s not needed for every data challenge, organizations are increasingly finding ways to incorporate stream processing into their plans — particularly with the rise of Kafka.

Stream processing is just that – processing data as soon as it arrives, as opposed to processing it after it lands. The amount of processing applied to the data as it flows can vary greatly. At the low end, users may do very little beyond a simple transformation, such as converting temperatures from Celsius to Fahrenheit or combining one stream with another, while at the upper end, stream processors may apply real-time analytics or machine learning algorithms.
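To make "processing data as soon as it arrives" concrete, here is a minimal Python sketch, illustrative only and not tied to any product mentioned in this article, that applies the Celsius-to-Fahrenheit transformation to each reading as it flows through a stream rather than after it has landed in storage:

```python
from typing import Iterable, Iterator


def celsius_readings() -> Iterator[float]:
    """Stand-in for a live source: a sensor feed, message queue or socket."""
    for reading in (21.5, 22.0, 23.7, 19.4):
        yield reading


def to_fahrenheit(stream: Iterable[float]) -> Iterator[float]:
    """Transform each reading the moment it arrives, rather than after it lands in storage."""
    for celsius in stream:
        yield celsius * 9.0 / 5.0 + 32.0


if __name__ == "__main__":
    for fahrenheit in to_fahrenheit(celsius_readings()):
        # A downstream sink would go here: analytics, a database, a data lake...
        print(f"{fahrenheit:.1f} F")
```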

Almost any type of data can be used in stream processing. Sources can include a database event from an RDBMS or NoSQL store, sensor data from the IoT, comments made on social media, or a credit card swipe. The data’s destination can be similarly diverse – it could be headed to a traditional file system, a relational or NoSQL database, a Hadoop data lake, or a cloud-based object store.

What happens between that initial data creation event and the moment the data is written to some type of permanent repository is collectively referred to as stream processing. Initially, proprietary products from the likes of TIBCO, Software AG and IBM were developed to handle streaming data. But more recently, distributed, open source frameworks have emerged to deal with the massive surge in data generation.

Apache Kafka — a distributed publish-and-subscribe message queue that’s open source and relatively easy to use — is by far the most popular of these open source frameworks, and Kafka is seen today by industry insiders as helping to fuel the ongoing surge in demand for tools to work with streaming data.
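For readers new to the publish-and-subscribe model described above, the following sketch shows the basic pattern. It is a hedged illustration: it assumes the third-party kafka-python client, a broker reachable at localhost:9092 and a hypothetical topic name, none of which are specified in the article.

```python
import json

from kafka import KafkaProducer, KafkaConsumer  # third-party client: pip install kafka-python

TOPIC = "sensor-events"    # hypothetical topic name, for illustration only
BROKER = "localhost:9092"  # assumed broker address

# Publish a couple of JSON events to the topic.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"sensor": "line-3", "temp_c": 23.7})
producer.send(TOPIC, {"sensor": "line-3", "temp_c": 24.1})
producer.flush()

# A separate process would normally do this part: subscribe and handle events as they arrive.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating if no new messages arrive for 5 seconds
)
for message in consumer:
    print(message.value)
```

In practice the producer and consumer run in separate services, which is precisely what makes Kafka useful as a buffer between data sources and stream processors.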

Steve Wilkes, the CTO and founder of Striim, says Kafka’s popularity is helping to push stream processing into the center stage. “Kafka is driving a lot of our market,” he says. “A good majority of our customers are utilizing Kafka in one way, shape, or form.”

The underlying trend driving investment in stream processing is that customers need access to the latest data, Wilkes says. “It’s the recognition that, no matter how you’re doing analytics — whether you’re doing them in streaming fashion or whether you’re doing them after the fact through some sort of Hadoop jobs or big data analytics — you need that up-to-date data,” he tells Datanami.

Striim this week unveiled a new release of its stream data processing solution, Striim version 3.8, that features better support for Kafka. This includes the capability to automatically scale Striim to more efficiently read from and write to Kafka as users scale up their real-time streaming architecture.

Many Kafka users are using the core Kafka product, along with the open source Kafka Connect software, to rapidly move data from its source to another destination, such as Hadoop or a data lake hosted on the cloud. Fewer shops are using the Kafka Streams API to write application logic on top of the message bus, a niche that third-party vendors are moving to fill.

According to a recent report from Confluent, the company behind open source Kafka and developer of the Confluent Platform, 81% of Kafka customers are using it to build data pipelines. Other common use cases include real-time monitoring, ETL, microservices, and building Internet of Things (IoT) products.

Keeping the data lake updated with fresh data is an increasingly difficult task – and one that stream processing is being asked to fill as a sort of modern ETL role. According to Syncsort‘s recent 2018 Big Data Trends survey, 75% of respondents say that keeping their data lake updated with changing data sources is either “somewhat” or “very difficult.”

Another vendor that’s seeing the Kafka impact is Streamsets, a software vendor that bills itself as the “air traffic control” for data in motion. Streamsets’ initial product was a data collector that automated some of the nitty-gritty work involved in capturing and moving data, often atop the Kafka message queue. The vendor recently debuted a low-footprint data collector that works in CPU- and network-constrained environments, and a cloud-based console for managing the entire flow of a customer’s data.

Streamsets Vice President of Marketing Rick Bilodeau says Kafka is driving a lot of the company’s business. “We do a lot of work with customers for Kafka, for real-time event streaming,” he tells Datanami. “We see fairly broad Kafka adoption as a message queue, where people are using [Streamsets software] primarily to broker data in and out of the Kafka bus.”

Some of Streamsets’ customers have a million data pipelines running at the same time, which can lead to serious management challenges. “Companies will say, ‘We built a bunch of pipelines with Kafka, but now have a scalability problem. We can’t keep throwing people at it. It’s just taking us too long to put these things together,’” Bilodeau says. “So they use data collector to accelerate that process.”

Today, Streamsets sees lots of customers implementing real-time stream processing for Customer 360, cybersecurity, fraud detection, and industrial IoT use cases. Stream processing is still relatively new, but it’s beginning to grow in maturity rapidly, Bilodeau says.

“It’s not the first inning, for sure. It’s maybe the third inning,” he says. “On the Gartner Hype Cycle, it’s approaching early maturity. Every company seems to have something they want to do with streaming data.”

Striim’s Wilkes agrees. Fewer than half of enterprises are working with streaming data pipelines, he estimates, but it’s growing solidly. “Streaming data wasn’t even really being talked about a few years ago,” he says. “But it’s really starting to get up to speed. There is a steady progression.”

We’re still mostly in the pipeline-building phase, where identifying data sources and creating data integrations dominate real-time discussions, Wilkes says. That period will give way to more advanced use cases as people become comfortable with the technology.

“We’re seeing that a lot of customers are still at the point of obtaining streaming sources. They understand the need to get a real-time data infrastructure,” he says. “The integration piece always comes first. The next stage after you have access to the streaming data is starting to think about the analytics.”

Why is the connection piece so hard?

By IoT Now Magazine

Increased usage of digital technologies by the manufacturing industry is inevitable but, while the shift is gradual, the pressure to go faster is great.

When Hewlett-Packard Enterprise (HPE) and the Industry of Things World Conference conducted a survey to find out how successful industrial IoT (IIoT) projects have been in the last 12 months, the responses uncovered that only 53% of respondents thought their IIoT projects had met or exceeded goals. The remaining 47% said their goals had not been reached.

IIoT isn’t about companies buying a technology and suddenly they’re digital. It’s an entire architecture which encompasses an ecosystem with careful communication across various touchpoints within an organisation, all of which requires common standards as well as new technology architectures to create convergence of information technology (IT) and operational technology (OT).

One critical and common chokepoint is a lack of understanding about device connection. Even if devices are connected, there are often no simple tools to manage them and to extract, transmit and translate the data from one language into another; for example, the transmission and translation of programmable logic controller (PLC) data into enterprise resource planning (ERP) systems.
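As a rough sketch of that extract-and-translate step, the bridge below reads a value from a PLC and forwards it to an ERP system in the vocabulary the ERP expects. Everything in it is hypothetical: the PLC read is a placeholder for a protocol-specific driver call, and the ERP REST endpoint is invented for illustration.

```python
import time

import requests  # assumes the ERP system exposes a REST endpoint; real integrations vary widely

ERP_URL = "https://erp.example.com/api/work-orders/123/progress"  # hypothetical endpoint


def read_plc_counter() -> int:
    """Hypothetical stand-in for a protocol-specific read (Modbus, OPC UA, a proprietary driver)."""
    return 42  # e.g. parts produced since the shift started


def forward_to_erp(parts_produced: int) -> None:
    # Translate the raw register value into the vocabulary the ERP system expects.
    payload = {"quantityCompleted": parts_produced, "reportedAt": time.time()}
    requests.post(ERP_URL, json=payload, timeout=5)


if __name__ == "__main__":
    forward_to_erp(read_plc_counter())
```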

Let’s look at the top five things to consider about the connection puzzle and how to weave them into your overall plan.

1. Getting things connected is easier said than done

During the discovery phase, many IIoT vendors gloss over this. Once manufacturers decide to take the plunge, they suddenly realise that connecting to all these different devices – legacy and modern, proprietary and open source – is really difficult, and results in significant delays which blow up the original projected timeframe.

If you’ve ever engineered systems on the plant floor, you know there are devices down there that are a nightmare to connect to and to integrate with a variety of other applications. A simple data collection task can end up taking weeks of custom coding.

2. Embrace complexity

There is no single standard way of connecting everything together. Over time, the industrial plant floor has evolved as technology has changed. For better or for worse, this advancement also means more complexity and it is not going away; in fact, it will increase.

As a result, plant floors have a mixture of device brands, different protocols and different proprietary data sets. Embracing complexity means accepting that there are a lot of moving parts in IIoT solutions which we need to link together for success, and that this requires a level of expertise better delivered as a holistic solution than as a complicated patchwork.

3. Prepare for latency

One piece that addresses complexity is Open Platform Communications (OPC). OPC was designed to provide industrial automation with a standard networking protocol, one that requires polling to receive data from devices. Polling means the system must ask the device for data at a preset rate, such as once every second or once every half hour.
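A bare-bones Python sketch of what polling implies in practice follows; the read_tag function is a hypothetical stand-in for an OPC client call, since the article names no specific library or product. Note that the poll interval itself becomes a latency floor: a value that changes just after a poll will not be seen until the next one.

```python
import random
import time

POLL_INTERVAL_SECONDS = 1.0  # the "preset rate" described above


def read_tag(tag_name: str) -> float:
    """Hypothetical stand-in for an OPC read; a real client would query the OPC server here."""
    return 20.0 + random.random() * 5.0


def poll_forever(tag_name: str) -> None:
    while True:
        value = read_tag(tag_name)           # ask the device for data, even if nothing has changed
        print(f"{tag_name} = {value:.2f}")   # hand the value to the next hop in the chain
        time.sleep(POLL_INTERVAL_SECONDS)    # the polling interval sets a floor on data freshness


if __name__ == "__main__":
    poll_forever("Line3.Temperature")
```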

OPC requires multiple steps to send data; it is not point A to point B. A typical path looks like this: PLC […]
