Antenova to show two new high-performing 4G/LTE diversity antennas for small PCBs

By Zenobia Hegde

Antenova Ltd, a manufacturer of antennas and RF antenna modules, is showing a new pair of high-performing 4G/LTE antennas, suitable for PCBs as small as 60 mm, at the consumer electronics show CES. The two antennas can also be used in 3G and MIMO applications.

The two antennas are similar – the difference is that Inversa is built for the USA market, while Integra is for the European and Asian markets.

Both antennas are available in left and right versions to provide more options for placement on the PCB, and they can be used singly or in pairs for MIMO. Both use beam steering to achieve good isolation and low cross-correlation, and hence high performance.

Inversa, part numbers SR4L034-L/SR4L034-R, measures 28.0 x 8.0 x 3.3 mm and covers the USA bands 698-798 MHz, 824-960 MHz, 1710-2170 MHz, 2300-2400 MHz and 2500-2690 MHz.

Integra, part numbers SR4L049-L/SR4L049-R, measures 23.0 x 8.0 x 3.3 mm and covers the bands 791-960 MHz, 1710-2170 MHz, 2300-2400 MHz and 2500-2600 MHz used in Europe and Asia.

Antenova has designed these antennas for use in small trackers, OBDs and other similar devices where space is limited. For more details, antenna samples and evaluation boards, visit Antenova’s website.


Fueled by Kafka, Stream Processing Poised for Growth

By Alex Woodie

Once a niche technique used only by the largest organizations, stream processing is emerging as a legitimate way of dealing with the massive amounts of data generated every day. While it’s not needed for every data challenge, organizations are increasingly finding ways to incorporate stream processing into their plans, particularly with the rise of Kafka.

Stream processing is just that – processing data as soon as it arrives, as opposed to processing it after it lands. The amount of processing applied to the data as it flows can vary greatly. At one end, users may do little more than a simple transformation, such as converting temperatures from Celsius to Fahrenheit, or combine one stream with another; at the other end, stream processors may apply real-time analytics or machine learning algorithms.
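
To make the lower end of that spectrum concrete, here is a minimal sketch of such a transformation written against the Apache Kafka Streams API (Kafka itself is discussed below). The topic names, application id and broker address are illustrative assumptions, not details from the article.

```java
// Minimal Kafka Streams sketch: convert each Celsius reading to Fahrenheit
// as it arrives, rather than after it lands in storage.
// Topic names and the broker address are hypothetical.
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class TempConverter {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "temp-converter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> celsius = builder.stream("temps-celsius");
        // The entire "stream processor" here is one stateless transformation.
        celsius.mapValues(c -> String.valueOf(Double.parseDouble(c) * 9.0 / 5.0 + 32.0))
               .to("temps-fahrenheit");

        new KafkaStreams(builder.build(), props).start();
    }
}
```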

Almost any type of data can be used in stream processing. Sources can include database events from relational or NoSQL databases, sensor data from the IoT, comments made on social media, or credit card swipes. The data’s destination can be similarly diverse: a traditional file system, a relational or NoSQL database, a Hadoop data lake, or a cloud-based object store.

Everything that happens between that initial data creation event and the moment the data is written to some type of permanent repository is collectively referred to as stream processing. Initially, proprietary products from the likes of TIBCO, Software AG and IBM were developed to handle streaming data. More recently, distributed, open source frameworks have emerged to deal with the massive surge in data generation.

Apache Kafka, a distributed publish-and-subscribe message queue that is open source and relatively easy to use, is by far the most popular of these open source frameworks, and industry insiders today see Kafka as helping to fuel the ongoing surge in demand for tools for working with streaming data.
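
As a rough illustration of that publish-and-subscribe model, the sketch below writes a single event to a topic and a separate subscriber reads it back; the topic name, group id and broker address are hypothetical.

```java
// Publish/subscribe in plain Kafka: a producer appends an event to a topic,
// and any consumer group subscribed to that topic receives it.
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class PubSubSketch {
    public static void main(String[] args) {
        Properties prod = new Properties();
        prod.put("bootstrap.servers", "localhost:9092");
        prod.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        prod.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(prod)) {
            producer.send(new ProducerRecord<>("events", "sensor-42", "temp=21.5"));
        }

        Properties cons = new Properties();
        cons.put("bootstrap.servers", "localhost:9092");
        cons.put("group.id", "event-readers");
        cons.put("auto.offset.reset", "earliest");
        cons.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        cons.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cons)) {
            consumer.subscribe(Collections.singletonList("events"));
            for (ConsumerRecord<String, String> r : consumer.poll(Duration.ofSeconds(5))) {
                System.out.printf("%s -> %s%n", r.key(), r.value());
            }
        }
    }
}
```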

Steve Wilkes, the CTO and founder of Striim, says Kafka’s popularity is helping to push stream processing to center stage. “Kafka is driving a lot of our market,” he says. “A good majority of our customers are utilizing Kafka in one way, shape, or form.”

The underlying trend driving investment in stream processing is that customers need access to the latest data, Wilkes says. “It’s the recognition that, no matter how you’re doing analytics — whether you’re doing them in streaming fashion or whether you’re doing them after the fact through some sort of Hadoop jobs or big data analytics — you need that up-to-date data,” he tells Datanami.

Striim this week unveiled a new release of its stream data processing solution, Striim version 3.8, that features better support for Kafka. This includes the capability to automatically scale Striim to more efficiently read from and write to Kafka as users scale up their real-time streaming architecture.

Many Kafka users are using the core Kafka product, along with the open source Kafka Connect software, to rapidly move data from its source to another destination, such as Hadoop or a data lake hosted on the cloud. Fewer shops are using the Kafka Streams API to write application logic on top of the message bus, a niche that third-party vendors are moving to fill.

According to a recent report from Confluent, the company behind open source Kafka and developer of the Confluent Platform, 81% of Kafka customers are using it to build data pipelines. Other common use cases include real-time monitoring, ETL, microservices, and building Internet of Things (IoT) products.

Keeping the data lake updated with fresh data is an increasingly difficult task, and one that stream processing is being asked to take on in a sort of modern ETL role. According to Syncsort’s recent 2018 Big Data Trends survey, 75% of respondents say that keeping their data lake updated with changing data sources is either “somewhat” or “very difficult.”

Another vendor that’s seeing the Kafka impact is Streamsets, a software vendor that bills itself as the “air traffic control” for data in motion. Streamsets’ initial product was a data collector that automated some of the nitty-gritty work involved in capturing and moving data, often atop the Kafka message queue. The vendor recently debuted a low-footprint data collector that works in CPU- and network-constrained environments, and a cloud-based console for managing the entire flow of a customer’s data.

Streamsets Vice President of Marketing Rick Bilodeau says Kafka is driving a lot of the company’s business. “We do a lot of work with customers for Kafka, for real-time event streaming,” he tells Datanami. “We see fairly broad Kafka adoption as a message queue, where people are using [Streamsets software] primarily to broker data in and out of the Kafka bus.”

Some of Streamsets’ customers have a million data pipelines running at the same time, which can lead to serious management challenges. “Companies will say, ‘We built a bunch of pipelines with Kafka, but now have a scalability problem. We can’t keep throwing people at it. It’s just taking us too long to put these things together,’” Bilodeau says. “So they use data collector to accelerate that process.”

Today, Streamsets sees lots of customers implementing real-time stream processing for Customer 360, cybersecurity, fraud detection, and industrial IoT use cases. Stream processing is still relatively new, but it’s beginning to grow in maturity rapidly, Bilodeau says.

“It’s not the first inning, for sure. It’s maybe the third inning,” he says. “On the Gartner Hype Cycle, it’s approaching early maturity. Every company seems to have something they want to do with streaming data.”

Striim’s Wilkes agrees. Fewer than half of enterprises are working with streaming data pipelines, he estimates, but it’s growing solidly. “Streaming data wasn’t even really being talked about a few years ago,” he says. “But it’s really starting to get up to speed. There is a steady progression.”

We’re still mostly in the pipeline-building phase, where identifying data sources and creating data integrations dominates real-time discussions, Wilkes says. That phase will give way to more advanced use cases as people become comfortable with the technology.

“We’re seeing that a lot of customers are still at the point of obtaining streaming sources. They understand the need to get a real-time data infrastructure,” he says. “The integration piece always comes first. The next stage after you have access to the streaming data is starting to think about the analytics.”

Related Items:

Spark Streaming: What Is It and Who’s Using It?

How Kafka Redefined Data Processing for the Streaming Age


Sengled Partners with Baidu to Introduce China’s First Voice-Activated Smart Lamp Speaker

By IoT – Internet of Things

Sengled is partnering with leading AI company Baidu to introduce the Sengled Smart Lamp Speaker, a new voice-enabled lighting concept. Both companies will showcase the Sengled Smart Lamp Speaker at the annual CES in Las Vegas, NV, January 9-12, 2018, with the official reveal taking place at Baidu World @ Las Vegas on January 8, 2018. The […]


Major Breakthroughs in Smart Devices and IoT from Taiwan to be Unveiled at CES 2018

By IoT – Internet of Things

Kicking the new year off with a bang, the Taiwan External Trade Development Council (TAITRA), Taiwan’s foremost trade promotion organization, will provide a first look at brand new “smart” innovations from ITRI, AEON Motor, GEOSAT, Robotelf, and Taiwan Main Orthopedics at a January 8 press conference at CES 2018. The companies will showcase products poised […]


Predicting the maintenance

By Zenobia Hegde

It happens at the worst of times: late for a meeting, on the way to the rugby, even when you’re desperate for the bathroom. When your car breaks down, you can only moan in retrospect, acknowledging the signs that it needed urgent maintenance. Thanks to technology, and more specifically the evolution and application of cognitive learning, these frustrating occurrences will become a thing of the past.

Connecting the things

Analyst house Gartner forecasts that there will be 20.8 billion connected ‘things’ worldwide by 2020. Enterprises that stick to an old ‘preventive’ data methodology, says Mark Armstrong, managing director and vice president of International Operations, EMEA & APJ at Progress, are going to be left behind, as this approach accounts for a mere 20% of failures.

Predictive maintenance brings a proactive, resource-saving opportunity. Predictive software can not only alert the manufacturer or user when equipment failure is imminent, but also carry out the maintenance process automatically ahead of time. The prediction is calculated from real-time data, via metrics including pressure, noise, temperature, lubrication and corrosion, to name a few.
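
As a purely illustrative sketch of that idea (not any vendor’s actual product), the snippet below folds a few of the metrics named above into a single degradation score and raises an alert ahead of a predicted failure. The weights, nominal ranges and alert threshold are invented for the example.

```java
// Hypothetical degradation model: normalise each live metric against its
// nominal operating range, combine them into a weighted score, and alert
// before the predicted failure rather than after the breakdown.
public class PredictiveMaintenanceSketch {

    static double degradationScore(double pressureBar, double noiseDb,
                                   double tempC, double vibrationMmS) {
        // Weights and ranges are invented; a real model would be learned
        // from historical failure data.
        return 0.3 * (pressureBar / 10.0)
             + 0.2 * (noiseDb / 100.0)
             + 0.2 * (tempC / 120.0)
             + 0.3 * (vibrationMmS / 25.0);
    }

    public static void main(String[] args) {
        double score = degradationScore(9.2, 88.0, 105.0, 21.0);
        if (score > 0.8) {
            System.out.println("Failure imminent, schedule maintenance now (score=" + score + ")");
        }
    }
}
```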

By considering degradation patterns that illustrate the wear and tear of the vehicle in question, the production process is subject to far less interruption than it would be without the technology. By monitoring systems ‘as live’, breakdowns can be prevented before they happen.

It’s no longer a technological fantasy. Because data from cars has been collected for decades, researchers and manufacturers can gather insights to feed predictive analytics, which will assist in predicting which individual cars will break down and need maintenance.

Now that the Internet of Things (IoT) is a reality, car manufacturers can use this information to offer timely and relevant additional customer services based on sophisticated software that can truly interrogate, interpret and use data. So who is going to be responsible for taking advantage of this technology?

Bolts and screws

Key management figures in the transport industry must commit to a maintenance management approach in order to implement a long-term technological solution. As described by R. Mobley, run-to-failure management sees an organisation refrain from spending money in advance, reacting only to machine or system failure. This reactive method may result in high overtime labour costs, high machine downtime and low productivity.

Similarly reactive, preventive maintenance monitors the mean-time-to-failure (MTTF), based on the principle that new equipment is at its most vulnerable during its first few weeks of operation, and again the longer it is used. This can manifest itself in various guises, such as engine lubrication or major structural adjustments. However, predicting the time frame in which a machine will need to be reconditioned may be unnecessary and costly.
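
For concreteness, the MTTF that preventive maintenance schedules around is conventionally estimated as total operating time divided by the number of failures observed; a minimal sketch, with hypothetical sample data:

```java
// Estimate mean-time-to-failure from observed operating hours between
// failures. The sample figures are hypothetical.
public class MttfSketch {

    static double mttfHours(double[] hoursBeforeEachFailure) {
        double total = 0;
        for (double h : hoursBeforeEachFailure) total += h;
        return total / hoursBeforeEachFailure.length;
    }

    public static void main(String[] args) {
        double[] observed = {1200, 950, 1430}; // hours of operation before each failure
        // (1200 + 950 + 1430) / 3 ≈ 1193 hours
        System.out.println("Estimated MTTF: " + mttfHours(observed) + " hours");
    }
}
```

Preventive maintenance would recondition equipment comfortably inside that estimate, which is exactly where the unnecessary cost described above can creep in.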

As an alternative option, predictive maintenance allows proactivity, ensuring longer intervals between scheduled repairs whilst reducing the significant number of crises that have to be addressed due to mechanical faults. With a cognitive predictive model, meaning applications are able to teach themselves as they function, organisations will be able to foresee exactly why and when a machine will break down, allowing them to act […]
