
Setting Sail: Next Stop, Caribbean!

By Bevil Wooding

We are excited to announce the launch of new ARIN in the Caribbean 2018 activities and to open registration for our first two events, taking place in Grenada and Barbados.

Similar to our ARIN on the Road events, these are one-day programs featuring information on our services, as well as on how we can help you and your organization design, secure, and maintain robust networks and contribute to Internet numbering policy development for the region.

ARIN in the Caribbean events are free to attend and offer a great environment to learn and share. The program includes presentations on timely topics such as obtaining IPv6 addresses from ARIN and transfers of number resources. In addition, there will be presentations on current policy discussions, ARIN technical services, and best practices for building resilient Caribbean networks.

The agenda for our upcoming meetings will cover the following topics:

  • ARIN’s Mission and Core Functions
  • ARIN Technical Services
  • Policy Development at ARIN
  • ARIN and Caribbean Network Autonomy and Resilience
  • IPv4 Services – Waiting List, Transfers, and more
  • IPv6 and ASN Services – Obtaining Resources, Creating Networking Plans
  • ARIN Q&A – Open microphone to answer your questions!

Each day will conclude with an open microphone question and answer session, followed by a drawing for a $100 USD Amazon gift card for those who complete a short survey about the event.

Space is limited at each event, so if you are interested in attending, please register on or before 2 February 2018!

If you are not available to join us in Grenada or Barbados, please visit our brand new ARIN in the Caribbean page for a list of other planned ARIN in the Caribbean events!

The post Setting Sail: Next Stop, Caribbean! appeared first on Team ARIN.

Read more here: teamarin.net/feed/

Antenova to show two new high performing 4G/LTE diversity antennas for small PCBs

By Zenobia Hegde

Antenova Ltd, a manufacturer of antennas and RF antenna modules, is showing a brand new pair of high-performing 4G/LTE antennas, suitable for PCBs as small as 60 mm, at the CES consumer electronics show. The two antennas can also be used in 3G and MIMO applications.

The two antennas are similar – the difference being that Inversa is built for the USA market while Integra is for European and Asian markets.

Both antennas are available in left and right versions to provide more options for placement on the PCB, and can be used singly or in pairs for MIMO. Both use beam steering to ensure good isolation and cross correlation, and achieve high performance.

Inversa, part numbers SR4L034-L/SR4L034-R, measures 28.0 x 8.0 x 3.3 mm and covers the USA bands 698-798 MHz, 824-960 MHz, 1710-2170 MHz, 2300-2400 MHz and 2500-2690 MHz.

Integra, part numbers SR4L049-L/SR4L049-R, measures 23.0 x 8.0 x 3.3 mm and covers the bands 791-960 MHz, 1710-2170 MHz, 2300-2400 MHz and 2500-2600 MHz, used in Europe and Asia.

Antenova has designed these antennas for use in small trackers, OBDs and other similar devices where space is limited. For more details, antenna samples and evaluation boards, please visit the Antenova website.

Comment on this article below or via Twitter: @IoTNow_ OR @jcIoTnow

The post Antenova to show two new high performing 4G/LTE diversity antennas for small PCBs appeared first on IoT Now – How to run an IoT enabled business.

Read more here: www.m2mnow.biz/feed/

Fueled by Kafka, Stream Processing Poised for Growth

By Alex Woodie

Once a niche technique used only by the largest organizations, stream processing is emerging as a legitimate approach to dealing with the massive amounts of data generated every day. While it’s not needed for every data challenge, organizations are increasingly finding ways to incorporate stream processing into their plans — particularly with the rise of Kafka.

Stream processing is just that – processing data as soon as it arrives, as opposed to processing it after it lands. The amount of processing applied to the data as it flows can vary greatly. At the low end, users may do little more than a simple transformation, such as converting temperatures from Celsius to Fahrenheit or combining one stream with another; at the upper end, stream processors may apply real-time analytics or machine learning algorithms.
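To make that low end concrete, here is a minimal sketch of such a transformation written in Java against the Kafka Streams API (Kafka itself is discussed below). The broker address, application id, and topic names (temps-celsius, temps-fahrenheit) are hypothetical placeholders, and readings are assumed to arrive as plain numeric strings:

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class CelsiusToFahrenheit {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "temp-converter");     // hypothetical application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Process each reading the moment it arrives: parse the Celsius value,
        // convert it to Fahrenheit, and forward it to an output topic.
        KStream<String, String> celsius = builder.stream("temps-celsius");    // hypothetical input topic
        celsius
            .mapValues(v -> String.valueOf(Double.parseDouble(v) * 9.0 / 5.0 + 32.0))
            .to("temps-fahrenheit");                                          // hypothetical output topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Nothing in this sketch waits for the data to land in a file system or database first; each record is transformed and forwarded as it flows through.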

Almost any type of data can be used in stream processing. Sources can include database events from an RDBMS or NoSQL store, sensor data from the IoT, comments made on social media, or a credit card swipe. The data’s destination can be similarly diverse – it could be headed to a traditional file system, a relational or NoSQL database, a Hadoop data lake, or a cloud-based object store.

What happens between that initial data creation event and the point at which the data is written to some type of permanent repository is collectively referred to as stream processing. Initially, proprietary products from the likes of TIBCO, Software AG, and IBM were developed to handle streaming data. But more recently, distributed, open source frameworks have emerged to deal with the massive surge in data generation.

Apache Kafka, a distributed publish-and-subscribe message queue that is open source and relatively easy to use, is by far the most popular of these open source frameworks, and industry insiders today see Kafka as helping to fuel the ongoing surge in demand for stream data processing tools.
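As a rough illustration of the publish-and-subscribe model, the sketch below uses the standard Kafka Java clients to publish an event to a topic and then read it back with an independent consumer. The broker address, topic name, payload, and consumer group are assumptions made up for the example:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class PubSubSketch {
    public static void main(String[] args) {
        // Publisher side: write an event to a topic without knowing who will read it.
        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");              // assumed broker address
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("card-swipes", "card-123",      // hypothetical topic and key
                    "{\"amount\": 42.50}"));                                   // hypothetical payload
        }

        // Subscriber side: any number of downstream systems can subscribe independently.
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");
        consumerProps.put("group.id", "fraud-detector");                       // hypothetical consumer group
        consumerProps.put("auto.offset.reset", "earliest");                    // start from the beginning so the record above is seen
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of("card-swipes"));
            // A real subscriber would poll in a loop; a single poll keeps the sketch short.
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(10));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("key=%s value=%s%n", record.key(), record.value());
            }
        }
    }
}
```

The producer never addresses the consumer directly, so additional consumers (a fraud model, an archive writer, a dashboard) can subscribe to the same topic later without touching the producing side, which is what makes the pattern useful for feeding many downstream systems from one stream.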

Steve Wilkes, the CTO and founder of Striim, says Kafka’s popularity is helping to push stream processing onto center stage. “Kafka is driving a lot of our market,” he says. “A good majority of our customers are utilizing Kafka in one way, shape, or form.”

The underlying trend driving investment in stream processing is that customers need access to the latest data, Wilkes says. “It’s the recognition that, no matter how you’re doing analytics — whether you’re doing them in streaming fashion or whether you’re doing them after the fact through some sort of Hadoop jobs or big data analytics — you need that up-to-date data,” he tells Datanami.

Striim this week unveiled a new release of its stream data processing solution, Striim version 3.8, that features better support for Kafka. This includes the capability to automatically scale Striim to more efficiently read from and write to Kafka as users scale up their real-time streaming architecture.

Many Kafka users are using the core Kafka product, along with the open source Kafka Connect software, to rapidly move data from its source to another destination, such as Hadoop or a data lake hosted on the cloud. Fewer shops are using the Kafka Streams API to write application logic on top of the message bus, a niche that third-party vendors are moving to fill.
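For contrast with the pipeline case, the sketch below shows the kind of application logic the Kafka Streams API lets you run directly on top of the message bus: a continuously updated count of events per key. As before, the broker address, application id, and topic names are hypothetical:

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class ClickCountsApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "click-counts");        // hypothetical application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // assumed broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Application logic lives in the stream itself: group events by key
        // (for example, a user id), keep a running count, and publish the totals.
        KStream<String, String> clicks = builder.stream("page-clicks");        // hypothetical input topic
        KTable<String, Long> countsByUser = clicks.groupByKey().count();
        countsByUser.toStream()
                    .to("clicks-per-user",                                     // hypothetical output topic
                        Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The stateful step (the running count) is the piece that simply moving data with Kafka Connect does not provide, which is the gap third-party vendors are moving to fill.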

According to a recent report from Confluent, the company behind open source Kafka and developer of the Confluent Platform, 81% of Kafka customers are using it to build data pipelines. Other common use cases include real-time monitoring, ETL, microservices, and building Internet of Things (IoT) products.

Keeping the data lake updated with fresh data is an increasingly difficult task – and one that stream processing is being asked to fill as a sort of modern ETL role. According to Syncsort’s recent 2018 Big Data Trends survey, 75% of respondents say that keeping their data lake updated with changing data sources is either “somewhat” or “very difficult.”

Another vendor that’s seeing the Kafka impact is Streamsets, a software vendor that bills itself as the “air traffic control” for data in motion. Streamsets’ initial product was a data collector that automated some of the nitty-gritty work involved in capturing and moving data, often atop the Kafka message queue. The vendor recently debuted a low-footprint data collector that works in CPU- and network-constrained environments, and a cloud-based console for managing the entire flow of a customer’s data.

Streamsets Vice President of Marketing Rick Bilodeau says Kafka is driving a lot of the company’s business. “We do a lot of work with customers for Kafka, for real-time event streaming,” he tells Datanami. “We see fairly broad Kafka adoption as a message queue, where people are using [Streamsets software] primarily to broker data in and out of the Kafka bus.”

Some of Streamsets’ customers have a million data pipelines running at the same time, which can lead to serious management challenges. “Companies will say, ‘We built a bunch of pipelines with Kafka, but now have a scalability problem. We can’t keep throwing people at it. It’s just taking us too long to put these things together,’” Bilodeau says. “So they use data collector to accelerate that process.”

Today, Streamsets sees lots of customers implementing real-time stream processing for Customer 360, cybersecurity, fraud detection, and industrial IoT use cases. Stream processing is still relatively new, but it’s beginning to grow in maturity rapidly, Bilodeau says.

“It’s not the first inning, for sure. It’s maybe the third inning,” he says. “On the Gartner Hype Cycle, it’s approaching early maturity. Every company seems to have something they want to do with streaming data.”

Striim’s Wilkes agrees. Fewer than half of enterprises are working with streaming data pipelines, he estimates, but it’s growing solidly. “Streaming data wasn’t even really being talked about a few years ago,” he says. “But it’s really starting to get up to speed. There is a steady progression.”

We’re still mostly in the pipeline-building phase, where identifying data sources and creating data integrations dominate real-time discussions, Wilkes says. That period will give way to more advanced use cases as people become comfortable with the technology.

“We’re seeing that a lot of customers are still at the point of obtaining streaming sources. They understand the need to get a real-time data infrastructure,” he says. “The integration piece always comes first. The next stage after you have access to the streaming data is starting to think about the analytics.”

Related Items:

Spark Streaming: What Is It and Who’s Using It?

How Kafka Redefined Data Processing for the Streaming Age

The post Fueled by Kafka, Stream Processing Poised for Growth appeared first on Datanami.

Read more here: www.datanami.com/feed/

Sengled Partners with Baidu to Introduce China’s First Voice-Activated Smart Lamp Speaker

By IoT – Internet of Things

Sengled is partnering with leading AI company Baidu to introduce the Sengled Smart Lamp Speaker, a new voice-enabled lighting concept. Both companies will showcase the Sengled Smart Lamp Speaker at the annual CES in Las Vegas, NV, January 9-12, 2018, with the official reveal commencing at Baidu World @ Las Vegas on January 8, 2018. The […]

The post Sengled Partners with Baidu to Introduce China’s First Voice-Activated Smart Lamp Speaker appeared first on IoT – Internet of Things.

Read more here: iot.do/feed

Major Breakthroughs in Smart Devices and IoT from Taiwan to be Unveiled at CES 2018

By IoT – Internet of Things

Kicking the new year off with a bang, the Taiwan External Trade Development Council (TAITRA), Taiwan’s foremost trade promotion organization, will provide a first look at brand new “smart” innovations from ITRI, AEON Motor, GEOSAT, Robotelf, and Taiwan Main Orthopedics at a January 8 press conference at CES 2018. The companies will showcase products poised […]

The post Major Breakthroughs in Smart Devices and IoT from Taiwan to be Unveiled at CES 2018 appeared first on IoT – Internet of Things.

Read more here: iot.do/feed