Inmarsat and Rolls-Royce sign Ship Energy Management agreement

By Sheetal Kumbhar

Inmarsat and Rolls-Royce have signed a Letter of Intent (LOI) to have the option to make the Rolls-Royce Energy Management system available via Inmarsat Maritime’s Fleet Xpress high-speed broadband service, to reduce energy consumption and support environmental compliance. With data collected from a multitude of ship control systems and equipment sensors, Energy Management 2.0 also […]

The post Inmarsat and Rolls-Royce sign Ship Energy Management agreement appeared first on IoT Now – How to run an IoT enabled business.

Read more here: www.m2mnow.biz/feed/

B2Cloud to Exhibit at @CloudExpo | @B2CloudSA #IoT #M2M #SmartCities

SYS-CON Events announced today that B2Cloud will exhibit at SYS-CON’s 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
B2Cloud specializes in IoT devices for preventive and predictive maintenance of any kind of equipment, retrieving data such as energy consumption, working time, temperature, humidity, and pressure.

Read more here: iot.sys-con.com/index.rss

A battle looms among $2.5bn LIDAR market suppliers and tech, says IHS Markit

By News Aggregator

By Sheetal Kumbhar

The current generation of automated functions — automatic emergency braking, adaptive cruise control and lane keep assistance — rely largely on cameras, radars and ultrasonic sensors. According to Akhilesh Kona, senior analyst for automotive electronics and semiconductors at IHS Markit, each of these devices is limited to different conditions, e.g. weather, object detection, distance, etc. […]

The post A battle looms among $2.5bn LIDAR market suppliers and tech, says IHS Markit appeared first on IoT Now – How to run an IoT enabled business.

Read more here: www.m2mnow.biz/feed/

The post A battle looms among $2.5bn LIDAR market suppliers and tech, says IHS Markit appeared on IPv6.net.

Read more here: IPv6 News Aggregator

Announcing @N3N_IoT to Exhibit at @ThingsExpo Silicon Valley | #IoT #M2M #Sensors

SYS-CON Events announced today that N3N will exhibit at SYS-CON’s @ThingsExpo, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
N3N’s solutions increase the effectiveness of operations and control centers, increase the value of IoT investments, and facilitate real-time operational decision making. N3N equips operations teams with a four-dimensional digital “big board” that consolidates real-time live video feeds alongside IoT sensor data and analytics insights on a single, holistic display, focusing attention on what matters, when it matters.

Read more here: iot.sys-con.com/index.rss

How to Check If You’re Exposed to Those Scary BlueBorne Bluetooth Flaws

By News Aggregator

By David Meyer

The security research firm Armis Labs has identified severe vulnerabilities in Bluetooth wireless technology that can allow attackers to take over devices, whether smartphones, PCs, or even Internet of Things devices such as smart TVs and watches.

The “BlueBorne” flaws would allow a virus to leap from device to device, regardless of the operating system being used.

They can even allow attackers to access so-called “air-gapped” computer networks that aren’t connected to the Internet, Armis warned Tuesday. Bluetooth-equipped devices do not need to be in discoverable mode, or paired with the attacker’s device, in order to be vulnerable.

“These silent attacks are invisible to traditional security controls and procedures. Companies don’t monitor these types of device-to-device connections in their environment, so they can’t see these attacks or stop them,” Armis CEO Yevgeny Dibrov said in a statement. “The research illustrates the types of threats facing us in this new connected age.”

So, are your Bluetooth-equipped devices vulnerable? Armis told many of the affected tech companies about the flaws well before informing the public, an approach known in the industry as responsible disclosure, so they have had a chance to push out patches.

Not everyone has, though.

According to Armis, Google put out an Android security update last month, and Microsoft planned a Windows update for Tuesday. The team working on security for the open-source Linux operating system was also targeting an update for Tuesday.

Apple fans will be delighted to hear that the current versions of its software are not vulnerable. That means anything more recent than iOS 9.3.5 or, for Apple TV users, version 7.2.2 of the software for that device. iOS 10 is definitely OK, Armis said.

Samsung fans will be less pleased to read this from Armis: “Contact on three separate occasions in April, May, and June. No response was received back from any outreach.”

Those using non-Google-branded Android devices will just have to hope that the manufacturers issue security updates to keep them safe. Google automatically updates its own devices, such as the Pixel, but when it comes to the wider Android ecosystem, all it can do is make updates available to manufacturers and hope they relay them to their customers’ phones and tablets.

Armis has released an Android app to help people check if they are vulnerable.

In short, install the latest updates for everything, and unless you’re sure that your devices have been updated with a fix, it might be a good idea to turn off Bluetooth for now.

Read more here: fortune.com/tech/feed/

The post How to Check If You’re Exposed to Those Scary BlueBorne Bluetooth Flaws appeared on IPv6.net.

Read more here: IPv6 News Aggregator

A Four-Phased Approach to Building an Optimal Data Warehouse

By Barry Devlin

If you’re planning to create a data warehouse, make sure it is cross-functional and provides a long-lived foundation for data provision and decision support. That means covering the entire enterprise and satisfying the needs of multiple projects and groups for several years. The foundation must provide consistent, reconciled, legally binding data to your business clients.

But that’s easier said than done, right?

Think of your project in these four steps: Design, Build, Implement and Maintain.

Designing your data warehouse

Let’s start at the design phase. When planning your design, the vision for your new data warehouse is best laid out over an enterprise data model (EDM), which consists of high-level entities including customers, products and orders.

In the past, EDMs were built from scratch, which worked for data modelers but not business users who were drawn into definitional debates rather than seeing the desired results. Today, many EDMs are customized from standard industry models, which are much faster and easier to follow. Sometimes an EDM already exists from a previous data warehouse or other enterprise-wide undertakings, such as master data management.

After establishing your EDM, your next major challenge is to define and refine the logical and physical structure of the data warehouse relational tables. This part will consider the limitations of the source systems, the challenges in combining data from multiple sources, and possible changes in business needs and source system structures over time.

A traditional design approach maps entities either to “loosely normalized” tables based on third normal form (3NF) or to a dimensional, star-schema model. Both, however, present challenges for modern data warehouse development.

Another approach uses the Data Vault Model (DVM), which is a hybrid of the 3NF and star-schema forms. First introduced by Dan Linstedt, the Data Vault is a detail-oriented, history-tracking, linked set of normalized tables designed to support multiple functional business areas.

The DVM consists of three specialized types of entities/tables: hubs, based on rarely changed business keys; links, which describe associations or transactions between business keys; and satellites, which hold all temporal and descriptive attributes of business keys and their associations. A new version introduced in 2013, Data Vault 2.0, consists of a data model, methodology, and systems architecture that together provide a design basis for data warehouses emphasizing core data quality, consistency, and agility to support enterprise-wide data provision requirements. In May 2017, data warehouse automation specialist WhereScape announced automation software to enable rapid and agile Data Vault 2.0 development, cutting delivery time of Data Vault-based analytics solutions by two-thirds.
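
To make the three table types concrete, here is a minimal Python sketch that emits illustrative DDL for a hub, a link, and a satellite. The table and column names (hub_customer, customer_no, and so on) are invented for illustration, not taken from any real model.

```python
def hub_ddl(name, business_key):
    """Hub: one row per rarely changing business key, plus load metadata."""
    return (f"CREATE TABLE hub_{name} (\n"
            f"    {name}_hkey CHAR(32) PRIMARY KEY,  -- hash of the business key\n"
            f"    {business_key} VARCHAR(64) NOT NULL,\n"
            f"    load_dts TIMESTAMP NOT NULL,\n"
            f"    record_source VARCHAR(64) NOT NULL);")

def link_ddl(name, hub_names):
    """Link: an association or transaction between two or more business keys."""
    fks = ",\n    ".join(f"{h}_hkey CHAR(32) NOT NULL REFERENCES hub_{h}"
                         for h in hub_names)
    return (f"CREATE TABLE link_{name} (\n"
            f"    {name}_hkey CHAR(32) PRIMARY KEY,\n"
            f"    {fks},\n"
            f"    load_dts TIMESTAMP NOT NULL);")

def satellite_ddl(parent, attributes):
    """Satellite: temporal, descriptive attributes of a hub or link."""
    cols = ",\n    ".join(f"{col} {sql_type}" for col, sql_type in attributes.items())
    return (f"CREATE TABLE sat_{parent} (\n"
            f"    {parent}_hkey CHAR(32) NOT NULL,\n"
            f"    load_dts TIMESTAMP NOT NULL,  -- history is kept per load\n"
            f"    {cols},\n"
            f"    PRIMARY KEY ({parent}_hkey, load_dts));")

print(hub_ddl("customer", "customer_no"))
print(link_ddl("order", ["customer", "product"]))
print(satellite_ddl("customer", {"name": "VARCHAR(100)", "email": "VARCHAR(100)"}))
```

Separating keys (hubs), relationships (links), and changing attributes (satellites) this way is what lets new sources or attributes be added without restructuring the existing tables.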

Get Busy Building

Once you set your design, now comes the hard work of building your data warehouse. But before you start, accept the fact that no matter how nicely you’ve designed your model, you will face the reality of imperfect data source systems.

Data warehouse builders struggle with missing data in source systems, poorly defined data structures, incorrect content and missing relationships. Implementation is a delicate balancing act between the vision of the model and the constraints of the sources.

The building process comes down to five steps:

  1. Understand the data sources. Keep in mind that legacy systems might be “bent to fit” emerging and urgent requirements. And modern big data sources might lack documentation.
  2. Compare the data available to the data warehouse model and define appropriate transformations to convert the former to the latter.
  3. Where transformations are too difficult, modify the data warehouse model to accommodate the reality of the data sources. Changing the data sources is usually impossible for reasons of cost and politics.
  4. Test performance of load/update processes and check the ability of the modified model to deliver the data the business requires.
  5. If successful, declare victory. Otherwise, rinse and repeat.
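
As a toy illustration of steps 1 to 3, the sketch below compares the columns the model wants against what the sources can actually supply after transformation, and relaxes the model where the gap cannot be bridged. All column names are hypothetical.

```python
def reconcile(source_columns, model_columns, rename_map):
    """Steps 1-3 in miniature: see what the sources offer, apply the
    planned transformations (here just renames), and relax the model
    where a required column simply has no source."""
    available = {rename_map.get(c, c) for c in source_columns}  # step 2
    missing = set(model_columns) - available                    # compare
    return sorted(set(model_columns) - missing)                 # step 3

cols = reconcile(
    source_columns={"cust_no", "cust_nm", "order_amt"},
    model_columns={"customer_no", "customer_name", "order_amount",
                   "loyalty_tier"},  # loyalty_tier has no source at all
    rename_map={"cust_no": "customer_no", "cust_nm": "customer_name",
                "order_amt": "order_amount"},
)
print(cols)  # loyalty_tier is dropped from the model, not from the sources
```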

Traditionally, the output of the above process would be encoded in a script or program and run overnight in batch to populate the warehouse. Any changes in requirements or the source systems would require a round trip back through steps one to five, followed by code update. The approach is manual, time-consuming, and error-prone.

Improved approaches to automating the process have emerged in stages over the history of data warehousing: extract, transform, load (ETL) tools, data integration systems and, finally, data warehouse automation (DWA). In essence, each stage on this journey depicts an increasing level of automation, using DWA to address the entire process of designing, building, operating and maintaining a data warehouse. Companies such as WhereScape offer useful tools to automate the data source discovery, design and prototyping phases of projects. Additionally, advanced automation solutions with an integrated development environment (IDE) targeted to your data platform can eliminate the majority of traditional hand-coding required and dramatically streamline and accelerate the development, deployment, and operation of data infrastructure projects.

In the transition from design to build, the combination of a well-structured data model and DWA offers a more powerful approach. This is because the data model provides an integrated starting set of metadata that describes the target tables in both business terms and technical implementation. This is particularly true with the Data Vault model, which has been designed and optimized from the start for data warehousing.

A DWA tool automates the transformation of the data structures of the various sources to the optimized model of the Data Vault and populates the target tables with the appropriate data. This approach creates necessary indexes and cleanses and combines source data to create the basis for the analysis to address the business need.
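
The heart of such a tool can be pictured as metadata-driven code generation: given a mapping of target columns to source expressions, it emits the load statement. This simplified Python sketch uses invented table and column names and is not modeled on any particular product.

```python
def generate_load_sql(target, mapping, source):
    """The core move of a DWA tool: turn a metadata mapping of
    target column -> source expression into a load statement."""
    cols = ", ".join(mapping)            # target column names
    exprs = ", ".join(mapping.values())  # source-side expressions
    return (f"INSERT INTO {target} ({cols})\n"
            f"SELECT {exprs}\n"
            f"FROM {source};")

sql = generate_load_sql(
    target="hub_customer",
    mapping={"customer_hkey": "MD5(cust_no)",     # hashed business key
             "customer_no": "cust_no",
             "load_dts": "CURRENT_TIMESTAMP",
             "record_source": "'crm'"},
    source="staging.crm_customers",
)
print(sql)
```

Because the mapping is data rather than hand-written code, regenerating the load logic after a model or source change is cheap, which is where the automation gains come from.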

Shifting to Operations

Having designed and built your data warehouse, the next step is to deliver it successfully to the business and run it smoothly on a daily basis.

Historically, however, approaches to operating a data warehouse environment have been somewhat lax. Manual and semi-automated methods for populating and updating the contents of the warehouse have been widespread. Advances made in the data warehouse itself are being offset by the spread of ad hoc approaches in departmentally managed data marts. The data lake has seen the re-emergence of a cottage industry of handcrafted scripts, often operated by individual data scientists.

The data warehouse has become the repository of truth and history that businesses use to analyze challenges and opportunities, and it benefits from advanced management and automation practices. The combination of DWA and a Data Vault addresses these needs from two perspectives: function deployment and ongoing day-to-day operations.

Deployment seldom gets the attention it deserves. A data warehouse is substantially more complex than most IT projects, but must be deployed correctly as we move toward data-driven business and agile development.

As you move from a development phase to test, quality assurance, and on to production, you must address mundane issues such as packaging and installation of the code built in the previous phase. The clear aim is to automate and speed deployment in an agile environment to reduce human error across the full lifecycle.

Having deployed the system to production, the next, and ongoing, task is to schedule, execute, and monitor the continuing process of loading and transforming data into the data warehouse. In this phase, jobs consist of a sequence of interdependent tasks. To ensure that data consistency is maintained, if a task fails during execution, all downstream dependent tasks are halted. Once the problem has been resolved, the job is restarted, picks up from where it left off, and runs through to completion. From an operational point of view, given the potential interdependencies of data across these systems, it makes sense to manage the ensemble as a single, logical environment.
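
This halt-and-resume behavior can be sketched in a few lines of Python; the task names and the `run` callable are placeholders for real load jobs, not part of any scheduler's actual API.

```python
def run_job(tasks, deps, completed, run):
    """Run `tasks` (listed in dependency order).  Tasks already in
    `completed` are skipped, so a restarted job picks up where it left
    off; a failure halts every task downstream of it."""
    halted = set()
    for name in tasks:
        if name in completed:
            continue                                  # done on a prior run
        if any(d not in completed for d in deps.get(name, ())):
            halted.add(name)                          # upstream failed or halted
            continue
        (completed if run(name) else halted).add(name)
    return completed, halted

deps = {"transform": ["extract"], "load": ["transform"]}
# First run: 'transform' fails, so 'load' never starts.
done, halted = run_job(["extract", "transform", "load"], deps,
                       completed=set(), run=lambda t: t != "transform")
# After the fix, restart the same job: 'extract' is skipped, the rest run.
done, _ = run_job(["extract", "transform", "load"], deps,
                  completed=done, run=lambda t: True)
```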

The smooth, ongoing daily operation of the entire data warehouse environment is a fundamental prerequisite to its acceptance by users and its overall value to the business.

Maintaining with Agility

In more traditional IT projects, when a successful system is tested, deployed and running daily, its developers can sit back and take a well-deserved rest. Developers of today’s data warehouses have no such luxury.

Instead, you can count on working on a regular basis to deliver new and updated data to your businesses. To make life easier, leverage and apply agility whenever possible.

Vendors and practitioners have long recognized the importance of agility to deliver new and updated data through the data warehouse. Unfortunately, such agility has proven difficult to achieve.

Now, ongoing digitalization of business is driving ever-higher demands for new and fresh data. Some people think a data lake filled with every conceivable sort of raw, loosely managed data will address these needs. That approach may work for non-critical, externally sourced social media and Internet of Things data. However, it doesn’t help with historical and real-time data.

Fortunately, the agile and automated characteristics of the Data Vault / DWA approach apply also to the maintenance phase. In fact, it may be argued that these characteristics are even more important in this phase.

One explicit design point of the Data Vault data model is agility. The engineered components and methodology of the Data Vault are particularly well suited to the application of DWA tools.

At this point, widespread automation is essential for agility because it increases developer productivity, reduces cycle times, and eliminates many types of coding errors. Look for a solution that incorporates key elements of the Data Vault approach within the structures, templates, and methodology to make the most of the potential automation gains.

Another key factor in ensuring agility in the maintenance phase is the ongoing and committed involvement of business people. An automated, template approach to the entire design, build and deployment process allows business users to be involved continuously and intimately during every stage of development and maintenance of the data warehouse and marts.

With maintenance, we come to the end of our journey through the land of automating warehouses, marts, lakes, and vaults of data. At each step of the way, combining the use of the Data Vault approach with DWA tools simplifies technical procedures and eases the business path to data-driven decision-making.

About the author: Dr. Barry Devlin is among the foremost authorities on business insight and one of the founders of data warehousing, having published the first architectural paper on the topic in 1988. Barry is founder and principal of 9sight Consulting. A regular blogger, writer and commentator on information and its use, Barry is based in Cape Town, South Africa and operates worldwide.

Related Items:

Data Warehouse Market Ripe for Disruption, Gartner Says

What’s Challenging in Big Data Now: Integration and Privacy

Why Hadoop Won’t Replace Your Data Warehouse

The post A Four-Phased Approach to Building an Optimal Data Warehouse appeared first on Datanami.

Read more here: www.datanami.com/feed/