Author Dominique Guinard

Product codes are set to change, and we’re calling on brands to have their say.

GS1, the global standards organisation, is launching a working group on a new standard called the ‘GS1 URI’.  In their call to action, GS1 describe the twin objectives:

“The short-term goal of this work is to reduce the need for multiple codes on packs, while ensuring that we develop a glide path with industry toward a future where a single 2D barcode could serve the needs of all parties.”

What this means and why it matters

Today, products are usually identified by a 1D barcode encoding a GTIN (Global Trade Item Number). 1D GTIN barcodes work well within the supply chain, but they generally can’t be used by consumer apps or smartphones.

So if a consumer wants to scan their purchase to find out about the product’s provenance, manufacturers typically have to print an additional code on the product (usually a QR code) containing the address of a web page where product or marketing information is available. Under the proposed GS1 URI standard, manufacturers would no longer need that additional code.
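For a sense of what this could look like in practice, such a web URI would carry the product’s GTIN in its path, so a smartphone browser resolves it as a normal link while supply chain software extracts the identifier. Here’s a minimal sketch in Python; the domain and path layout are invented for illustration, since the exact syntax is precisely what the working group will define:

```python
from urllib.parse import urlparse

def gtin_from_uri(uri):
    """Extract a GTIN from a hypothetical GS1-style web URI.

    Assumes a path of the form /gtin/<digits>; the real standard's
    syntax is still being defined by the working group.
    """
    segments = urlparse(uri).path.strip("/").split("/")
    if len(segments) >= 2 and segments[0] == "gtin" and segments[1].isdigit():
        return segments[1]
    return None  # not a product URI of the assumed shape

print(gtin_from_uri("https://id.example.com/gtin/09506000134352"))
```

The same string works as a barcode payload for supply chain scanners and as a clickable link for consumers, which is the whole point of a single code.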

This may seem a small change, but it’s actually a major milestone in product identification.

Interested?  Here’s how you can get involved

To make this a reality, we hope our customers and other manufacturers will get involved with the working group, either as active contributors or in a more passive overseeing role, to make sure the initiative gets input from the voices that matter. The working group is open to GS1 member organisations, who can follow a simple three-step process to register and ideally join the kick-off call on Thursday, 18 January, or one of the following weekly calls.

All products BornDigital

At EVRYTHNG we’ve long advocated the use of QR codes containing URLs (Web identities) as a way to drive digital experiences directly from products, and it has worked very well for a number of our customers. So we’re excited by GS1’s move, which is a major step in this direction: the working group will look into standardizing product URLs, leading to a world where every single product embeds one universal code, usable in the supply chain and beyond, to create a unique and dynamic link between consumers and products.

We have already joined this group and will play an active role in helping to shape this milestone. We’d love you to get involved too!

Authors Joël Vogt (Research Engineer) & Dom Guinard (Co-founder and CTO)


What is machine learning?

Machine learning is to data scientists what mining automation is to gold diggers. That is to say, it is now profitable to extract gold from large piles of rubble and sand that were previously considered too expensive, or even impossible, to manually process.

Machine learning is ushering in a paradigm shift in how software is designed and developed. In traditional software engineering, coders write out, step by step, how a machine should transform data. These instructions are written in a programming language, which is then translated into a program the machine can execute. With machine learning, a software engineer’s job is instead to describe the problem as a ‘machine learning model’ and let a machine learning algorithm automatically discover how best to tune that model, based on training data. Take as an analogy the work of a business consultant. She can either define a business process model top-down, based on industry best practices, regulations and personal experience, or she can observe informal processes within an organisation, conduct interviews, and then summarise her findings as one or more business process models based on what she has learned.
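A toy example makes the contrast concrete. Below, the ‘traditional’ rule is written by hand, while the ‘learned’ version tunes its single parameter (a weight threshold) from labelled examples; the data and the task are invented purely for illustration:

```python
# Traditional software engineering: the rule is written by hand.
def is_heavy_rule(weight_kg):
    return weight_kg > 10.0  # threshold chosen by the engineer

# Machine learning: the threshold (the model's only parameter) is
# tuned automatically from labelled training data.
def learn_threshold(examples):
    light = [w for w, label in examples if label == "light"]
    heavy = [w for w, label in examples if label == "heavy"]
    return (max(light) + min(heavy)) / 2  # midpoint between the classes

training_data = [(2, "light"), (5, "light"), (8, "light"),
                 (12, "heavy"), (15, "heavy"), (20, "heavy")]
threshold = learn_threshold(training_data)
print(13 > threshold)  # the learned model classifies a 13 kg item as heavy
```

Real models have millions of parameters instead of one, but the division of labour is the same: the engineer describes the model, the algorithm fits it to the data.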

Why now?

While machine learning has been a very active research field for a while, it has only recently been adopted by the wider industry, after game-changing success stories from Internet giants like Google, Microsoft and Facebook. But why are we only now seeing wide adoption of machine learning?

First, a caveat: for a machine learning algorithm to perform well, it needs a lot of data, which is now increasingly available thanks to the Internet. The more data you throw at it, the more accurately a machine learning algorithm can learn a representation. The second reason is affordable specialized processors, initially GPUs (Graphics Processing Units) developed for gaming, which can crunch through these huge datasets in reasonable time. Third, machine learning frameworks like Keras and TensorFlow are now available and greatly simplify the development, training and deployment of very powerful machine learning solutions.

How can it help the IoT and supply chains?

What can we do with machine learning that couldn’t be done before? Automated business workflows and rules? Well, if you already know the rules that govern your data, you don’t need machine learning; a rules engine such as EVRYTHNG’s powerful Reactor™ is a better fit! Data exploration? Check out our latest dashboard widgets and query tools!

Machine learning, and in particular deep learning, lets you extract insights from massive amounts of data when visualizations become too complex and writing rules becomes practically impossible because of the number of permutations.

The IoT is generating data at an unprecedented rate, and this is only the beginning. Machine learning frameworks will help distill information from vast data pools containing unstructured, semi-structured or well-structured data, and can be applied to use cases such as the following:

Gray market detection: Gray market detection can be framed as a classification task. By classifying products by their expected market, a machine learning algorithm can learn the typical context, route, etc. of products in each class. Products purchased somewhere other than their intended market are considered sold on the gray market.

Product authenticity: We can use machine learning to add further intelligence to scans of physical products (known as THNGS). This is done by training a machine learning model on the context of scans of authentic and counterfeit products. Once deployed, each scan results in a probability of authenticity.

Replenishment: The EVRYTHNG platform makes it easy to train a predictive machine learning model on appliance telematics data coming from a collection of similar appliances, for example coffee machines or washing machines. We can predict when to reorder coffee beans based on the vibration and duration of each coffee machine’s use. Because the accuracy of machine learning improves with more data, this collective “knowledge” yields an overall more reliable, personalised reordering service. Furthermore, our platform can act as a mediator between the coffee machine and the supplier. With a simple Reactor™ rule, we can reorder coffee from our favourite supplier when there are only two days of coffee bean supplies left, ensuring that we always have coffee at the office. You don’t need machine learning to predict what happens when software engineers are deprived of coffee!
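The interplay is easy to sketch: machine learning supplies the usage prediction, while a simple rule decides when to reorder. The function names and thresholds below are illustrative stand-ins, not the actual Reactor™ API:

```python
def days_of_supply_left(grams_left, predicted_daily_usage_g):
    """Combine current stock with the model's predicted daily usage."""
    return grams_left / predicted_daily_usage_g

def check_replenishment(grams_left, predicted_daily_usage_g, reorder_at_days=2):
    """The rule part: trigger a reorder when two days of supply remain."""
    if days_of_supply_left(grams_left, predicted_daily_usage_g) <= reorder_at_days:
        return "reorder"  # in production this would place the supplier order
    return "ok"

# The predicted usage would come from a model trained on telematics
# data; here it is a hard-coded stand-in.
print(check_replenishment(grams_left=150, predicted_daily_usage_g=80))
```

The point of the split is that the rule stays trivially auditable while the hard part, estimating usage from vibration and duration data, is delegated to the learned model.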

Preventive maintenance: This is similar to the replenishment use case. The differences are that we use the Reactor™ to dispatch a maintenance notification to a person, and that we leverage third-party data sources, such as the current weather report, to augment operational data from appliances. Read our appliance telematics blog to learn more about how we make this a reality.

Applying machine learning to detect gray markets

Let’s focus on one of these use cases specifically.  Customers often come to us with seemingly simple questions, such as: can you tell me if a product is being sold on the gray market or not?

The only way for brands to detect gray market problems is by having visibility over their supply chain: product digitization and item-level traceability are a must. This is an area in which EVRYTHNG specializes – our platform makes it easy to integrate different information systems and to write apps and analytics tools that gain insights from vast data pools. The EVRYTHNG Labs team has looked at ways to use deep learning to help our customers make even more of the vast amount of data they manage in our platform.

If every step in the supply chain is logged, detecting parallel imports should be straightforward: just look for products that were bought where they were not supposed to be sold. Unfortunately, this only works in theory, because it requires every step to be recorded and every party in the supply chain to use the same ‘vocabulary’, which is often far from the case. So in many cases the data is either wrong or missing, as is common across supply chains that are only partially instrumented. Could a person go through millions of records and figure out the correct destination of every product? No, but our newly developed machine learning feature can!

Figure 1: our parallel import neural network for gray market detection

This gray market problem is essentially a multiclass classification problem: each expected market is a class. We developed a 1D convolutional neural network; the picture above shows the model architecture. Next, we trained the model on complete records that included the ‘expected market’. Training the model means the machine learning algorithm had to correctly classify records by expected market, based on values such as the stock keeping unit (SKU), the reseller, or where the product was collected. After each iteration, the machine learning algorithm tunes the model parameters to improve accuracy.
The result was a trained neural network, built on Keras and TensorFlow, that accurately finds the actual destination of a product (in 94% of cases), helping to detect parallel imports.
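A model along these lines can be expressed in a few lines of Keras. The layer sizes, feature count and number of markets below are illustrative stand-ins, not our production architecture:

```python
from tensorflow.keras import layers, models

NUM_FEATURES = 16  # encoded record fields: SKU, reseller, collection point, ...
NUM_MARKETS = 8    # one output class per expected market

# A small 1D convolutional network for multiclass classification.
model = models.Sequential([
    layers.Input(shape=(NUM_FEATURES, 1)),  # each record as a 1D sequence
    layers.Conv1D(32, kernel_size=3, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_MARKETS, activation="softmax"),  # probability per market
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(X_train, y_train, ...) would then tune the weights iteratively,
# which is the training loop described above.
```

The softmax output is what makes the network useful beyond a hard yes/no: each prediction comes with a probability per market, so low-confidence records can be routed to a human for review.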

Naturally, it is essential to keep a human in the loop. We do this, for example, by predicting the value for every new record, whether or not the actual value is missing. This way users can see how well the model is performing, which helps improve the model over time.

Getting started with machine learning for your products

After over a year of research in this space, we are now introducing machine learning features into our platform. Rather than offering raw machine learning capabilities, our productization approach is to ‘pre-package’ trained networks that can be used to achieve specific goals. The idea is that our customers can activate these trained networks within a few clicks and start learning from their incoming data: for instance, as explained above, to detect gray market activity or verify product authenticity, or to enhance automatic replenishment and preventive maintenance.

We’ll announce soon when our new machine learning capabilities are generally available, but in the meantime, if you’d like to trial these exciting features please contact us.

Author Niall Murphy

As we embark on 2018, I wanted to stop for a moment and take a look at some key disruptive technology trends emerging over the last 12 months. Here are 7 of the most significant which are certain to shape the year ahead.

2017 was a turning point for mass-scale consumer engagement via product packaging.  This came about from 2 key developments:

  • Apple enabled QR codes and NFC support in iOS 11.  Alongside QR code support in Chrome, now up to two billion devices can natively scan and interact with products, taking a huge amount of friction out of the user journey and removing one of the biggest inhibitors for brands to digitally-activate their products.
  • Consumer behaviors have now changed dramatically thanks to the likes of WeChat, Facebook, Spotify and Snapchat all promoting ‘social codes’ and physical-to-digital engagement. 8 million snapcodes are scanned each day and over 800 million WeChat users scan QR codes in their daily lives to pay for goods, get information and interact digitally.

Blockchain was widely promoted as the ‘solve-all’ solution for provenance, trust and transparency, with startups tapping into the current zeitgeist of consumer desire for transparency and accountability.

But the maturity and scalability of Blockchain technology remain in question: in one year, Bitcoin mining and transaction processing consumed as much power as Ireland. Many will put this down simply to ‘bumps in the road’.

EVRYTHNG joined the Blockchain Research Institute, and published a white paper detailing how Blockchain works effectively in the Internet of Things. We’ve also developed a PoC which we’ll be rolling out in 2018.

Away from Blockchain, we saw drives towards transparency everywhere.  The Sustainable Apparel Coalition (SAC) collaborated with EVRYTHNG to make provenance and sustainability information on clothing and footwear items directly accessible to consumers through the products themselves. And working with the Grocery Manufacturers Association (GMA), EVRYTHNG and SmartLabel now provide detailed product information for food and beverage products in the US. In early 2018 we’ll be rolling out a recycling scheme with Unilever as part of a European project whereby consumers can recycle product packaging and earn rewards.

In 2017 Amazon’s vice-like grip squeezed retail and big brands even tighter. Over 5,000 stores closed across the US through the year, and as Amazon gains more and more data on consumers’ buying habits, brands have in turn focused on building their own Direct-to-Consumer capabilities to compete.

Physical products have emerged as the crucial asset that manufacturers can control and leverage as the ‘brand in the hand’ – the trigger for an online transaction or interaction, and a direct relationship between the brand and its customer or consumer.

Nike, as an example, is forecasting their DTC business to grow from the $6.6 billion generated in 2015 to $16 billion by 2020. Under Armour are applying similar focus.

Rebecca Minkoff worked with EVRYTHNG to launch new ranges of digitally-enabled accessory products that connect directly with consumers to offer new experiences and content. This provides CRM data, new sales and brand building through their products.

Other brands will follow in 2018 as their focus narrows onto gathering direct first-party data on consumers to furnish these vital CRM initiatives. They will increasingly redirect marketing spend from digital media to their DTC initiatives to gather this data and close the measurability loop between marketing dollars spent and consumer relationships and sales acquired.

End-to-end visibility is the number one supply chain challenge facing global consumer product brands, hindered by the fragmented nature of supply chains, siloed information, and data locked up in legacy systems. This was the takeaway from a survey we conducted in 2017, and a glance at industry data paints a worrying picture of the impact.

But the IoT offers a solution. Products can now be followed as they move through the supply chain. With a digital identity in the cloud, product data can be easily aggregated from many different role players – from manufacturing, to distribution, to retail, to consumer – providing a full picture of the product journey and analytical insight. EVRYTHNG has worked with several global consumer brands during 2017, implementing traceability on tens of millions of products. We’re also now combining machine learning with supply chain product tracking data to identify counterfeit and integrity issues and we expect this to grow substantially in 2018.

2017 also saw the maturing of Low-Power WAN technology – making it possible for battery-powered sensor devices on pallets or cases of products to provide real-time tracking of location and state. EVRYTHNG teamed up with Sigfox, The Things Network and Things Connected to provide standard integrations for how any device could report data automatically to the cloud. This technology makes it possible to track inventory in real-time, redirect shipments as they’re on the road, and monitor quality of product in transit.

EVRYTHNG pioneered the concept of every product in the world having a digital identity on the Web – what we call ‘Active Digital Identities’. Last year ‘Digital Twins’ entered mainstream vocabulary, with Gartner defining it as: “a dynamic software model of a physical thing or system.” As we enter 2018, EVRYTHNG is managing close to a billion Active Digital Identities, each one representing a physical product in the real world, accessed via standards-based Web APIs. The digital identity connects a product to the Web, which is the world’s application platform and makes it possible for products to participate seamlessly in the digital application ecosystem.

Another important step concerned product coding.  As the global industry product standards organisation, GS1 serves two million companies globally. This is how every point of sale system in the world is able to understand a barcode.  In 2017 GS1 and EVRYTHNG established an important partnership to give an Active Digital Identity to the physical GS1 identifiers on packaging (think barcodes, QR codes or RFID tags).

This is a crucial step for the large scale digitisation of consumer products. The combination of GS1 standards-based coding with EVRYTHNG digital identity means that manufacturers can use one tag or code on their product packaging to drive a multiplicity of applications, including point-of-sale, consumer engagement, authentication and supply chain management. 2018 will see this model rolled out around the world, with packaging service provider, point-of-sale and application services integration.

Connected clothing took off last year. Levi’s experimented with their smart jackets. Nike went a step further with the launch of their connected NBA jerseys.  And as mentioned, Rebecca Minkoff brought out their new Fall range of smart handbags to give consumers style tips, recommendations and location-specific offers.

While these examples focus on consumer engagement, in 2018 we’ll see applications expanding to include brand protection, self-checkout and real-time inventory management. EVRYTHNG’s strategic partnership with Avery Dennison, bringing digital identity and digital application capability to billions of apparel and footwear products is a crucial foundation to these developments.

The World Economic Forum estimates that the Digital Economy will reach $100 trillion by 2025. More specifically, the global IoT market is projected to grow from $2.99T in 2014 to $8.9T in 2020 (Statista) and Bain predict that the B2B IoT sector will generate annually more than $300B. These are huge numbers, demonstrating growing momentum and uptake.

In 2017, we witnessed a convergence and integration of four major technology streams: AI & Machine Learning, IoT, Big Data and Blockchain. The boundaries between them will continue to blur as they interplay with each other.

EVRYTHNG is generating big data from close to one billion individual product items, applying machine learning to drive predictive applications and intelligent responses to supply chain and product usage events, and working with Blockchain technology to authenticate product transaction and provenance data. These technology areas are likely to dominate enterprise investment in 2018, central to the digital transformation strategies that global product manufacturers and brand owners are pursuing.

Product digitization will be at the core of those transformation strategies as brands use their physical products to connect directly with their customers, transition to business models with services linked to their products, and apply real-time intelligence to operate their businesses with more efficiency.

We’re excited about what’s to come – it’s set to be quite a year ahead. Wishing you all a happy and prosperous 2018!


If you want to talk about how product digitization could transform your business, we’d love to hear from you!  Email us at

Authors Curt Schacker (SVP, Connected Devices) & Dominique Guinard (Co-founder and CTO)

Remember all of the hype about Internet-connected appliances, including refrigerators that would order milk and washing machines that could be controlled and monitored from work?  For all of that excitement, it might surprise you to know that in 2014, of the 600M white goods appliances sold globally, less than 0.2% came with Internet connectivity.*

While the reasons for this are important (and were contemplated in a previous post), a highly unfortunate byproduct of the slow rate of IoT adoption by the appliance industry has been the lost opportunity to extract business value from the data these machines would otherwise be gathering and reporting.

And, it’s not just manufacturers who are missing out on data-driven insights about their products and customers.  For example, consider the ability of:

  • Warranty providers to monitor operational performance, anticipate malfunctions, and deliver proactive services
  • Insurance companies to detect issues that might result in property damage
  • Consumer packaged goods brands to drive Direct to Consumer propositions
  • Utilities to understand energy and water consumption and make recommendations for improvement
  • Research agencies to better understand markets and consumers
  • And, of course, consumers themselves to receive more personalized, higher value services from all of these organizations

Well, we’re happy to report that EVRYTHNG Labs has developed a solution that promises to unlock this treasure trove of data from the billions of installed appliances around the world, thereby enabling these and many other business opportunities.  The solution does not rely on manufacturers to build IoT technology into their products; rather, it’s a low-cost accessory device, paired with EVRYTHNG’s powerful IoT platform and data analytics. We call it Appliance Telematics.

Let’s start with the device: a small, low-cost, battery-powered module that records and reports vibration, motion, sound, temperature, humidity, carbon monoxide, light and other environmental conditions. Importantly, it only needs to be attached physically – not electrically – to the appliance, and it is configured by the owner through a simple web page rather than a mobile application.

For connectivity, the device connects to the Internet over any wireless network – including WiFi, LPWAN (LoRa + Sigfox), and 5G as it rolls out – and reports sensor values to the EVRYTHNG platform, which timestamps and stores every data point. From there, two key platform capabilities are applied.
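To give a flavour of the data involved, here is a sketch of how a single batch of sensor samples might be packaged by the device before being sent; the field names are illustrative, not the platform’s actual schema:

```python
import json
import time

def sensor_reading(sensor, value, unit):
    """Package one sensor sample the way a telematics device might
    report it; field names are illustrative, not the real schema."""
    return {
        "sensor": sensor,
        "value": value,
        "unit": unit,
        "timestamp": int(time.time() * 1000),  # platform also timestamps on ingest
    }

# One batch of samples, sent over whichever network the device is on.
payload = [sensor_reading("temperature", 4.2, "celsius"),
           sensor_reading("humidity", 61, "percent")]
print(json.dumps(payload, indent=2))
```

Batching readings like this keeps radio usage, and therefore battery drain, low on LPWAN links where every transmission counts.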

  • All of the data is written to the EVRYTHNG streaming analytics and machine learning engine, where it is used to derive important insights.  For example, let’s say that a clothes dryer is taking an increasing amount of time to complete the drying cycle.  This could indicate a potential failure mode for which a warranty provider might issue a proactive service call.  If the problem manifests across many products, it could also indicate a design or manufacturing flaw to the appliance maker.  The retailer who sold the appliance might want to get involved from the perspective of customer satisfaction, and the local utility would like to understand the impact on energy consumption.
  • The platform may be programmed to take actions on events to provide more service value to consumers.  Examples include notification that a refrigerator or freezer door was left open; a leaking dishwasher; a stove/oven that was left on; or an out-of-balance clothes washer. Rather than being perceived as a nuisance like many of the notifications we get every day, messages of this nature would likely be highly appreciated.  We also envision data sharing partnerships in which, for example, CPG companies could offer Direct-to-Consumer subscriptions for commonly-used replenishables such as detergents and filters.
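As a sketch of the kind of event logic in the second bullet, here is a simplified ‘door left open’ check over a stream of door events; in the platform this logic would run on incoming appliance data rather than an in-memory list, and the thresholds are invented for illustration:

```python
def door_open_too_long(events, now_s, threshold_s=120):
    """Return True if the door has been open longer than the threshold.

    `events` is a time-ordered list of (timestamp_s, state) pairs,
    where state is "open" or "closed". A simplified stand-in for a
    platform-side notification rule.
    """
    last_open = None
    for ts, state in events:
        last_open = ts if state == "open" else None  # closing resets the timer
    return last_open is not None and now_s - last_open >= threshold_s

events = [(0, "open"), (30, "closed"), (60, "open")]
print(door_open_too_long(events, now_s=200))  # open since t=60, 140 s ago
```

A rule this simple needs no machine learning at all; the value comes from having the appliance events in the cloud in the first place, where such checks are a few lines of logic.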

At EVRYTHNG, our mission is to see every thing connected to the Internet and Web.  But we recognize that in many cases there are significant challenges imposed by real-world business and technical constraints.  That’s why we are super-excited about this patent pending innovation, which offers a means of accessing and leveraging a valuable, but largely untapped, resource in a way that:

  • Is universally applicable to any appliance over any network
  • Can be set up in under two minutes without a mobile app
  • Can be retrofitted to the global appliance install base
  • Is easily configured and programmed to drive numerous business applications

But mostly we are excited to help companies apply new, cutting-edge technologies and methodologies to support their digital transformation strategies, thereby creating value for themselves, their partners, and their customers.

If you’d like to learn more, or you’re attending CES in January and would like a demonstration, please contact us at

* source: IHS Marketing

Author Matt Shorts (VP Product)

You’ve made it through to the final post in our blog series on GS1 identifiers and the EVRYTHNG platform… congratulations! (For a recap of what we’ve covered so far, take a look at GS1 in a nutshell, Why web-enable GS1 Identifiers, and EVRYTHNG and SmartSearch.)

In this post we’ll look at the “share” standards developed by GS1. Specifically, we’re going to look at the Global Data Synchronisation Network (GDSN) and Electronic Product Code Information Services (EPCIS). EVRYTHNG enables brands to leverage these standards, realizing the vision of standards interoperability and information exchange. Let’s dive right in.


GS1’s GDSN helps brands manage and share their Master Data on a global basis. This makes it easy to publish and update data about their products. GDSN does this by using what are called Data Pools, which brands subscribe and publish information to on a regular basis. As of November 2017, the global registry has 36 Data Pools, 45.9k Trading Partner GLNs and 25.1M GTINs.   That’s a lot, so it’s important to be part of it if you use GS1 identifiers!

Figure 1:  how GDSN works. 

If a brand is not a member of the network, it can be a daunting proposition to join. Which data pool do they join? How do they get started standardizing data?

EVRYTHNG simplifies it all and helps the brand become GDSN-ready without having to work through each of these challenges or questions. The data structure standards are there through our implementation of the GS1 semantic model (remember blog post #3?). We’ve also standardized the communications approach to make things easy as well, using the Web of Things model. This means that a brand not already leveraging the GDSN can be up and running in no time.
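To illustrate what ‘GDSN-ready’ means in practice, here is a sketch of a product record keyed by its GTIN, with a minimal readiness check; the field names and values are illustrative, not the platform’s exact schema:

```python
# A sketch of a product record structured around GS1 identifiers;
# field names and values are illustrative.
product = {
    "name": "Organic Fair-Trade Coffee 500g",
    "brand": "ExampleBrand",
    "identifiers": {"gs1:gtin": "09506000134352"},
    "customFields": {"countryOfOrigin": "CO"},
}

def is_gdsn_ready(p):
    """Minimal readiness check: a well-formed GTIN is the key that
    lets a record be published to a data pool."""
    gtin = p.get("identifiers", {}).get("gs1:gtin", "")
    return gtin.isdigit() and len(gtin) == 14  # GTIN-14, zero-padded

print(is_gdsn_ready(product))
```

Validation like this at the edge is exactly what a standardized semantic model buys: data arrives at the data pool already in the agreed shape, rather than being cleaned up downstream.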

If a brand is an existing member, however, it still faces challenges. Connections to data pools come with management overhead, and changes to Master Data sets have downstream implications that need to be considered at every turn. That’s why we put in place our flexible, standards-based data model: to help brands manage these challenges. That said, joining and leveraging the GDSN is a minor concern compared to handling the ever-changing demands of your data consumers.

As an open, standards-based network, the GDSN is expected not to change as rapidly as a proprietary system. This ensures that all systems leveraging it continue to work no matter what new platform or technology stack is introduced.

However, consider what happens when the data model needs to evolve: standards can slow brands down in reacting to new industry needs or fast evolutionary changes. Technology will always evolve quicker than standards, so it is vitally important that a brand can future-proof itself for emerging data communication strategies. The same data model that EVRYTHNG provides helps address these challenges too. It adds a layer of standardization and validation to help conform to the standards, but remains flexible enough to augment the model and try new things. This lets brands simplify their strategy for current and future e-commerce integrations, marketing channel evolutions and the emergence of new platforms with new data demands, all while supporting a consistent and stable standard in the GDSN.


Ok, we’ve structured the data around a brand’s products and shared it with everybody interested; now we need to track and describe their movements. GS1 EPCIS to the rescue! GS1 EPCIS is a structure that gives a brand the What, When, Where and Why of its products, and can be implemented with both product-level data (GTINs) and serialized products (SGTINs).

Figure 2: how EPCIS works

Executing this on the EVRYTHNG platform is very straightforward. We have Places identified by GLNs, Products by GTINs, Thngs by SGTINs and Collections by SSCCs. To model their movement and the events associated with them, we simply create Actions on any of these resources. Action types are the means by which you identify the EPCIS event type (Object, Aggregation, Transaction, etc.). The flexible Action data model then handles the payload of that event; items such as the class- or instance-level objects, the location, time and reason (Business Step) are all described in the GS1 Core Business Vocabulary (CBV). The EVRYTHNG platform also extends the model to include additional information from the web client itself. EVRYTHNG’s Action model gives a business the flexibility to implement just the portions of EPCIS that add value to its organization.
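To make this concrete, here is a sketch of how an EPCIS Object Event could map onto an Action payload. The field names follow the spirit of EPCIS and the CBV, but are illustrative rather than the platform’s exact schema:

```python
# A sketch of an EPCIS Object Event expressed as an action payload;
# field names and identifiers are illustrative.
epcis_object_event = {
    "type": "_ObjectEvent",               # EPCIS event type as the action type
    "thng": "sgtin:0614141.107346.2017",  # What: a serialized item (SGTIN)
    "timestamp": "2017-11-01T09:30:00Z",  # When
    "location": {"gln": "0614141000005"}, # Where: a GLN-identified Place
    "customFields": {
        "bizStep": "cbv:shipping",        # Why: the business step, from the CBV
        "disposition": "cbv:in_transit",
    },
}

def describes_full_event(action):
    """Check the action carries the What, When, Where and Why."""
    return all([action.get("thng"), action.get("timestamp"),
                action.get("location"),
                action.get("customFields", {}).get("bizStep")])

print(describes_full_event(epcis_object_event))
```

Because each EPCIS dimension maps to a field the platform already understands, a brand can start recording Object Events alone and add Aggregation or Transaction events later without reshaping its data.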

Finally, regardless of whether a brand fully implements EPCIS semantics and structure, adopts only some portions of it, or creates a new model altogether, it’s important to understand the value of the Action data model. It was created to model real-world events digitally, so that all physical events are recorded. These events are immutable, providing a clear chain of visibility and audit that gives brands complete visibility across their supply and distribution networks. It also means you can use technologies like blockchain to record the events in a similar register. Check out our blockchain whitepaper here for additional details.

All good things must come to an end!

This was the last post in our series on GS1 Standards – you’ve probably seen enough acronyms to last a lifetime! We hope these blogs have given you a deeper understanding of how GS1 identifiers, and the standards surrounding them, can be used and extended on EVRYTHNG. Chances are you already have the identifiers and leverage them in many different processes throughout your enterprise, but just need to liberate them to increase their interoperability. That’s what we do!

Reach out to us at, and let us help you break down your legacy data silos.  And if you liked this blog series, please feel free to suggest other areas of interest you think we should explore further.