Master Data Management: The What, Why, Who and How

It is day three of driving a brand-new, shiny SUV around town when the letter carrier delivers an unexpected letter from an unfamiliar tire manufacturer. The letter explains the tires on which the oh-so-pretty SUV sits have been observed to unexpectedly explode when travelling at high speeds. After remembering going eighty-five down the highway in a rush to get to work yesterday, the following question arises – how do tire manufacturers determine who is driving on their tires? The answer – Master Data Management (MDM).

What is MDM?

MDM is a process-based model used by companies to consolidate and distribute important information, or master data. The idea is to have an accurate version of master data available for the entire organization to reference.

Master data is the agreed-upon core data set of a business. As opposed to reference or transactional data, which could be something as mundane as the number of invoices completed in a day, master data refers to data directly linked to the meat of a business.

Master data varies depending on the organization and industry, but typically includes detailed information about vendors, customers, products, and accounts. Master data is critical, because conducting business transactions without it is near impossible. Without first establishing product codes for a particular model of tire, the manufacturer would not be able to track which tires are sold to which customers.

Why is MDM Important?

In the example above, the only way the tire manufacturer would have the correct customer information for the owner of the new SUV is if the dealership gave it to them. And you can already see why it is important for the manufacturer to maintain a database of all its customers. Sure, they might inundate their customers with flyers and ads in the mail, but wouldn’t you appreciate the notification about potential tire explosions?

Managing master data is important, because business decisions are based on the story the company’s data is telling. Even the simplest of errors in master data will trickle down, causing magnified errors in other applications utilizing the flawed information.

Companies with non-existent or underdeveloped MDM processes often encounter finger-pointing and displaced blame as a result of data discrepancies. These discrepancies surface when, for example, month-end sales reports show conflicting numbers in the accounting and manufacturing systems, making it difficult to determine which system, if any, has the most accurate information.

Who is Responsible for MDM?

MDM is often mistaken for data quality projects or technology systems owned by IT. Although IT may be involved in the distribution of master data, there is not one sole owner of MDM. To be successful in maintaining the integrity of critical company data, there must be a company-wide effort and commitment to the ongoing maintenance of master data.

It is important to clearly define ownership of the components of MDM, including establishing data governance (standards around how data is used), creating an MDM strategy, and developing procedures for maintaining and distributing information to the people who need it.

A successful MDM program should include holding people accountable for maintaining master data and streamlining the sharing of critical data between departments.

How Do I Create an MDM Organization?

Improper maintenance of master data causes reporting inaccuracies and can lead to poor business decisions. The steps below should be followed to establish an MDM organization (a brief illustrative sketch follows the list):

  1. Define which data is master data—products, customers, vendors, etc.
  2. Determine primary data sources and consumers—for example, the customer master list resides in the CRM system, is owned by the credit department, and is used by the sales team.
  3. Designate ownership of each master data set—the AP Clerk is responsible for entering and updating vendor information.
  4. Develop data governance processes—all new product information requires review/approval from the management team prior to product entry into the accounting and manufacturing systems.
  5. Design necessary tools and workflows—technology can be implemented to help automate approvals and the flow of information.
  6. Deploy processes for maintaining master data—businesses can create templates and enforce procedures to capture requested master data updates.
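
As a minimal sketch of how steps 1 through 6 might look in practice, the hypothetical Python snippet below models a single master data record with a designated data set, source system, owner, and a simple approval workflow with an audit trail. The field names, statuses, and sample values are illustrative assumptions, not a prescribed MDM standard.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Illustrative governance states for a simple approval workflow (assumed, not a standard)
PENDING, APPROVED, REJECTED = "pending", "approved", "rejected"

@dataclass
class MasterDataRecord:
    """One entry in a master data set, e.g. a product, customer, or vendor."""
    data_set: str          # which master data set this belongs to (step 1), e.g. "vendor"
    source_system: str     # primary system of record (step 2), e.g. "ERP"
    owner: str             # accountable role (step 3), e.g. "AP Clerk"
    attributes: dict       # the actual master data fields
    status: str = PENDING  # governance state (step 4)
    history: list = field(default_factory=list)  # audit trail of updates (step 6)

    def approve(self, approver: str) -> None:
        """Record a governance approval before the record is distributed (step 4)."""
        self.status = APPROVED
        self.history.append((datetime.utcnow().isoformat(), approver, "approved"))

# Example: a new vendor record awaiting management review
vendor = MasterDataRecord(
    data_set="vendor",
    source_system="ERP",
    owner="AP Clerk",
    attributes={"vendor_id": "V-1001", "name": "Acme Tire Supply", "terms": "Net 30"},
)
vendor.approve(approver="Controller")
print(vendor.status, vendor.history)
```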

Organizations large and small are faced with data challenges. Occasional focus on cleaning up critical data is not enough. Creating a comprehensive MDM strategy is the starting point to having confidence in company data. Avoid the pitfalls of poorly maintained master data by establishing processes to manage the creation of new master data and enforcing everyday practices to maintain the data over time.

Trenegy is a management consulting firm equipping energy and manufacturing companies for growth and change.

The Power of Storytelling

Making change endure and building culture are two of the hardest things a leader has to accomplish. Fortunately, there is a powerful tool available for leaders to use: storytelling. In this article, we will explore the power of stories, why they work, and the elements that form a great story.

The power of stories

Appropriately, let’s begin with a story.

Jesus lived on this earth until approximately the year 30. But most scholars believe the books of the New Testament were not put down in writing until sometime between the years 50 and 100. Even then, the challenge of remembering the events of Jesus’ life did not end. The stories of the New Testament were primarily passed down orally, because copying the gospels by hand was banned until the emperor Constantine allowed it at the Council of Nicaea in the year 325. For almost 300 years, the accounts of the New Testament were passed on orally from person to person! Furthermore, from that point until the invention of the western printing press in 1439, the Bible was copied primarily by hand; thus, very few people owned a Bible. Therefore, for almost 1,400 years, the stories of the New Testament were learned primarily through storytelling. I would venture to say that most people in western civilizations can recite one or more stories from the Bible. That, dear reader, is some powerful storytelling.

Here’s a more recent example. Steve Epstein was a lawyer for the U.S. Department of Defense and in charge of the Standards of Conduct Office. When he conducted training, he found that reciting the rules alone was not working. The message did not seem to “stick”. So, he created the Encyclopedia of Ethical Failures in which he collected stories of compliance failures in chapters titled “Bribery”, “Abuse of Power” and the like. Here’s a real gem from the encyclopedia:

A military officer was reprimanded for faking his own death to end an affair. Worthy of a plot in a daytime soap opera, a Navy Commander began seeing a woman he had met on a dating website. The Commander neglected to tell the woman that he was married with kids. After six months, the Commander grew tired of the relationship and attempted to end it by sending a fictitious e-mail to his lover, informing her that he had been killed. The Commander then relocated to Connecticut to start a new assignment. Upon receipt of the e-mail, his mistress showed up at the Commander’s house to pay her respects, only to be informed by the new owners of the Commander’s reassignment and new location. The Commander received a punitive letter of reprimand and lost his submarine command.

This story, although better than just reciting policy statements, can still be improved. We will look more into that later in the article.

Why stories work

In 1944, Heider and Simmel conducted a study showing that our brains are wired for stories. They played a short movie of silent geometric shapes moving around the screen to two groups of students. One group was directed to describe the story they saw. The other group was given little direction prior to seeing the movie and was then asked to describe what they saw. Not surprisingly, there was very little difference between the two groups: both saw the story of an angry triangle and its confrontation with a friendly triangle and circle.

This experiment shows how important it is to our brains to put data into the context of a story. In fact, 65% of our conversations are stories, primarily in the form of gossip. Why is that?

When processing facts, only two areas of our brains are engaged. These areas do nothing more than decode the words we hear into their dictionary meaning. But when we listen to an effective story, many other parts of the brain are activated. This results in some interesting activity in our brains.

The first is neural coupling. Neural coupling allows the listener to turn words into virtual experiences. For example, smell words such as “lavender” and “cinnamon” activate the smell centers of the brain as if the listener had actually smelled them. Similarly, an active sentence such as “Bob kicked the ball” activates the portion of the listener’s motor area associated with leg motion.

The power of neural coupling allows an effect called mirroring to take place. Mirroring allows a storyteller to relate personal experiences directly to the listener. Since the motion and sensory areas of the brain are activated by action words and sensory descriptive words, a storyteller can almost recreate his or her reality in the listener’s brain by using those words. When a storyteller relates an effective story, the listeners’ brains are literally living those events.

Finally, the brain releases dopamine into the system when it experiences an emotional event – even through storytelling. Dopamine imprints a memory and makes the event easier to remember with more clarity. This is why saying “don’t go near that dog” is not as effective as saying “my friend was mauled by that slobbery, mean dog, so stay away.” The combination of neural coupling, mirroring and dopamine makes storytelling 22 times more effective in helping the listener retain information than data alone!

Furthermore, people are not rational beings by default. Our brain operates from two systems – sometimes called Fast Thinking (System 1) and Slow Thinking (System 2).

Fast Thinking is intuitive and uses less energy; therefore, it is the default system our brain uses. This is the system that allows you to talk while driving your car; to play the guitar without looking at the strings; or to just know that 2+2=4.

Slow Thinking is the rational and deliberate system. This system uses significantly more energy than Fast Thinking; therefore, it is not the one our brain chooses to use by default. Slow Thinking is the system which makes you think through a process, or figure out a problem. It is how the brain is operating when you’re trying to calculate 17 x 54 (which by the way is 918). Also, when you learn a new skill – think of driving a car – your brain is relying on Slow Thinking. The person learning a new skill has to think through every step, and when that person is done, they feel tired and drained. But as one is going through this learning, the brain is literally rewiring itself. New connections are made in the brain until the new skill can be performed without really thinking about it.

Again, our brains are designed to use Fast Thinking as the default. It is faster and more energy efficient. If our caveman ancestors had had to use Slow Thinking for everything, we would not be here today. They would have been eaten by saber-toothed tigers while they thought about how to react upon seeing one!

These peculiarities of how our brains function are why flight simulators work for training an aircraft pilot. Without first practicing in a flight simulator, a mistake by a first-time pilot while landing in an unfamiliar airport (or on a short landing strip, or with a strong crosswind) could result in death, fire, and mayhem. But with a flight simulator, the same pilot can try the landing many times – and fail many times – without anyone getting hurt. All the while, the pilot’s brain is rewiring to be able to perform the real landing using their Fast Thinking brain.

Stories work like flight simulators on our brains. Because a well-crafted story causes our brains to activate as if we were experiencing the events (through mirroring), stories take us into intense simulations of situations that we are able to experience in parallel to our real life. We can have that rich experience without getting hurt at the end.

Stories rewire our brains much in the same way that a flight simulator does. These new brain connections then allow us to react using our Fast Thinking brain when exposed to similar real-life situations as those that we were exposed to through stories.

Connecting through stories

In his book I and Thou, the Austrian-born philosopher Martin Buber states that humans are defined by two word pairs: I-Thou and I-It.

Our relationship with things, the I-It relationship, separates us from the object. Our relationship with other people, the I-Thou relationship, eliminates the boundaries between us. That is why even an extreme naturalist has no qualms about cutting down a tree for a fire to warm a cold family. The connection we have with our family, the I-Thou relationship, bonds us to them, while the I-It relationship with the tree separates us from it.

Storytelling allows us to see the subject of the story in an I-Thou relationship instead of an I-It relationship. When implementing change in an organization, simply putting lessons learned into policies and procedures keeps us separated from the results, and therefore does not create a sustainable culture. Understanding the “thou” behind a story builds a long-lasting connection.

In the medical field, many hospitals are now using this concept to avoid medical mistakes. Instead of referring to patients solely by their medical condition (believe it or not, medical staff used to refer to patients by names like “the broken leg in room 2”), the staff is encouraged to build the “story” behind the patient in their charts and conversations. This has been shown to help reduce the number of medical errors, since the staff is now connected to the patient through an I-Thou relationship instead of an I-It relationship.

Building a great story

The steps to building a great story can be remembered through the acronym “CAR”, which stands for Context, Action, and Results.

Context sets the background for the story. Where and when did it happen? Who is the main character? What does your character want to accomplish? Who is the villain, or what stands in your character’s way? The main character in your story needs to be someone the audience can connect to, and the villain (which doesn’t need to be a person, but could be a situation) needs to present a real challenge.

Action is the substance of your story. What does your character do? Action must include an obstacle, setback or failure.

Finally, Result is where you reveal your character’s fate. To be effective, you must subtly give the moral of the story.

To be fully effective, you must remember why stories work. Create an experience that induces mirroring by using action words and words that stimulate the senses, and by avoiding clichés. Engage the audience by illustrating a real struggle. Pit your character against real villains, whether they are people, equipment, procedures, a change initiative, or whatever. Finally, admit flaws. Rosy pictures are boring.

Next time you are in a movie theater, look around you and watch the people. If the movie is effectively telling a story, you will see everyone reacting as a single organism. They will all laugh at the same time, flinch together, gasp simultaneously… A great story builds a common thread across a group of people. The story does not even have to be told to a group that is together. Think about the connection you feel with other people who also saw the latest blockbuster movie. You do not have to be in the same movie theater to share the experience.

Just as in the days of the early Christians, stories continue to serve the function of building a community through common experiences. Common stories build a culture. They encourage a group of people to behave in a unified way. Effective change management and the building of a strong culture cannot be done without the transformative power of storytelling.

Don’t Avoid the Checkup – Embrace the New Lease Standard

How many times have we cancelled our dental checkup because we are too busy? Tooth pain reminds us to visit the dentist, and he says, “You should have come sooner….” The FASB issued a new lease standard, Leases (ASC 842), on February 25, 2016. Are you ready to move forward with implementing the new standard, or do you want to delay until the pain is felt?

The key provision of the new FASB lease standard is that lessees will recognize virtually all of their leases on the balance sheet by recording a right-of-use asset and a lease liability. This includes operating leases, which were previously recorded off balance sheet. The existing lease standard has been criticized for failing to meet the needs of users of financial statements, because it does not always provide a faithful presentation of leasing transactions. The new standard aims to provide greater transparency in financial reporting. Companies that lease assets such as real estate, manufacturing equipment, vehicles, airplanes, and similar assets will be impacted.
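
As a rough numerical illustration of that balance sheet impact, the sketch below computes the initial lease liability as the present value of the fixed lease payments and, for a simple case with no prepayments, incentives, or initial direct costs, sets the right-of-use asset to the same amount. The payment amount, term, and discount rate are assumed figures for illustration, not guidance from the standard.

```python
# Simplified initial measurement of a lease, for illustration only.
# Assumed inputs: 5 annual payments of $100,000 in arrears, 6% discount rate.
annual_payment = 100_000
rate = 0.06
years = 5

# Lease liability = present value of the remaining fixed lease payments
lease_liability = sum(annual_payment / (1 + rate) ** t for t in range(1, years + 1))

# With no prepayments, incentives, or initial direct costs, the right-of-use
# asset is initially measured at the same amount as the liability.
right_of_use_asset = lease_liability

print(f"Initial lease liability:    {lease_liability:,.0f}")
print(f"Initial right-of-use asset: {right_of_use_asset:,.0f}")
```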

Public company implementation dates for the new lease standard are fiscal years starting after December 15, 2018. Non-public companies must comply for fiscal years starting after December 15, 2019. Financial executives may look at the implementation dates and be inclined to focus on projects with more immediate due dates, addressing the new lease standard later. Instead, financial executives should devote some time now to analyzing the potential complexity of the implementation and its impact on company resources. The analysis can help a company decide when to move forward with the implementation process and avoid unnecessary financial reporting risks.

The AICPA recommends six steps to an effective implementation of the new lease standard:

  • Assigning an individual or a task force to take the lead on understanding and implementing the new standard;
  • Updating the list of all leases;
  • Deciding on a transition method;
  • Reviewing legal agreements and debt covenants;
  • Considering IT system needs;
  • Communicating with stakeholders.

At first glance, the six-step recommendation seems simple and manageable. Before we get too comfortable with its simplicity, let’s peel back the onion with a few questions. Do you have technical accounting staff available to spend quality time understanding the lease accounting guidance and determining how it impacts your company? Do you know about all of your lease contracts and where they are located? Is your company public, and how do you determine whether to transition with the retrospective method (which requires restatement of comparative periods in your financial statements) or the modified retrospective method (which does not)? Do you have debt covenants or other legal agreements limiting debt levels or requiring approval prior to incurring additional debt? Do you use an IT system to manage your lease records, or do you rely on Excel or a similar manual process? Have you discussed the impact of the new lease reporting standard with executive management, the board of directors, debt holders, or other stakeholders?

You may not have answers to all of the questions above, and as you move forward with the implementation of the lease standard, many more questions will arise. The implementation process will not be limited to the accounting staff. Moving to the new reporting standard will be a company-wide initiative requiring communication and cooperation among several departments, including treasury, legal, facilities management, purchasing, logistics, and fleet management, to name a few. A successful implementation requires developing a project plan with input from a wide range of functions and securing commitment from executive management.

Companies should look at the implementation as more than a compliance project and use the opportunity to create value for their company. Below are examples of opportunities for value that may be identified during the implementation process:

  • New avenues for improved communication among different departments within the company
  • Improvement of existing internal controls and processes, updates to related documentation, and communication of improvements and changes to affected parties
  • Selection of cost-efficient IT solutions to track leases and to meet the reporting requirements of the lease standard
  • Consolidation of lease vendors and negotiation of improved pricing
  • Termination or buyout of stale and unneeded operating leases

Beyond compliance, companies have the opportunity to identify additional ways to create value. Challenging the organization to remain vigilant in identifying value-creating opportunities in everyday tasks is critical for continuous improvement.

Trenegy helps companies to implement new accounting guidance and to identify opportunities for companies to create value through process, controls and system improvements.

The Accounting Rule You Need to Know Before Moving to the Cloud

There are a number of factors our clients consider when evaluating the purchase of cloud software. The main factors often include system performance, security, data access and, of course, cost, specifically which costs must be expensed and which can be capitalized.

Due to recent updates to the standards for intangible asset accounting, the rules for which costs can be capitalized and which must be expensed are no longer as clear-cut as they used to be. The presumption that a company can capitalize costs incurred in software implementation activities no longer holds true for every circumstance, or type of contract, when it comes to cloud software.

At the beginning of 2016, the Financial Accounting Standards Board (FASB) threw an Adam Wainwright-style curve ball to companies that are evaluating or have purchased cloud computing software. You can read the full update in Accounting Standards Codification (ASC) 350-40, Internal Use Software.

However, the update created somewhat of a gray area around whether a cloud computing agreement represents a purchase of software or a purchase of services.

The update states that, depending upon the specific language of a cloud computing contract, the purchase costs may be viewed in one of two ways:

  1. as the purchase of a software license
  2. as the purchase of a service

In order to be deemed a purchase of a software license, the cloud computing contract must explicitly denote that the customer is paying for the transfer of a license required to operate the software. Otherwise, the contract is viewed as a purchase of services.

If a contract is viewed as a purchase of services, then the costs must be accounted for like any other service contract, which means all costs must be expensed when the service is performed.  The only opportunity to capitalize these expenses on the balance sheet is to book the costs as a prepaid asset and amortize them as the prepaid (software) services are used.

Being forced to expense all costs associated with purchasing and implementing new software poses a significant hurdle to potential buyers of cloud computing software. If the contract is considered a purchase of services, then implementation costs related to the software, which can often reach seven figures, must also be expensed. The potential for taking an immediate hit to the income statement for such a large dollar amount is more than enough to give many companies pause when evaluating cloud software.

As such, many cloud software providers have also taken steps to simplify the process by moving from software service subscription fees to offering contracts based on software licensing fees. An arrangement which includes a software license is considered “internal use software” and accounted for as an intangible asset. Under the internal use software designation, the typical expense vs. capitalization rules apply and companies are allowed to capitalize and then amortize implementation costs accordingly.
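
To make the contrast concrete, the hypothetical sketch below compares the first-year income statement impact of the same implementation spend under the two readings of a contract: as a purchase of services (costs expensed as incurred) versus as a software license (costs capitalized and amortized). The dollar amount, straight-line method, and five-year amortization period are assumptions chosen purely for illustration.

```python
# Assumed figures for illustration only
implementation_cost = 1_200_000   # one-time implementation spend
amortization_years = 5            # assumed useful life if capitalized

# Viewed as a purchase of services: the full cost hits expense as incurred
service_year1_expense = implementation_cost

# Viewed as a software license (internal use software): capitalize,
# then amortize straight-line over the assumed useful life
license_year1_expense = implementation_cost / amortization_years
remaining_on_balance_sheet = implementation_cost - license_year1_expense

print(f"Service contract, year 1 expense: {service_year1_expense:,.0f}")
print(f"License contract, year 1 expense: {license_year1_expense:,.0f}")
print(f"License contract, asset remaining on the balance sheet: {remaining_on_balance_sheet:,.0f}")
```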

New Accounting Rules Cheat Sheet
With many cloud software vendors offering either a subscription-based or a license-based contract, it is important for prospective buyers to understand the impact on the software’s total cost of ownership. In some cases, a subscription or service-based contract may have a lower total cost of ownership. Some clients may choose the service contract to lower the total amount of cash going out the door, while other clients may choose to pay more for a license-based contract in order to absorb the costs on the P&L over time.

To avoid any surprises in accounting for cloud software costs, we advise our clients to obtain a clear understanding of the pricing model from every prospective cloud software vendor and to take a total cost of ownership approach when making any software decision.

Trenegy assists companies in selecting and implementing the right technology solutions. For additional information, please contact us at: info@trenegy.com.

Integrated Business Planning: Gimmick or the Go-To Method for Doing Business

Companies whose doors have been open for any longer than five minutes know teamwork is important. Each company has its own methods for maintaining communication and teamwork. Sales and Operations Planning, or S&OP, is a perfect example. S&OP is the process by which Sales and Operations work together to create one plan for a specific timeframe. Sales provides projected revenue, and Operations provides expected production. It is a game of supply and demand and predicting equilibrium. The end product is a forecast that aligns with executives’ strategic objectives and serves as a performance measurement tool.
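
As a minimal sketch of the supply-and-demand balancing at the heart of S&OP (using made-up numbers purely for illustration), the snippet below lines up a monthly sales forecast against planned production and flags the gap the combined plan would have to close.

```python
# Hypothetical monthly figures (units) for illustration
sales_forecast = {"Jan": 1200, "Feb": 1400, "Mar": 1600}
planned_production = {"Jan": 1300, "Feb": 1300, "Mar": 1300}

for month in sales_forecast:
    gap = planned_production[month] - sales_forecast[month]
    status = "surplus" if gap >= 0 else "shortfall"
    print(f"{month}: demand {sales_forecast[month]}, supply {planned_production[month]}, "
          f"{status} of {abs(gap)} units")
```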

The inventor of S&OP, Oliver Wight, tells us there is a new concept which improves upon even the most tightly run S&OP organization. The process is Integrated Business Planning, or IBP. According to many skeptics, IBP is simply a marketing gimmick to re-brand S&OP. However, Wight confirms Integrated Business Planning was not introduced to announce the invention of a new process, but rather to reveal the considerable changes to an existing one. The focus of S&OP has shifted toward gaining a better understanding of external factors and aligning all internal functions, not just Sales and Operations.

In the new world of IBP, Sales, Operations, Logistics, HR, Finance, Marketing, and Pricing are all working toward the same goals. Some examples of the ways IBP improves upon S&OP include:

  • Stronger financial integration
  • Improved product & portfolio review
  • Addition of strategic plans and initiatives
  • Improved pricing decision-making
  • Enhanced scenario planning and risk visibility
  • Improved trust within the leadership team

IBP starts with implementing a process that works best for the company. If a company has been operating a run-of-the-mill S&OP process for twenty years, a change management plan to shift to IBP could be exactly what the company needs to take its business to the next level. By making improvements to the traditional S&OP process, the company uses cross-functional data to make business decisions, sets targets collaboratively, and commits to achieving the strategic plan.

A potential benefit of IBP over traditional S&OP is the ability to develop trust with suppliers and customers by including the strategic pricing equation in the planning process. IBP allows companies, and in turn, their suppliers and customers, to depend on reliable pricing and available to promise (ATP) numbers. Trust can only be established when people deliver to expectations. Pricing is a key player in IBP, as are ATP dates. Price is the translation between units and dollars enabling a common language.

Still not sure about the difference between traditional S&OP and IBP? It would not be surprising if the term IBP faded away and the term S&OP remained, but regardless, the concepts of IBP are the new gold standard. Whether a company has employed traditional S&OP processes for decades or is starting from scratch, it is important to apply the latest model. Why settle for a thing of the past when the future is much brighter?

Companies investing in IBP will notice a behavioral shift. For potentially the first time, the entire company will be moving toward the same set of strategic objectives. Ultimately the company will be able to provide a higher level of customer service, improve lead times, increase profit, and enjoy a positive impact on the bottom line.

Trenegy has years of experience helping companies to achieve their goals through integrated business planning. Please contact us at info@trenegy.com for more information.

Cyber Hygiene – How to Clean Your Online Presence

In 2000, Chinese hackers began a nine-year hacking assault on the telecommunications giant Nortel. The hackers used remote access and automated software to generate a large number of password guesses, eventually breaking the credentials of seven executive team members. The hackers successfully obtained critical reports, research and development materials, employee emails, and strategic information. Unfortunately, Nortel’s top executives neglected to secure their network, and the company eventually declared bankruptcy in 2009. As Nortel disintegrated, Chinese telecom Huawei grew, with some speculating “Huawei’s rise was at the expense of Nortel.”

Nortel’s downfall shows the devastating consequences that can result from a cyber-attack. However, there is a growing tendency to lump every incident together as simply a “cyber-attack,” leaving us numb to the term rather than educated and making these threats seem like nebulous boogeymen. In fact, there are many different types of attack, and taking these threats seriously is the first step in preventing them.

Types of cyber attacks (infographic)

Everyone must evaluate their online behavior and become hyper-vigilant about their cyber hygiene, the measures taken to ensure one’s “health” and safety online. Cyber hygiene begins now, with improving passwords, enabling two-factor authentication, installing anti-virus software, and routinely scrutinizing potential online threats (like the ones mentioned above).
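
To put the “improve passwords” advice in perspective, and to show why automated guessing of the kind used against Nortel succeeds against weak credentials, the back-of-the-envelope sketch below estimates how the brute-force search space grows with password length and character variety. The guess rate is an arbitrary assumption used only for illustration.

```python
# Rough illustration of brute-force search space versus password choices.
# The guesses-per-second figure is an arbitrary assumption, not a measured benchmark.
GUESSES_PER_SECOND = 1e9

def seconds_to_exhaust(alphabet_size: int, length: int) -> float:
    """Worst-case time to try every combination of the given length."""
    return alphabet_size ** length / GUESSES_PER_SECOND

for label, alphabet, length in [
    ("8 lowercase letters", 26, 8),
    ("8 mixed letters, digits, symbols", 94, 8),
    ("12 mixed letters, digits, symbols", 94, 12),
]:
    seconds = seconds_to_exhaust(alphabet, length)
    years = seconds / (60 * 60 * 24 * 365)
    print(f"{label}: ~{seconds:,.0f} seconds (~{years:,.1f} years) to exhaust")
```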

Business leaders must invest time and money into their organization’s cybersecurity strategy by first training employees to maintain good cyber hygiene. Many executives and board members remain hesitant to spend millions on cybersecurity. However, cyber-criminals take $400 billion per year from companies, with much of that theft going undetected. Technological solutions are simply not enough to prevent a cyber-attack. Making employees aware of the threat is crucial.

Three things companies must do to immediately implement an adequate cyber hygiene program:

  1. Set tone at the top
    • Executives are responsible for setting the company culture – when they support a cybersecurity initiative, the company follows
    • A CEO who takes cybersecurity seriously will influence his or her employees to do the same
  2. Make cybersecurity a part of the office conversation
    • Discuss cybersecurity measures regularly – learn from Nortel’s mistakes and make employees aware of the dangers
    • Create a best practices document with instructions for changing passwords every 90 days, updating anti-virus software and other apps, protocols for downloading third-party apps on work computers, etc.
  3. Understand and limit access (a simple audit sketch follows this list)
    • Know which employees have access to workstations and keep this information up-to-date (expired accounts are targets for hackers)
    • Minimize attack exposure by limiting access to only those who need it
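
As a simple illustration of the “understand and limit access” step, the hypothetical sketch below scans an access list for terminated accounts that are still enabled and for passwords older than the 90-day guideline mentioned above. The account data and field names are invented for this example.

```python
from datetime import date, timedelta

# Hypothetical access list; in practice this would come from a directory service export
accounts = [
    {"user": "jsmith", "enabled": True, "terminated": False, "password_set": date(2017, 1, 5)},
    {"user": "old_contractor", "enabled": True, "terminated": True, "password_set": date(2016, 6, 1)},
    {"user": "mlee", "enabled": True, "terminated": False, "password_set": date(2017, 9, 20)},
]

today = date(2017, 10, 1)             # fixed date so the example is reproducible
max_password_age = timedelta(days=90)

for acct in accounts:
    if acct["terminated"] and acct["enabled"]:
        print(f"{acct['user']}: expired account still enabled - disable it")
    elif today - acct["password_set"] > max_password_age:
        print(f"{acct['user']}: password older than 90 days - force a reset")
```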

Personal cyber hygiene is equally important. Business and personal information are intertwined, and it is nearly impossible to untangle the spider web when a cyber-attack occurs. Many people manage their work and personal lives on the same smart device, so protecting one’s cell phone is just as important as protecting one’s work computer. It is essential to know what apps are on smart devices, what personal information they require before downloading, and what the potential risks are in having those apps. Scrutinizing emails on phones and home computers for suspicious activity is also important. Essentially, any cybersecurity strategy employed in the workplace carries into the home and impacts personal devices.

It is time to be more cautious on the internet. Technology has come a long way and the most reliable security guard for your information is you.

Kaizen or Just Bringing in Lunch?

Navigate

Kaizen has been hailed as a quick ‘cure-all’ for any ailing business problem. Companies like Toyota, Bacardi, and Nestle use Kaizen in their businesses. Yet for some companies, the go-to approach is to lock employees in a conference room with the hope of arriving at a better solution before they are released. Is Kaizen the panacea it claims to be, or simply an excuse to bring in lunch?

In Japanese, “Kaizen” means “improvement.” Dissecting its kanji reveals “change” (kai) that is “good” (zen) – more or less. The Kaizen approach focuses on small improvements under the theory that, over time, these small changes can result in massive benefits to a business.

The concept of “Kaizen” was first introduced during World War II under a United States program called “Training Within Industry,” or TWI. The TWI group encouraged small, incremental improvements over transformational changes. Eventually, W. Edwards Deming, the American management consultant, was recognized by the Emperor of Japan for introducing the concept of Kaizen to the Japanese workforce.

A typical Kaizen master leads the room through six steps, including:

  1. Establishing the reason for the workshop. Why are we here? What is the scope?
  2. Understanding the current “state.”
  3. Serving boxed lunch.
  4. Developing a future “state” vision.
  5. Creating a timeline and ownership for each step.
  6. Recognizing everyone’s participation with a ribbon.

Kaizen is commonly used in manufacturing companies, since the concept is an essential part of “lean manufacturing” and is associated with the Toyota Production System (TPS). The TPS is well known for defining standards for eliminating waste in production and unnecessary stress in the workplace. However, any industry, and any job function or process, can benefit from Kaizen.

Kaizen is most beneficial when:

  • It is seen as a proactive approach to identifying incremental improvements
  • There is a shared mentality that the current state can always be improved upon
  • Continuous improvement is embedded in company culture
  • It is supported by upper management
  • Employees have an outlet to submit Kaizen suggestions

The main drawback to a Kaizen event is “groupthink.” Basically, our minds become influenced by and confined to the ideas generated in a group, and the loudest person in the room generally drives the conversation. The Harvard Business Review describes the benefits of convergence and divergence well: in a group session, individuals need some time to brainstorm ideas on their own before coming back to the larger group. Once the group is reunited, the individual ideas are discussed and narrowed. The process continues until a solution is achieved. We find that when people feel they have a direct impact on company change, they are more likely to participate in and advocate for change.

Because Kaizen is based on the idea of continuous improvement, it is most effective for small, incremental changes. Standardizing templates for Accounts Receivable or updating a work ticket are the kinds of challenges Kaizen is well suited to address.

For more complex business problems, or if “groupthink” becomes an issue, there is a different approach: the ACE Method. ACE (which stands for accelerate, collaborate, and execute) is a workshop that effectively finds solutions for anything from brainstorming a new company logo to a complete overhaul of the company’s bid-to-bill process.

Trenegy developed the ACE Method to address problems quickly. A typical workshop lasts about two weeks, and a clear roadmap for action is developed in that time. To combat groupthink, ACE employs a trained moderator to reinforce that there are no bad ideas. Even a thought that seems unhelpful initially could spark a brilliant idea in another participant. ACE uses convergence and divergence, as the Harvard Business Review advises, to brainstorm ideas and improve upon them.

Think of Kaizen as a way to tackle the small issues and ACE as a way to tackle larger challenges, like mergers. Both methods have their advantages. And both are worth far more than just a free lunch!

A New Frontier: Securing the Internet of Things

The IoT is a new frontier. Projected to surpass 50 billion objects by 2020, with the potential to boost global GDP by $142 trillion, the Internet of Things offers private consumers the ability to create app-controlled “smart homes” and offers businesses unprecedented access to real-time operational data monitoring, collection, and analysis. In the rapidly evolving, demand-driven IoT device industry, technologies are being introduced faster than they can be protected.

For individual consumers, IoT security breaches have the potential to violate privacy, steal personal information, and generally terrorize unsuspecting people by manipulating their home devices. On the industrial or business side of the IoT, the threat of a hack presents far more widespread consequences. Unsecured IoT devices present a perfect opportunity for hackers to wreak havoc by compromising operational and/or safety data being tracked by IoT devices, causing a Distributed Denial of Service (DDoS) incident for customers (like the October 2016 Internet Outage), and accessing, compromising, or ransoming financial systems and data by using connected IoT devices to infiltrate the corporate IT firewall. It is becoming increasingly clear to private consumers and businesses that more focus needs to be dedicated to securing the IoT.

  • Manufacturers and consumers have focused on “ease of use” over security – The vast majority of IoT devices are designed with “ease of use” as the first priority, which traditionally means security must take a back seat. In the name of “ease of use,” many of these devices do not require a username and password reset at the time of setup, relying instead on a manufacturer-provided default username and password. These devices will remain actively connected to the internet without additional credential input indefinitely. These default settings are about as secure as having no password at all.
  • IoT devices often fall outside of the corporate IT cybersecurity structure – IoT devices are typically categorized as Operational Technology and are therefore managed by Operations departments. They are often excluded from the corporate IT strategy. When employees connect unsecured IoT devices to company-provided workstations, they inadvertently provide hackers a direct portal into a company’s secured IT environment.
  • Physical security is often impractical – In traditional IT security, physical security is one of the basic tenets. IoT devices may be spread all over the world, on oil rigs or remote sites, making isolation impossible. By nature, IoT devices are easily accessible, residing in common operational areas of businesses or common living areas of homes.
  • There is currently no “McAfee” equivalent – The average internet user knows that it would be reckless to leave a computer vulnerable without the protection of anti-virus software. However, this type of software has yet to be developed for most IoT devices. This means that not only are most of these devices unprotected, they are also unmonitored. Devices could be hacked, and the end user would never know unless the hackers make their presence known. A potential solution would be for each manufacturer to develop security software for its own devices. But the IoT is made up of thousands of devices by thousands of manufacturers, and these companies have neither the expertise nor the motivation to develop this kind of software.

Inevitably, sufficient security measures will be developed, but these developments will take time. Until then, here are several ways consumers and companies can keep hackers at bay:

  • Set strong usernames and passwords – The easiest way to secure IoT devices is to change from the factory default credentials to a strong, unique username and password. Credentials on some devices are difficult to change, and some devices offer no credential change functionality at all. If a device does not appear to offer a credential change option, contact the manufacturer to be sure. If in the market for a new IoT device, the ability to change credentials should be a critical measure when choosing between products (a simple audit sketch follows this list).
  • Bring IoT devices under the responsibility of IT – While Operations will remain the primary end users of industrial IoT devices, the security of these devices must be included in the corporate IT cybersecurity structure. Whenever possible, bring IoT devices behind the corporate firewall and ensure that IT tracks and deploys any updates provided by device manufacturers.
  • Educate employees and users about IoT security – As in general cybersecurity, the greatest defense against hacking is a well-educated user base. By informing employees and users about the threat of IoT hacks and how to prevent them through proper device setup and use, companies can minimize the risk of a hack occurring.
  • Prioritize increased security features – End user demand will drive manufacturers to improve security features and software companies to develop anti-virus programs for IoT devices. As long as consumers continue to purchase devices with no regard for their security, manufacturers will continue to produce the status quo. If currently owned devices offer insufficient or no security features, consider upgrading to newer, more secure options. Consumers should continue to voice their security concerns in the marketplace and, when in the market for new IoT devices, treat cybersecurity as a top priority.
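
As a simple sketch of what auditing for factory-default credentials might look like, the hypothetical snippet below flags devices in an inventory that still use a known default username and password pair. The device names, credentials, and default list are invented for illustration.

```python
# Invented default credential pairs; a real audit would use vendor documentation
KNOWN_DEFAULTS = {("admin", "admin"), ("admin", "password"), ("root", "12345")}

# Hypothetical device inventory
devices = [
    {"name": "warehouse-camera-01", "username": "admin", "password": "admin"},
    {"name": "pump-sensor-07", "username": "ops_team", "password": "Xk#29!vTq"},
    {"name": "thermostat-lobby", "username": "root", "password": "12345"},
]

for device in devices:
    if (device["username"], device["password"]) in KNOWN_DEFAULTS:
        print(f"{device['name']}: still using factory-default credentials - change them immediately")
    else:
        print(f"{device['name']}: credentials customized")
```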

As the technology community begins to unravel and understand the concept of protecting vast amounts of personal data, IoT users must remain vigilant about securing their own devices. Increasing dependence on internet-connected objects makes securing them a top priority. While alluring, the new frontier of the IoT could leave many people very vulnerable.

Making Connections: The Internet of Things

There is a lot of confusion surrounding the term “IoT.” It sounds like something from a sci-fi movie. However, the world has been consumed by the Internet of Things for quite some time. People carry it around in their pockets, wear it on their wrists, or use it each day to get work done. At its most basic level, the Internet of Things (or IoT) is simply a network of internet-connected objects capable of sending and receiving data. The Amazon Echo, FitBit, smart thermostats like Nest, smartphones, and laptops are a few easily recognizable examples.

The International Data Corporation estimates the IoT currently has 13 billion connected objects, and that number is projected to surpass 30 billion objects by 2020. This substantial growth suggests the IoT will drive major changes in every industry. Executives must understand why and how to use the IoT in order to maintain a competitive advantage.

Why use the Internet of Things

It is difficult to imagine a time when a person might require an internet-enabled toaster. Yet in 1990, a toaster became the “first” IoT device. This toaster was merely an experiment, but it highlights an important concept. Just because something can be connected to the internet does not mean it should be connected to the internet. Companies considering IoT opportunities should think first about the advantages connectivity provides.

There are two main reasons to invest in IoT:

  1. Monitor remotely
  2. Collect data in real-time

Smart sensors, the nucleus of the IoT, allow users to monitor people, processes, and systems from anywhere in the world. For manufacturers seeking a better understanding of their supply chain, using the IoT makes a lot of sense. Sensors provide more accurate delivery estimates and real-time changes in inventory. This added visibility helps detect whether shipments have been tampered with and mitigates the risk of damage. End-to-end data can be used to assess weaknesses, identify opportunities, and establish a more efficient supply chain.

How to use the Internet of Things

Data-driven devices give companies insight into processes and operations like never before. The IoT allows users to extract enormous data sets and summarize them into actionable analytics. There are four distinct types of data analytics:

  1. Descriptive Analytics – What happened
  2. Diagnostic Analytics – Why it happened
  3. Predictive Analytics – What might happen in the future
  4. Prescriptive Analytics – What to do about what is happening

Companies use IoT data to lower maintenance costs, predict equipment failures, and improve business operations. B2C companies can better understand their target market by analyzing data collected from IoT devices used by their customers.

The Industrial Internet of Things (IIoT) allows manufacturing and energy companies to leverage big data to drive future action and business strategy. The IIoT is essentially the point where traditional information technology (IT) and operational technology (OT) come together. IIoT applications use smart sensors to track inventory (as supply chain managers do) and gather data on condition-based predictive maintenance. IIoT will have a significant effect on how Operational Excellence is defined and achieved in the next decade.
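
As a minimal sketch of condition-based predictive maintenance (with invented sensor readings and an assumed alert threshold), the snippet below watches a rolling average of vibration readings from a piece of equipment and raises a maintenance flag before the assumed failure level is reached.

```python
# Invented vibration readings (mm/s) streamed from a hypothetical smart sensor
readings = [2.1, 2.2, 2.4, 2.3, 2.9, 3.4, 3.8, 4.1, 4.6, 5.0]

WINDOW = 3          # rolling-average window size (assumed)
ALERT_LEVEL = 4.0   # assumed vibration level that precedes failure

for i in range(WINDOW, len(readings) + 1):
    rolling_avg = sum(readings[i - WINDOW:i]) / WINDOW
    if rolling_avg >= ALERT_LEVEL:
        print(f"Reading {i}: rolling average {rolling_avg:.2f} mm/s - schedule maintenance")
        break
else:
    print("No maintenance flag raised yet")
```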

Implementing the Internet of Things

The Internet of Things will continue to revolutionize the way business is done across every industry, but the transition will not be easy. Companies that choose to implement the IoT will face many challenges. They will encounter resistance to change from their own organization, their vendors, and their clients. There will be obstacles to overcome from a security standpoint, including physical security and cyber-threats. Companies will need to be flexible as best practices, standards, and regulations evolve. Organization structures will change, processes will be redesigned, and budgets will be reallocated to support the IoT. While there are advantages to adoption, companies should look to outside resources for assistance in change and implementation management.