Your Buildings’ 21st Century Edge


One way to gain perspective on disruptive technologies is to look backward to see forward. Tony Seba, an instructor in Entrepreneurship, Disruption and Clean Energy at Stanford’s Continuing Studies Program, does this with a famous pair of early 20th-century Easter Parade photos of New York’s Fifth Avenue. The images suggest that the vehicle market went from almost all horse-drawn to almost all motor-driven in just 13 years. They support Seba’s position that today’s car industry could see a near-total changeover from internal-combustion engines to electric vehicles in just a few years, not decades.

The images are a dramatic statement about innovation in cars. But what do they say about buildings? Not much. The bricks-and-mortar constructions look about the same. Nevertheless, inside some of those Manhattan buildings, architecture firms were innovating at a rapid pace too. Architects were sizing up the new elevator and escalator technology, as well as the electric motors to drive them, just beginning to envision the new taller structures that would change the skyline. Disruption in the way we design, construct and operate buildings may never be as visible or thrilling as what is happening in cars, but the two industries are inextricably linked. Looking at what is happening in edge control and analytics in cars is a good way to understand how comparable technology will bring buildings into line with 21st-century service demands.

We see, for example, that new edge controllers – those that support the full software stack needed to collect, store and run analytics on time-series data from a building’s many digitized sources – are going to be a powerful force in building automation. They pack sufficient compute and storage resources into a footprint not much bigger or more expensive than a mobile phone. They are of interest to stakeholders from specifying engineers to controls contractors to occupants who want to pull building performance data from wherever it exists and push it to wherever it can help them make better decisions and be more productive. But do we see the size or speed of the disruption they are bringing?

Just as Tony Seba looks backward to see forward, there is a lot to be learned by looking sideways at the Connected Car to see what is on the horizon for the Connected Building. Most software-is-eating-the-car discussions start with the On-Board Diagnostics specification, OBD-II. OBD-II specifies the standard hardware connector to vehicle computers as well as the protocol for communicating standardized diagnostic trouble codes. Car makers around the world have been evolving and supporting on-board diagnostics since the first versions appeared in the early 1980s, and OBD-II has been mandatory in US cars since the 1996 model year. While it is not an Application Programming Interface (API) per se, it is stable and detailed enough for software developers to use like an API.
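For the developer-minded, here is a rough sketch of what “using OBD-II like an API” means in practice: a few lines of Python (using the pyserial library) querying engine RPM through a generic ELM327-style adapter. The serial port name is a placeholder, and real adapters vary in their setup details:

```python
# A minimal sketch of treating OBD-II "like an API": querying engine RPM
# through a generic ELM327-style serial adapter.
import serial  # pyserial

def obd_query(port: serial.Serial, cmd: str) -> str:
    """Send an OBD-II command and read the raw response."""
    port.write((cmd + "\r").encode("ascii"))
    return port.read_until(b">").decode("ascii", errors="replace")

with serial.Serial("/dev/ttyUSB0", 38400, timeout=2) as adapter:  # hypothetical port
    obd_query(adapter, "ATZ")         # reset the adapter
    obd_query(adapter, "ATE0")        # turn command echo off
    raw = obd_query(adapter, "010C")  # mode 01, PID 0C: engine RPM
    # Typical reply: "41 0C 1A F8" -> RPM = ((0x1A * 256) + 0xF8) / 4
    data = raw.split()
    if "41" in data and len(data) > data.index("41") + 3:
        i = data.index("41")
        a, b = int(data[i + 2], 16), int(data[i + 3], 16)
        print(f"engine speed: {(a * 256 + b) / 4:.0f} rpm")
```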

A prime example of a car edge device that leverages OBD-II is the mobile diagnostics adapter from Automatic Labs. Unlike previous PC-based and hand-held OBD-II readers, which were targeted at certified mechanics in car repair shops, the Automatic adapter and competitors like Verizon Hum are marketed directly to car-owning consumers. Their developers recognized that car owners would use car data in ways never considered by car mechanics. So they launched their edge devices along with mileage-tracking apps that interfaced to the type of cloud accounting services used by small businesses, gig workers and entrepreneurs. And they mashed in Google Maps and other web mapping services to create trip-tracking apps attractive to more business users and parents of teen drivers. To further popularize the device category, they worked to build a community of 3rd-party app developers around their car diagnostics adapters through an app store modeled on the Apple iTunes and Google Play e-commerce sites. The ecosystem contributed a host of new online services targeted at car owners, involving mobile shopping, home-automation management, even fitness—and yes—newer, better ways to work with car repair shops and related services.

It all seems to be working well for Automatic Labs, which was just purchased by SiriusXM. Verizon and SiriusXM are not traditional car companies. They are data companies carving out their share of the growing Connected Car market. And they hold a powerful market position: the data they are collecting and analyzing is valuable machine-learning training data for the coming era of autonomous cars.

What are the takeaway lessons here regarding how edge analytics controllers will disrupt the current building industry? First, there is the necessity of an open specification that is sufficiently detailed and supported to serve as an API for software developers. What is the BAS-industry equivalent of OBD-II? The BACnet protocol mainly addresses communications and syntactical interoperability. Project Haystack, on the other hand, focuses on semantic interoperability and is more suitable for application development inside and outside the traditional buildings industries. Then, in the BAS world, you need a framework to express the architecture of the target hardware and the functional logic of the controller. The open-source Sedona framework, used in combination with open-source Haystack modeling, is as close to an OBD-II equivalent as the BAS industry has right now.
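To make the comparison concrete, here is a minimal sketch of what Haystack-style semantic tagging looks like for a single BAS point, written in Python using the marker/scalar JSON encoding described in the Project Haystack documentation. The ids and display names are invented for illustration:

```python
# A minimal sketch of Haystack-style semantic tagging for one BAS point.
import json

MARKER = "m:"  # Haystack's JSON encoding represents marker tags as "m:"

discharge_air_temp = {
    "id": "r:ahu1-dat",                 # reference id (hypothetical)
    "dis": "AHU-1 Discharge Air Temp",  # human-readable display name
    "point": MARKER,                    # it is a point...
    "sensor": MARKER,                   # ...specifically a sensor input
    "temp": MARKER, "air": MARKER, "discharge": MARKER,  # what it measures
    "equipRef": "r:ahu1",               # which equipment it belongs to
    "kind": "Number",
    "unit": "°F",
}
print(json.dumps(discharge_air_temp, ensure_ascii=False, indent=2))
```

Those few tags are enough for any Haystack-aware application, from any vendor, to discover the point and interpret its data without custom integration work.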

Next, just as the target market for Connected-Car adapters is much larger than the mechanic shops that previously used OBD readers, Connected-Building edge analytics controllers will not be used only by controls contractors. A primary category of user will be building optimization professionals. Building optimization is a growing field, with systems engineers, commissioning advisors, energy managers and master systems integrators merging into it. These professionals are distinguished by their ability to bring together time-series data streaming in high volume from equipment and sensor systems of many brands and to glean valuable insight from it. In a recent interview on the videocast ControlTalk Now, one leading engineer with such credentials, Matt Schwartz of Altura Associates, describes the work this way:

“It’s our job to leverage data to find issues in buildings faster and to track them to resolution. But we go further; we ask owners or other stakeholders, ‘How might you leverage this data to be more successful?’ Then we work with our clients to integrate building performance data into their workflows. Leveraging analytics to find the full value in their data is becoming key to almost every proposal we submit to our clients.”

The point here is analogous to the recognition by the Automatic Labs and Verizon Hum innovators that car performance data in the hands of end users would be applied in ways far beyond car maintenance.

Matt Schwartz and others at Altura Associates are early users of edge analytics controllers. During the interview, he describes deploying about a dozen of them to pull pressure and flow readings at strategic locations from a massive centralized compressed-air system that serves a large university campus:

“Field data is already in Haystack format, so it is a cinch to pull it into our cloud analytics platform. The EACs also store and trend data locally, so it is accessible by the customer and other parties that understand open Haystack. They have enabled us to quickly identify and implement leak repairs. The campus now knows, for example, how much compressed air it wastes and how much that is costing.”
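For a sense of what “a cinch to pull” can mean in practice, here is a hedged Python sketch against the Haystack HTTP API’s read and hisRead operations. The host and point details are placeholders, and the snippet skips the authentication handshake a real server would require:

```python
# A hedged sketch of pulling trend data from a Haystack-compliant edge
# controller via the Haystack HTTP API ('read' and 'hisRead' ops).
import requests

BASE = "https://edge-controller.example.com/api"  # hypothetical endpoint
HEADERS = {"Accept": "application/json"}

# Find compressed-air pressure sensors by semantic filter
points = requests.get(
    f"{BASE}/read",
    params={"filter": "point and sensor and pressure"},
    headers=HEADERS,
).json()

# JSON refs look like "r:p123 Display Name"; the ops expect "@p123"
point_id = "@" + points["rows"][0]["id"][2:].split(" ")[0]

# Pull yesterday's history for the first match
history = requests.get(
    f"{BASE}/hisRead",
    params={"id": point_id, "range": "yesterday"},
    headers=HEADERS,
).json()
for row in history["rows"]:
    print(row["ts"], row["val"])
```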

One final point of comparison between Connected-Building edge analytics controllers and Connected-Car diagnostic adapters concerns the flexible nature of 21st-century work. The buzz about the Automatic Labs car plug-in first gained momentum among entrepreneurs and small-business people who needed to track their mileage for expense reporting. With office-sharing startup WeWork reporting amazing worldwide growth and an astronomical market valuation in the tens of billions of dollars, the market seems to expect a good portion of the office buildings along the Manhattan Easter Parade route, and in city centers everywhere, to be flexible workspace soon. It follows that core building services like power, heating/cooling, ventilation, lighting and security will need to be tracked and expensed like car mileage when this happens. This trend alone will fuel demand for edge analytics controllers and the building performance data they make accessible.

Posted in MARKET ANALYSIS, ANALYTICS

Highlights from Realcomm-IBcon 2017


San Diego shoreline as seen from Glider Beach in La Jolla near UC San Diego campus. Oleg Koujikov is a UCSD electrical engineering student and member of Engineers for Sustainable World (ESW).

This year’s Intelligent Buildings conference by Realcomm was again a hotbed of information exchange and business dealings for all the professions involved in smart buildings, smart cities and smart energy projects. IT/OT convergence, security of networks and data, and technology disruption were big themes, as they have been in past years. My biggest takeaway from Realcomm/IBcon 2017 is respect for the host region. On display at every turn was leadership in combining its corporate and university strengths with community efforts to increase sustainability and make the most of its water and energy resources.

The University of California at San Diego is home to the J. Craig Venter Institute, where ground-breaking research in genomics is taking place. Yann Palmore, the project manager and owner’s representative during construction of this world-first net-zero laboratory, and now with JLL Smart Buildings, headlined several sessions at the conference, sharing his lessons learned. San Diego has a climate action plan (CAP), and net-zero thinking is helping it reach its year-by-year goals. Other net-zero project veterans like David Kaneda of the Integral Group and Ruairi Barnwell of DLR Group were also there offering insight.

I also heard from Kiva Allgood, formerly President of Qualcomm Intelligent Solutions, Industrial IoT and Smart Cities, and now a Managing Partner at GE Ventures. She was a board member of the local trade association Cleantech San Diego in 2015, when it pulled together the public-private support required to deploy IoT at the Port of San Diego. Combining technology and services from Intel, OSIsoft, Black & Veatch, Dell, and San Diego Gas & Electric (SDG&E), the project uses sensors to detect energy consumption and translate it into easy-to-manage, real-time data that advances the port’s smart-city goals. Kiva offered this powerful illustration during her Smart Cities panel:

“When you have a conference in town, a ball game going on and a cruise ship coming in, planners now ‘Pi’ the energy and water data to make decisions and take action. With the data in hand, you might ask the ship to entertain passengers out on the water for a few more hours until after the ninth inning, for example.”

Of course, she was talking about OSIsoft’s signature PI software, an industry benchmark for collecting large amounts of data to manage utility, infrastructure and building systems. It has a long track record among municipalities, businesses and academia. In addition to the Port of San Diego, the city’s Petco Park is a customer. Realcomm/IBcon visitors had the opportunity to tour the ballpark during the conference to take in all the ways PI is maximizing efficiencies. Many took in the Padres-vs-Cincinnati game while they were there. Fun!

David Doll, Industry Principal for Facilities and Energy Management at OSIsoft, participating in a panel discussion about Enterprise IoT, offered this formula for success:

“Enterprise stakeholders recognize that data is ubiquitous, and they want to own the data collected from their activities and processes. They understand they can have intelligence out at the edge and into the clouds. They want to access it wherever it makes the most sense—latency issues factored in. With OSIsoft, the cloud is an architectural decision, not a mandate. The bigger the customer, the more likely they’ll want data stored on premises; they don’t require a cloud solution to scale. Once you start harvesting value from the data, the need to scale the platform is inevitable, from one building to hundreds of buildings. Your platform needs to accommodate all the projects you’ve planned and all those you haven’t thought about yet.”

On the same panel, Steven Meyer, Sr. Principal Engineer – Manufacturing IT at Intel Corporation, offered a perspective that is often missing from such operations-technology-centric gatherings—that of the enterprise IT manager. Meyer said:

“Intel decided to lead in IoT years ago, and I’m part of the IT team overseeing manufacturing automation in our factories world-wide. An aspect of that leadership is being engaged early in responding to the IoT needs and solutions proposed by business units inside the enterprise. The ‘dreaded end-to-end solution’ is now an official term here. We say that because, once you have experience with hundreds of smart-and-connected things in many categories, you realize that managing all of them with their different cloud hosts and device configuration requirements becomes burdensome. It blows your cost model out of the water. Enterprises don’t want islands of hard-to-access data.

“So, at Intel, IT has standards for vetting cloud solutions. Security is, of course, the primary concern. But, we set the bar in other respects too. We screen for openness, interoperability and review the vendor’s APIs and developer kits. Our message to the IoT platform vendors is ‘Make money on your interoperable IoT ingredients, but don’t make me duplicate my infrastructure.’

“As IT teams like mine move into the role of vetting IoT-solutions, we need to step up and enhance our services. We need to have the knowledge, people, processes and interest to sit at the table when requirements and specifications are first being discussed. If another end-to-end offering is being proposed that would tax IT resources, we need to be able to steer the project team to a comparable solution that is a better-fit. There are actually lots of ways to use the cloud, and it is IT’s responsibility to set and enforce the standard as to how it should be done in this enterprise.”

So many conference sessions this year homed in on working through the differences in the basic charters, educations and cultures of IT and OT people. Anto Budiardjo, editor of the “A New Deal for Buildings” blog, just posted a new article that captures some of the different perspectives on this ‘ravine impeding adoption of intelligent buildings.’ I couldn’t say it better, so I recommend reading Anto’s IT & OT: Convergence or Divergence? Spoiler alert: he basically agrees with Intel’s Steven Meyer that, when it comes to IoT deployments, IT has a responsibility to lead and to offer better service to OT customers.


Building Data Analytics panel with Alex Grace of KGS Buildings, John Gilbert of Rudin Management, Victor Sanchez of LinkedIn and John Petze of SkyFoundry.

Regarding tech disruption, all the rumbling starts with advancements in analytics and cloud hosting services. The IBcon expo floor offered the chance to catch up with John Petze, Principal and Co-Founder of SkyFoundry, and ask him how analytics for buildings are evolving. The latest release of the SkySpark platform, SkySpark® Everywhere™, is advancing along two vectors: distributed edge-to-cloud architectures, and how findings are communicated to human users. He explains:

“The BMS world understands the trade-offs between centralized and distributed processing. We saw the pattern before. The earliest BMS systems were centralized based on mainframes. Then came microprocessor-equipped field panels. Next, intelligent field devices brought processing all the way down to VAV boxes. Those controllers communicated with each other and could share control actions and events. Up until recently, analytics was a cloud- and server-based function. But, analytics processing needs to be distributed as well. In the IoT era, there is an unprecedented volume of data to handle. Deploying edge analytics can result in 1000-to-1 reduction in data transmission needs. Distributed data analytics is a core enabler of next-generation IoT applications for the built environment. With seamless clustering capability, it doesn’t matter where the collecting, storing, managing, analyzing and visualizing of data and analytics results happens. The platform can distribute the processing load across available resources in a way that is balanced, optimized and transparent from the operator’s perspective.

“Then, regarding ‘the last mile’—the user interface—end users want ease and clarity. SkySpark® Everywhere™ melds ready-to-go apps that require ‘no engineering’ with user-configurable tools that generate customized apps and reports. Whoever is setting up the analytics application can now decide to communicate findings with the visualizations that best fit the range of users who will interact with the data and analytic results. It never has to be one-size-fits-all.”
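The data-reduction figure is easier to appreciate with a toy example. The sketch below (illustrative Python, not SkySpark code) shows the basic move: evaluate raw samples locally and forward only per-interval rollups and exceptions:

```python
# An illustrative sketch of why edge analytics slashes transmission:
# the controller summarizes data locally and forwards only rollups
# and exceptions instead of every raw sample.
from statistics import mean

def edge_rollup(samples: list[float], high_limit: float) -> dict:
    """Reduce an interval of raw samples to one summary record."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "avg": round(mean(samples), 2),
        "alarm": max(samples) > high_limit,  # exception worth reporting now
    }

# 1 Hz sampling with 15-minute rollups: 900 raw readings become one
# record, a 900-to-1 reduction before any compression is applied.
raw = [72.0 + 0.01 * i for i in range(900)]
print(edge_rollup(raw, high_limit=85.0))
```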

Every year, Realcomm-IBcon seems to grow in sessions, tracks, expo square footage and people to meet, but the length of time you have over three days stays the same. The fear of missing out was intense. It’s a good thing so much is captured by the Realcomm team and made available as video on the Realcomm Conference Group YouTube channel. This collection of interviews is a great resource for our industry.

Posted in ANALYTICS, MARKET ANALYSIS, NEWS

Haystack and IoT World Tread Common Ground


From Saddlebrook to Silicon Valley, everyone is talking about using data to bring order, efficiency and transparency to the operational challenges of running buildings.

It’s a new day. Visibility is good, as dust from previous storms and battles has settled across the plain. Observers wait in the shadows. An imposing hero figure steps into the open. Everyone knows that whatever went on before is over, and something new has just begun. The story of the new sheriff bringing order to a lawless place is a common plot for Western films, and it feels like the commercial buildings industry is at a ‘New Sheriff’ moment right now.

The power of data to bring transparency, greater security, fairer market competition and rapid change to buildings was the main theme at Haystack Connect in Tampa in early May. Advancements in contributing technologies like wireless connectivity, edge computing, analytics and machine learning were well covered in the IoT Architecture Symposium that ran during the IoT World Conference in Santa Clara in the middle of the month. I was at both and heard speakers and exhibitors deliver similar news: the business practices that have kept the buildings industry seven to ten years behind manufacturing, processing, transportation and other industries when it comes to data-driven operations are about to see their eclipse at High Noon. It is no longer just a Wild Bunch working on data interoperability; the biggest companies in IT and OT want standardization and less friction in data flows. Soon, for a Fistful of Dollars, building owners will be able to integrate and analyze the digital data streaming from any piece of building equipment, per any key performance indicators (KPIs) they want to monitor and manage.

Project Haystack’s third biennial gathering brought together experts in operational technology (OT)—specifically the equipment and controls manufacturers, software vendors and systems integrators that have been at the core of the open-source metadata tagging standard. The big presence of new board-level member Intel signaled that dominant forces in IT are merging onto the same path. There were keynote presentations from Rita Wouhaybi, who guides Industrial & Energy Solutions Architecture for Intel’s Internet of Things Group, and Milan Milenkovic of IoTSense, an IoT technology strategic advisor to Intel. Each offered a stage-setting market landscape:

Dr. Wouhaybi made the point that, when it comes to data modeling, the semantic schemas and label dictionaries needed by the various IoT market segments—cars, cities, homes, the energy grid, factories—overlap. Buildings are at the center of it all, so Project Haystack lessons, knowledge and tag sets have potential for sharing and adoption across the board.

Milenkovic’s landscape was of all the IoT interoperability standardization organizations. His point was that the various bodies need to build better bridges between their definitions to achieve higher rates of adoption. He explained that some organizations are working on syntactical interoperability and others, like Project Haystack, are working on semantic interoperability. There is still so much work to do on both fronts. He put out a call for collaboration: the path forward is for the various standards organizations to add interoperability to their charters and reach out to one another.

Two new significant bridge-building efforts were presented to Haystack attendees:

Dave Robin, research engineer with Automated Logic Corporation, past chair of the BACnet standards committee and longtime leader of its Network Security Working Group, made the ASHRAE 223 announcement, giving some detail about the mapping mechanism to Haystack and other ontologies. The EdgeX Foundry announcement was made at the Dell booth, as Dell contributed the initial microservices and tens of thousands of lines of code to seed the effort. EdgeX aims to give anyone building an edge device a ‘clean’ architecture to plug into.

Anno Scholten, President of Connexx Energy, also spoke to the coming era when well-defined reference architectures will bring order to the terabytes of time-series data collected for a multi-story building. Metadata tagging systems like the Project Haystack dictionaries ease navigation of all this data. He used the metaphor of the ‘Digital Twin’ to describe the end goal—a dynamic software model that can be used to analyze and predict building systems performance. He sees all the performance and energy modeling that design engineers do before a new construction or major retrofit project is built, and all the actual time-series data collected once it is operating, contributing to this Digital Twin.

The participation of big architectural engineering firms as well as building commissioning/energy management firms at Haystack Connect also contributed to that new-sheriff-on-the-beat feeling. While the metaphor of the Digital Twin is a good way to personify the kind of performance authority that a complete data model will represent, the engineers that building owners hire as their trusted advocates in making technology decisions are performance authorities in the flesh. John Petze and Marc Petock of Project Haystack led an ‘Engineers and End-Users’ panel that included Matt Schwartz of Altura Associates, Ben Talbot of DLR Group, Zachariah Nobel of Constellation, and Rob Murchison of Intelligent Buildings, LLC. These are the types of firms and people blazing the way toward data analytics platforms that enable monitoring-based commissioning and better energy management.

The panel talked about proof-of-concept projects that used Haystack-compliant software and edge devices to balance ‘hot-path analytics’ (acting on data as it is generated at the edge) and ‘cold-path analytics’ (analyzing select data in the cloud). Alper Uzmezler’s presentation addressed implementing Haystack from the cloud to the edge. And a new community of developers working toward lightweight BAS software suitable for analytics on the edge, the Sedona Alliance, announced its formation.
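In code, the hot-path/cold-path split the panel described can be as simple as the following toy Python sketch; the point names and thresholds are invented:

```python
# A toy sketch of the hot-path/cold-path split: act immediately on edge
# data that crosses a limit (hot path), and queue everything else for
# batch analysis in the cloud (cold path).
cold_path_batch = []

def on_sample(point: str, value: float, limit: float) -> None:
    if value > limit:
        # Hot path: act now, at the edge, with no cloud round trip
        print(f"ALARM {point}: {value} exceeds {limit}, acting locally")
    # Cold path: retain for periodic upload and deeper analysis
    cold_path_batch.append((point, value))

for v in (98.2, 99.1, 104.7, 97.5):
    on_sample("compressor1-header-pressure", v, limit=103.0)
print(f"{len(cold_path_batch)} samples queued for cloud analytics")
```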

These were just a few highlights of the Haystack event. I’ll be covering more of the content presented in the next edition of Haystack Connections magazine, to be published soon. ControlTrends has posted video recaps of each day of the conference that show even more of the action. Harbor Research’s Adam Hise wrote his own reflections on Haystack Connect here. He too describes a palpable sense of ‘New Sheriff in Town’ among the Haystack crowd:

“Systems integrators shared anecdotes of clients who, faced with the option of a proprietary system or a BACnet-compliant offering that was $100K more expensive, were effectively forced into a long-term contract for a closed system. The same SIs could hardly contain a grin as they predicted, ‘they won’t get away with that anymore.’”

Interestingly, some of the biggest proprietary-protocol-protected OEMs are also touting digitalization, aka the coming of the Digital Twin, as an opportunity to renew buildings and other industries and to remake their businesses from the inside out. The stage at the IoT Architecture Symposium during IoT World was an opportunity to learn about that. For example, Suhas Joshi, a Director at Honeywell, presented on merging new IoT tech with legacy C&I equipment. In the Q&A, Joshi was asked about handling ‘contextual’ data versus ‘global’ data—another way of saying metadata tagging versus the large time-series data stores. (See Anno Scholten’s Digital Twin presentation linked above.) Joshi answered, “Certain markets are working on this. Look at Project Haystack.” On the same stage, Johnson Controls’ VP & GM of Data-Enabled Solutions, Sudhi Sinha, presented on how such a big industrial company decides whom to partner with for data services among tech-stack behemoths like Google, Amazon and Microsoft. Even companies the size of JCI must consider how to maintain power, leverage and future growth opportunities when they decide which cloud provider should store and secure their customers’ building data. Data is where the value resides; ‘Who are you going to entrust with it?’ becomes a very strategic question. The IoT Architecture Symposium also included an EdgeX Foundry presentation—just one more point in common with Haystack Connect.

Jim Lee, CEO of Cimetrics and one of the original BACnet authors, was present at the Haystack event. He has just launched the New Deal blog with collaborator and market-mover Anto Budiardjo, who has long been dedicated to facilitating dialog between the building systems industry and commercial building professionals. ‘New Deal’ is their metaphor for the Day of the New Sheriff, the Dawn of the Digital Twin, the Pivot Point of the Paradigm Shift. There are already articles well worth reading on the blog. I highly recommend Building Blocks For The New Deal:

“The New Deal is built on three critically important building blocks: BACnet open standard, model-based analytics, and service transparency.”

This new blog is another response to Milan Milenkovic’s call for bridge-building between the industry organizations working on data interoperability, in the interest of finally moving more swiftly toward better buildings and toward all the Internet of Things product categories that rely on the contextual data streaming from them.


Posted in NEWS, MARKET ANALYSIS

Will DevOps Culture Come to Smart Buildings?


If software is eating the world, then software development operations should be of consuming interest to everyone—particularly to stakeholders in Smart Buildings and other Smart Systems. We should have already learned the lesson that product development needs to start from the perspective of end users—building owners, facilities staff and occupants. The last generation of building automation technology offered enough instances of unplugged BMS systems sitting in corners to make us wary of not-my-piece-of-cake reactions to new technology. How can the industry do better going forward? Or, as Brad White of SES Consulting asked in April, “How do you spec a good user experience?”

Smart Building product developers could get in sync with enterprise software innovators and embrace the concept and culture of DevOps. Modern DevOps methodology emerged in response to the rise of open-source software, mobile computing and cloud architectures. Tasks like version control, configuration management and compliance testing have become more complex, while pressure to release software products more rapidly, frequently, and reliably is increasing. The one new point of leverage is that collaboration tools are a whole lot better. So, DevOps culture was born out of the idea that software engineers and their IT operations colleagues can manage the greater complexity and shortened schedules by working together more seamlessly.

One expert on the topic, Coté, describes DevOps as “the term people use when they mean ‘doing all that new stuff, in new ways.’” That’s helpful? The DevOps Wikipedia page describes it as connecting software engineers and IT staff in a tight DevOps cycle or toolchain.


To add the essential ingredient of customer empathy, user experience (UX) designers are also core participants in DevOps. Kai Brunner, Principal Designer for continuous delivery enterprise software at Electric Cloud, describes the dynamic this way:

“Designers have to not only adapt, but evolve proactively to continue leading with creative decisions and transform the ‘power struggle’ into the ‘power of collaboration,’ the same way development and operation teams are achieving this in the DevOps culture.”

DevOps, User Experience (UX) Design, Customer Success Management—is all this actually new? Or is DevOps culture just a bunch of new names for classic software development and marketing principles and roles? TBD. But it’s worth considering the fit for our industry.

DevOps & Analytics

The opportunity for DevOps methodology in Smart Buildings may open with the deployment of analytics and visualization programs. In recent years, some building retrofit project teams tried to make an end-run around IT when deploying analytics. For the most part, that strategy has proven short-sighted. Typically, somewhere up the line a decision-maker has a not-my-piece-of-cake reaction, and the analytics project is stymied. In his March article “Analytics Creates Transparency,” Jim Lee of Cimetrics makes a good case for avoiding the end-run. He offers this framing:

“See how analytics brings value to building owners and operators, after all, they are our customers. The scope of value-creating features of analytics for owners is growing: energy management, FDD (Fault Detection and Diagnostics) and continuous commissioning were a few early areas enabled by analytics. More recently, we are crossing into corporate functions such as compliance, forensics, vendor and asset lifecycle management. On the horizon are more valuable areas for enterprises: targeting occupants to be more efficient, productive, cognitively alert and deliver the best for the enterprises which employ them.”

More and more companies in our industry, like Cimetrics, see customer IT departments as allies and partners in satisfying building owner and occupant end users. You could say that they’ve embraced the DevOps philosophy of ‘turning the power struggle with IT into the power of collaboration.’

Of course, Cimetrics is a 3rd-party analytics software developer; so, it is innovating software development practices all the time. Also, the DevOps concept of a continuous cycle of planning and doing is coded into its DNA. Is a DevOps toolchain so different from a monitoring-based commissioning (MBCx) cycle with analytics software like Cimetrics Analytika at the core?


There are many different types of companies in our Smart Building/Smart System industry including OEMs, control system vendors, controls contractors, master systems integrators, energy management and commissioning consultants and service & maintenance firms. What can DevOps culture offer them?

Deploying 3rd-party analytics to the satisfaction of owner and occupant end-users takes customization of the interface for the unique building or campus under management, and sometimes it means developing a custom app. So, some of the most forward-thinking players in each category are recognizing that they need to evolve and funnel their controls expertise into a software engineering role. Training in the DevOps methodology and tool chain could help them bring structure to this new role.


CMMS & Visualization Support Tools: DevOps and UX Design by Another Name

Again, ‘DevOps culture’ is in many ways just a trendy way to describe practices and tools already familiar to Smart Building/Smart System professionals under different names. For instance, computerized maintenance management systems (CMMSs) have been around a while, supporting the same types of activities listed in the DevOps toolchain. A CMMS helps users track who did what to which piece of equipment, and when. BASSG’s Sparko software package, for example, is a CMMS that works with SkyFoundry’s SkySpark analytics system. Sparko maps specific SkySpark rules to operations & maintenance personnel or departments inside a company, or to external team members like an automation contractor or MEP engineer. Basically, it gives users the flexibility to get corrective action in motion sooner. Tying analytics to action in this way fits neatly into the DevOps playbook. BASSG is sometimes described as a ‘DevOps partner to Smart Building System developers.’
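The routing idea is simple enough to sketch. The hypothetical Python below (not Sparko’s actual API; rule names and teams are invented) maps analytics rule hits to a responsible party so corrective action starts sooner:

```python
# A hypothetical sketch of routing analytics findings to the team
# responsible for acting on them, CMMS-style.
RULE_OWNERS = {
    "simultaneous-heat-cool": "mechanical-contractor",
    "ahu-schedule-override":  "facilities-ops",
    "meter-data-gap":         "controls-integrator",
}

def dispatch(rule: str, equip: str) -> dict:
    """Turn one analytics finding into a work-order-style assignment."""
    return {
        "rule": rule,
        "equip": equip,
        "assigned_to": RULE_OWNERS.get(rule, "energy-manager"),  # default owner
    }

print(dispatch("simultaneous-heat-cool", "AHU-3"))
```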

This year’s ControlTrends Awards included the category ‘3rd Party Visualization Vendor of the Year,’ which was won in a tie by BASSG and DGLogik. These companies provide HTML5 platforms for developing custom user interfaces for smart system software. Good design principles are baked into their templates, component libraries and menus. Not every small firm creating an energy or comfort optimization app for its building-owner customers can have a User Experience (UX) designer on the team, but these 3rd-party visualization platforms empower them to make information as easy and compelling to interpret as possible.

Impact of Continuous SW Delivery Schedules on Marketing

I have spent my career in enterprise software marketing and, for me, what is new about DevOps culture is the idea of letting go of strict product release schedules. Back in the day of print and paper, we timed the release of customer and market communications to the phases of product development. White papers and trade press articles describing the upcoming market need were released when development was just getting underway; product brochures and software manuals had to be ready in time for beta release; advertisements and website copy were needed for the general-availability launch, and so on. Today, not only is software developed and released continually, communications with the end customer are on-demand too. Digital publishing, social media and search tools have evolved to help marketers cope with the new realities. Webinars and podcasts are part of the new mix too. And soon there will be chatbots and other AI-enhanced aids to help reach customers with messages when they are ready to hear them.

As with DevOps, the tools to communicate about a new product introduction get better, but the noise, distractions and pressures increase too. I think we are left with the same problem: how to make the customer a partner in the cake baking, so that everyone has a sense of anticipation up to the moment it is delivered, and a sense of satisfaction when they try it.

Posted in ANALYTICS, MARKET ANALYSIS

Appreciating How Metadata Makes AI Possible


One of the most re-watched episodes of the comedy series Seinfeld is ‘The Label Maker,’ in which Elaine’s Christmas gift to a friend is re-gifted to Jerry before the Super Bowl. True, a thing for tagging other things was not a fun or romantic gift in 1995. And perhaps Bryan Cranston’s fictional dentist didn’t get it. But Julia Louis-Dreyfus’s Elaine was way out ahead in her thinking. Label making is important! If you are a data wrangler today, you should appreciate any gift that helps you tag things with metadata labels.

Frank Chen of Silicon Valley venture capital firm Andreessen Horowitz (a16z) presents a timeline in his AI and deep-learning mini-course that happens to plot label-making’s journey from white-elephant gift to tech’s newest cool thing. Released to all interested students in June 2016, it is a fantastic history lesson and primer on what is happening in artificial intelligence (AI) today. He writes:

“One person, in a literal garage, building a self-driving car.” That happened in 2015. Now to put that fact in context, compare this to 2004, when DARPA sponsored the very first driverless car Grand Challenge. Of the 20 entries they received then, the winning entry went 7.2 miles; in 2007, in the Urban Challenge, the winning entries went 60 miles under city-like constraints. Things are clearly progressing rapidly when it comes to machine intelligence. But how did we get here, after not one but multiple “A.I. winters”? What’s the breakthrough? And why is Silicon Valley buzzing about artificial intelligence again?

The same AI entering cars is impacting buildings too; listen to Ken Sinclair discuss the surprising rate of innovation in his latest ControlTrends interview. Chen answers his own question this way: more compute power, more data, better algorithms and more investment. His research colleague at a16z, Benedict Evans, explores the topic of labeling in more depth in the blog post AI, Apple and Google. Here are some key excerpts:

  • So you can say to your phone ‘show me pictures of my dog at the beach’ and a speech recognition system turns the audio into text, natural language processing takes the text, works out that this is a photo query and hands it off to your photo app, and your photo app, which has used ML systems to tag your photos with ‘dog’ and ‘beach’, runs a database query and shows you the tagged images. Magic.
  • Try it without labels (‘unsupervised’ rather than ‘supervised’ learning).  Today you would spend hours or weeks in data analysis tools looking for the right criteria to find these, and you’d need people doing that work – sorting and resorting an Excel table with a million rows and a thousand columns, metaphorically speaking.
  • The eye-catching speech interfaces or image recognition are just the most visible demos of the underlying techniques.
  • The important part is not that the computer can find them, but that the computer has worked out, itself, how to find them.

Did you catch that? The speech and image recognition technology may be superficial eye-candy compared to the feat of putting together the underlying knowledge graph. In other words, how you classify and label objects is at the core of how well your AI works. Knowledge graphs for the World Wide Web are the domain of semantic web researchers. Three leading professors in the field from the University of Zurich, Rensselaer Polytechnic Institute, and Stanford University collaborated on the September 2016 article, A New Look at the Semantic Web. Here are some key excerpts from this long-form editorial:

  • Bringing a new kind of semantics to the Web is becoming an important aspect of making Web data smarter and getting it to work for us. Achieving this objective will require research that provides more meaningful services and that relies less on logic-based approaches and more on evidence-based ones.
  • Crowdsourcing approaches allow us to capture semantics that may be less precise but more reflective of the collective wisdom.
  • We believe our fellow computer scientists can both benefit from the additional semantics and structure of the data available on the Web and contribute to building and using these structures, creating a virtuous circle.

Labeling, edge computing, artificial intelligence—these are three pieces of the same puzzle, and the puzzle seems to be coming together very fast right now. (Don’t miss the recent slideshow of another a16z thinker, Peter Levine, on how edge computing will soon eclipse the cloud.) The concepts and timing that Silicon Valley’s a16z thought leaders describe are as applicable to buildings as they are to cars, dogs and beaches. And the academics leading the semantic web conversation are saying that the mark-up languages and metadata schemas are coming from all corners of the web, not just ivory towers. Frank Chen points out that the latest Google image-recognition algorithms can chow down on the entire collection of videos on YouTube. But when they do, they get a graph that skews in favor of cats doing funny things, which doesn’t reflect the real world. The best knowledge graphs, metadata schemas, neural nets—whatever you want to call the undergirding ML labeling technology that does the classifying—are the versions that reflect the collective wisdom and first-hand evidence of those with physical-world experience.

This brings us to Project Haystack, the open-source organization launched in 2011 and devoted to developing a standard mark-up language and tagging schema for devices in commercial buildings. Given the core importance to AI of getting standardized labeling right the first time, it is no surprise that academia and big IT picked up on the Haystack schema when they launched the Brick schema. One way to look at it: there is more industry, academic and government energy, focus and money being invested in label-making than ever before—what a gift! Seinfeld’s Elaine Benes would be such a supporter if she were here today. And even the dentist who became Walter White of Breaking Bad would not under-appreciate it. I hope more of those who hold the evidence and wisdom to contribute get involved. Siloing data was the way business was conducted in the last innovation cycle, but it won’t work going forward in the age of AI and machine learning (ML).

Another reason to do your part: tomorrow, there may not be chief marketing officers and chief technology officers, but rather chief labelers of marketing things and chief labelers of technology things. The labeling of training data for machine-learning algorithms is about to consume us all—at least everyone who works with computers, mobile phones and Internet-of-Things devices. So, best to get ahead of the game.
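If that sounds abstract, consider how little labeled data it takes to put a classifier to work. This toy Python sketch (invented features and labels, using scikit-learn) tags new sensor points from a handful of human-labeled examples:

```python
# A toy illustration of why labels matter: a few labeled examples let a
# classifier tag new sensor points automatically. Real metadata models
# are far richer than this two-feature example.
from sklearn.tree import DecisionTreeClassifier

# feature vector per point: [typical reading, reading span]
X = [[72, 10], [68, 12], [55, 40], [0.3, 1.5], [0.5, 2.0]]
y = ["temp", "temp", "temp", "pressure", "pressure"]  # human-supplied labels

clf = DecisionTreeClassifier().fit(X, y)
print(clf.predict([[70, 8], [0.4, 1.8]]))  # -> ['temp' 'pressure']
```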

Posted in ANALYTICS, MARKET ANALYSIS

Vendor Lock-in of Your Data Could Happen Again: Protocol Wars are Moving from Cable to Cloud

Today’s Smart Building Data Exhaust May Be Tomorrow’s Machine-Learning Gold

Smart Building Owners should remain vigilant in the battle against vendor lock-in of their building data.

“Take back the Internet!” is the Net Neutrality movement’s rallying cry against big telecom and cable companies. “Take back your life!” say digital privacy advocates warning against the online advertising industry. “Take back your enterprise!” is how the Open Source software movement positioned itself vis-à-vis big-brand business software. And “Take back your building!” is how the open-protocol building automation community has framed its alternative to protocol lock-in by big OEMs.

None of these battles started yesterday, and none of them will end tomorrow. Caveat emptor, a principle of business so infernal and eternal that you have to say it in Latin, applies: if the people using data and investing in data tools don’t get educated, vendors will act in their own self-interest. The battle against building data lock-in may be moving to a new arena, from serial cabling protocols to the Application Programming Interfaces (APIs) defined by cloud-hosted applications. Let’s not let that happen.

Yesterday’s BMS Protocol Wars

The history of the building controls industry is marked by stages of vendor lock-in. When the buildings industry first went digital decades ago, BMS manufacturers wrote their own protocols to transport data over serial cable. They made these proprietary, providing keys only to technicians within their sales channels, with the result that they held a near monopoly on future service and upgrades. These service revenue streams were more valuable than the original price of the systems. It became common practice to bid low initially, with the intention of charging captive customers more later. This led to complacent, slow-to-innovate vendors and unhappy customers who felt they were being milked with every service call. These are lessons no one wants to relearn.

BACnet and other open communications standards were meant to put an end to this syndrome and bring back market forces. With open standards, controls contractors could work on equipment from multiple brands, and customers could again run competitive bids for their business. The lock-in practice died slowly, though, as Darren Wright, a Director at Arup, explains here. For a while, OEMs could claim they were BACnet-compliant while still maintaining proprietary protocols at the BMS controller level. The era of the Internet of Things is expected to finally bring the serial-cable protocol wars to their conclusion. But a similar approach to vendor lock-in, or even worse, could happen with closed APIs within cloud services.

An Open Future Demands Open SDKs

In the interest of caveat emptor, any project team evaluating cloud solutions should ask vendors, “Will near-real-time and historical trend data be available to us when we want it?” If the answer is “We have APIs,” that’s a red flag. APIs are not enough! If a company is holding your building’s HVAC, lighting, power meter or other operational data in its cloud, the chance that you will be able to access it in a practical way via its APIs alone is quite slim. What you want to ask cloud service vendors is, “Do you supply free Software Developer Kits (SDKs) in open source languages?” If you opt only for cloud services that provide this assurance, your data won’t be held for ransom when the inevitable day comes that you need it for your company’s own internal purposes or to supply as data streams to other vendors. Here are a few more guidelines:

  • Ask about SDKs for the cloud server side as well as for the client or edge device side.
  • The SDKs should be available in Java, C++, JavaScript for clients, and any other open languages popular for the particular service.
  • The file format should not be cryptic. Definitions of the file format are needed even if the format itself is open; open-source definitions and parsers are widely available for formats like YAML and JSON.
  • If the cloud application parses and compiles a custom file format, parsers for that format should be supplied in every language the SDKs cover.
  • Ask if any other company is using the particular cloud vendor’s APIs, and confer with the references provided.

The Project Haystack open source community provides a good example of the type of Software Development Kit support needed to ease data interoperability in a full range of use cases.
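One practical way to apply these guidelines is a ‘data liberation’ smoke test before signing: script a bulk export of a day of trend data to a CSV file you control. The endpoint and field names in this Python sketch are placeholders; the point is that if the vendor’s documented API or SDK can’t support a loop this simple, that’s your red flag:

```python
# A hedged sketch of a 'data liberation' smoke test: bulk-export one day
# of trend data from a vendor cloud to a CSV file the owner controls.
import csv
import requests

EXPORT_URL = "https://vendor.example.com/v1/trends"  # hypothetical API
points = ["main-meter-kw", "chiller1-kw", "ahu1-dat"]  # invented ids

with open("export.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["point", "timestamp", "value"])
    for p in points:
        # assumes the API returns a JSON list of {"ts": ..., "val": ...}
        rows = requests.get(EXPORT_URL, params={"point": p, "range": "1d"}).json()
        for r in rows:
            writer.writerow([p, r["ts"], r["val"]])
```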

Predictive Value in Data Exhaust

As more and more sensors and digital devices are introduced into buildings, digitizing more physical processes at tighter intervals, it is a natural impulse to look for ways to hand off the burden of managing all that data. Except for the tip-of-the-iceberg analytics results, new data can seem like new problems—so much data exhaust with no real value. But when a cloud service takes your data and starts observing operations and applying algorithms, new value is created. The vendor has a right to its portion of that value, but the newly calculated data should also be readily accessible to the customer, i.e., the building owner. Software companies may be a step ahead of facilities managers and building owners on machine learning and artificial intelligence concepts; however, customers too want to extract all the predictive value possible from their buildings’ digital streams. As we covered in our September article, it is imperative that building owners take the reins of their operational data strategy today. This is how they position for success as we move into an era of more autonomous buildings.

Deciding among platforms for energy management and building operations management is challenging; there are so many. But the lessons the buildings industry has learned from the past should inform next steps, and the general software industry offers its lessons as well. All the critical plumbing components in enterprise business stacks have moved toward open source software.

The home automation industry offers its own parable illustrating why you shouldn’t entrust your building data to a single vendor with closed APIs and data feeds. When Google Nest shut down, or ‘bricked,’ the Revolv home hub in April 2016, it rattled the tech industry at large and the emerging IoT industry in particular. It’s notable that Google parent company Alphabet’s next step was to open source the implementation of the Thread protocol that Nest uses for communications.

The takeaway lessons: don’t be lured into a walled garden by a shiny interface, a low initial cost, and promises that your data will be well taken care of by a cloud vendor. Ask questions to ensure your data will be open to your future uses. Weigh the pros and cons of on-premise, in-the-cloud and at-the-edge computing and storage, and negotiate a fair price for keeping as much of the resulting data as is practical and affordable; it may be the cache you need for machine-learning training in the future. The motivations of your in-house IT department and/or professional hosting services are more aligned with your own regarding how your building data is best stored and secured.

The IoT era promises a new beginning and faster innovation cycles, but not necessarily simplicity — at least not in this early transition time. The buildings industry should arm itself with the right people and resources to navigate the complexity and not fall prey again to vendor lock-in as it did in the early 1990s.

Posted in ANALYTICS, NEWS

Climate Action, Digitalization and the Opportunity to Reenergize the Buildings Industry


Climate laws and agreements are being enacted, and big building technology companies like Siemens are positioning to come out heroes.

It has been an eventful few weeks for Climate Action watchers. From tree-huggers to eager capitalists, a lot of people are happy to see moves that will curtail fossil fuel burning. California signed SB 32 into law, committing the state to reduce its greenhouse gas emissions to 40 percent below 1990 levels by 2030. A critical mass of countries formally committed to the Paris climate agreement at the UN General Assembly in New York (India may sign this week). And voters in the USA pressed presidential candidates to detail their positions on Clean Energy leadership.

Amory Lovins, Chief Scientist, Rocky Mountain Institute, holds that climate protection can be profitable. But, for which countries, regions and companies, and in what time frames? There will be winners and losers along the way as these laws and agreements are translated into actions. Certainly today’s biggest building technology companies want to be among the winners.

Lovins spoke at VERGE 16 in Santa Clara, CA, on September 20th. On stage with him was Rainer Baake, State Secretary at the German Federal Ministry for Economic Affairs and Energy. Baake reported that Germany is on the right trajectory to meet its target of cutting carbon emissions by more than 80 percent by 2050. A theme of the plenary session was that if you want to see tomorrow’s clean energy system, look at Germany today. Complementing Secretary Baake’s presence at VERGE 16 were break-out sessions featuring the Siemens Building Technologies (BT) Division. Their message: ‘If you want to see what BEMS (building energy management systems) will look like tomorrow, look at what Siemens is doing today.’

Like Germany, San Francisco has a target of reducing CO2 emissions by 80% by 2050 (against a 1990 baseline). The city collaborated with Siemens on a holistic study, released during the conference, that applied its City Performance Tool (CyPT). The analysis included recommendations on how to allocate resources across energy-efficient buildings, clean energy and a multi-modal transport network. Cities and utilities that want to scale building efficiency programs fast look to partner with companies that have the breadth of businesses, the financial resources and the long-standing customer relationships of a Siemens. As shown in the CyPT graphic below, the company has product and service offerings to fill almost every box in the chart.

A growing business inside Siemens BT is the Building Performance and Sustainability Group, serving infrastructure and city customers. Director of Energy & Sustainability Solutions Ari Kobb led a session on Siemens Energy Data Management services. He noted that Siemens has been moving customers toward ‘digitalization’ on both the demand side and the supply side of the energy equation for years. His group is now able to quickly bring that digital information together as data services. Of course, numerous big industrial and IT companies are now competing for the same enterprise data management business. With competition among the giants fierce, the pressure is on to deliver easy access to data, a positive user experience and reduced complexity. The big building equipment OEMs now have extensive experience working with the data streamed from each other’s systems, neutralizing protocol lock-in and making the practice more and more a thing of the past.

Siemens is also learning from its customers in these large-scale enterprise engagements. Another Siemens BT Managing Director, Kimberly Brisley, presented on her work with a life sciences company. Leaders from tightly-regulated process industries like this customer have data science at the core of their culture. They have goals and suggestions for new ways to derive value from energy and building operational data. Siemens is helping them achieve these, and then taking back the learning, translating it into new digital solutions to add to its suite.

While the sensing, software and networking technologies needed to build the digital solutions listed by Siemens in the San Francisco City Performance study have all been generally available for years, the business models, real estate practices and laws have not encouraged fast adoption. It’s notable that researchers for this analysis found 0% of buildings doing monitoring, only 4% doing performance optimization, and 0% doing remote monitoring. In other words, the market for such building data services is truly in its infancy.

Among building types, retail stores have been early adopters of remote monitoring. The Siemens booth at the VERGE expo featured demonstrations of its Site Controls Energy Management System (EMS), led by Carl Anello, National Accounts Director, and Aaron Moore, National Accounts Executive, of the Retail Group. The EMS reduces energy usage at stores by monitoring and/or controlling key energy-consuming devices such as HVAC (heating, ventilation and air conditioning) units, indoor lighting, store signage, indoor and outdoor temperature sensors, and refrigeration units. It also monitors the energy output and performance of on-site solar panels. A cloud-based data analytics platform provides dashboards, KPIs and outlier reporting to quickly pinpoint and resolve issues that would otherwise drive excessive consumption or negatively impact customer comfort.
[Graphic: Siemens San Francisco City Performance Tool study]
Siemens itself has committed to being carbon neutral as a company by 2030, and it is preparing to lead in distributed energy products and services. For all these reasons, it is in a unique position to support customers as they take steps toward greater energy efficiency, resilience and, eventually, energy independence by running their buildings as microgrids. It is going to take such global powerhouse companies scaling energy efficiency at a fast pace if the world is to meet recently legislated Climate Action goals.

The big companies are defining their terrain in this new era of Climate Action like snow grooming machines on a ski slope in a blizzard.  But, they are leaving plenty of space for innovative start-ups to roll their own snowballs through this landscape to amass a growing business of their own. …I’m sure that metaphor came from me hoping that Northern California will get sufficient snow in the winter again, once we start taking action to reduce the earth’s GHG blanket.

Posted in MARKET ANALYSIS, NEWS