Haystack and IoT World Tread Common Ground


From Saddlebrook to Silicon Valley, everyone is talking about using data to bring order, efficiency and transparency to the operational challenges of running buildings.

It's a new day. Visibility is good, as dust from previous storms and battles has settled across the plain. Observers wait in the shadows. An imposing hero figure steps into the open. Everyone knows that whatever went on before—that's over. And something new has just begun. The story of the new sheriff bringing order to a lawless place is a common plot for Western films.

It feels like the commercial buildings industry is at a 'New Sheriff' moment right now. The power of data to bring transparency, greater security, fairer market competition and rapid change to buildings was the main theme at Haystack Connect in Tampa in early May. Advancements in contributing technologies like wireless connectivity, edge computing, analytics and machine learning were well covered in the IoT Architecture Symposium that ran during the IoT World Conference in Santa Clara in the middle of the month. I was at both and heard speakers and exhibitors deliver similar news: the business practices that have kept the buildings industry seven to ten years behind manufacturing, processing, transportation and other industries when it comes to data-driven operations are about to meet their High Noon. It is no longer just a Wild Bunch working on data interoperability; the biggest companies in IT and OT want standardization and less friction in data flows. Soon, for a Fistful of Dollars, building owners will be able to integrate and analyze the digital data streaming from any piece of building equipment, against whatever key performance indicators (KPIs) they want to monitor and manage.

Project Haystack's third biennial gathering brought together experts in operational technology (OT)—specifically the equipment and controls manufacturers, software vendors and systems integrators that have been at the core of open-source metadata tagging standards. The big presence of new Board-level member Intel signaled that dominant forces in IT were merging onto the same path. There were keynote presentations from Rita Wouhaybi, who is guiding Industrial & Energy Solutions Architecture for Intel's Internet of Things Group, and Milan Milenkovic of IoTSense, IoT Technology Strategic Advisor to Intel. Each offered a stage-setting market landscape:

Dr. Wouhaybi made the point that, when it comes to data modeling, the needs for semantic schemas and label dictionaries across the various IoT market segments (cars, cities, homes, the energy grid, factories) overlap. Buildings sit at the center of it all, so Project Haystack lessons, knowledge and tag sets have potential for sharing and adoption across the board.

Milenkovic’s landscape was of all the IoT interoperability standardization organizations. His point was that the various bodies need to build better bridges between their definitions to achieve higher rates of adoption. He explained that some organizations are working on syntactical interoperability and others, like Project Haystack, are working on semantic interoperability. There is still so much work to do on both fronts. He put out a call for collaboration: the path forward is for the various standards organizations to add interoperability to their charters and reach out to one another.

Two new significant bridge-building efforts were presented to Haystack attendees:

Dave Robin, research engineer with Automated Logic Corporation, past chair of the BACnet standards committee and longtime leader of its Network Security Working Group, made the ASHRAE 223 announcement. He gave some detail about the mapping mechanism to Haystack and other ontologies. The EdgeX Foundry announcement was made at the Dell booth, as Dell contributed the initial micro-services and tens of thousands of lines of code to seed the effort. EdgeX seeks to give anyone building an edge device a 'clean' architecture to plug into.

Anno Scholten, President of Connexx Energy, also spoke to the coming era when well-defined reference architectures will bring order to the terabytes of time-series data that will be collected for a multi-story building. Metadata tagging systems like Project Haystack dictionaries ease navigation of all this data. He used the metaphor ‘Digital Twin’ to describe the end goal—a dynamic software model that can be used to analyze and predict building systems performance. He sees all the performance and energy modeling that design engineers do before a new construction or major retrofit project is built, and all the actual time-series data that is collected once it is operating, contributing to this Digital Twin.
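
To make the tagging idea concrete, here is a minimal sketch (in Python, with made-up point names and tag values rather than any particular project's dictionary) of how a Haystack-style tag set describes a point and turns a huge point list into something you can actually query:

```python
# Minimal sketch: Haystack-style tags held as plain dictionaries (illustrative only).
# Each "point" carries marker tags (True) and value tags that give the raw
# time-series its context: which site, which equipment, what it measures.
points = [
    {"id": "p1", "site": "HQ Tower", "equip": "AHU-1",
     "discharge": True, "air": True, "temp": True, "sensor": True, "unit": "degF"},
    {"id": "p2", "site": "HQ Tower", "equip": "AHU-1",
     "zone": True, "air": True, "temp": True, "setpoint": True, "unit": "degF"},
]

def query(points, **required):
    """Return the points whose tags match every required tag/value."""
    return [p for p in points
            if all(p.get(tag) == val for tag, val in required.items())]

# "Show me every discharge-air temperature sensor on AHU-1" becomes a tag
# filter instead of a hunt through cryptic point names.
for p in query(points, equip="AHU-1", discharge=True, temp=True, sensor=True):
    print(p["id"], p["site"], p["equip"], p["unit"])
```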

The participation of big architectural engineering firms as well as building commissioning/energy management firms at Haystack Connect also contributed to that new-sheriff-on-the-beat feeling. While the metaphor of the Digital Twin is a good way to personify the kind of performance authority that a complete data model will represent, the engineers that building owners hire as their trusted advocates in making technology decisions are performance authorities in the flesh. John Petze and Marc Petock of Project Haystack led an 'Engineers and End-Users Panel' that included Matt Schwartz of Altura Associates, Ben Talbot of DLR Group, Zachariah Nobel of Constellation, and Rob Murchison of Intelligent Buildings, LLC. These are the types of firms and people blazing the trail toward data analytics platforms that enable monitoring-based commissioning and better energy management.

The panel talked about proof-of-concept projects that used Haystack-compliant software and edge devices to balance 'hot-path analytics' (acting on data as it is being generated on the edge) and 'cold-path analytics' (analyzing select data in the cloud). Alper Uzmezler's presentation addressed implementing Haystack from the cloud to the edge. And the Sedona Alliance, a new community of developers working toward lightweight BAS suitable for analytics on the edge, announced its formation.
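
As a rough illustration of that split, the sketch below shows one way an edge device might act on each sample immediately (hot path) while queuing everything for later batch analysis (cold path). The alarm threshold, point name and in-memory queue are invented for the example, not drawn from any of the panelists' systems.

```python
import random
import statistics

CLOUD_BATCH = []     # stand-in for records queued for cloud (cold-path) analysis
HOT_LIMIT_F = 140.0  # illustrative alarm threshold

def on_new_sample(point_id, value_f):
    """Hot path: act on each sample as it is generated at the edge."""
    if value_f > HOT_LIMIT_F:
        print(f"ALARM {point_id}: discharge temp {value_f:.1f} F")  # immediate local action
    CLOUD_BATCH.append((point_id, value_f))  # defer deeper analysis to the cold path

def cold_path_rollup(batch):
    """Cold path: periodic analysis over accumulated history (here, a simple mean)."""
    values = [v for _, v in batch]
    return {"count": len(values), "mean": statistics.mean(values)}

# Simulated stream from one edge device
for _ in range(100):
    on_new_sample("ahu1-dat", random.uniform(100, 150))

print(cold_path_rollup(CLOUD_BATCH))
```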

These were just a few highlights at the Haystack event. I’ll be covering more of the content presented in the next edition of Haystack Connections magazine, to be published soon. ControlTrends has posted video recaps of each day of the conference that show even more of the action. Harbor Research’s Adam Hise wrote his own reflections on the Haystack Connect event here. He too describes a palpable sense of ‘New Sheriff in Town’ among the Haystack crowd:

"Systems integrators shared anecdotes of clients who, faced with the option of a proprietary system or a BACnet compliant offering that was $100K more expensive, were effectively forced into a long-term contract for a closed system. The same SIs could hardly contain a grin as they predicted, 'they won't get away with that anymore.'"

Interestingly, some of the biggest proprietary-protocol-protected OEMs are also touting digitalization, aka the coming of the Digital Twin, as an opportunity to renew buildings and other industries and to remake their businesses from the inside out. The stage at the IoT Architecture Symposium during IoT World was an opportunity to learn about that. For example, Suhas Joshi, Director, Honeywell, presented on merging new IoT tech with legacy C&I equipment. In the Q&A, Joshi was asked about handling 'contextual' data versus 'global' data—another way of saying metadata tagging versus the large time-series data stores. (See Anno Scholten's Digital Twin presentation linked above.) Joshi answered, "Certain markets are working on this. Look at Project Haystack." On the same stage, Johnson Controls VP & GM of Data-Enabled Solutions, Sudhi Sinha, presented on how such a big industrial company goes about deciding who to partner with for data services among tech-stack behemoths like Google, Amazon and Microsoft. Even companies the size of JCI must consider how to maintain power, leverage and future growth opportunities when they decide which cloud provider should store and secure their customers' building data. Data is where the value resides. 'Who are you going to entrust with it?' becomes a very strategic question. The IoT Architecture Symposium also included an EdgeX Foundry presentation—just one more point in common with Haystack Connect.

Jim Lee, CEO of Cimetrics and one of the original BACnet authors, was present at the Haystack event. He has just launched the New Deal blog with collaborator and market-mover Anto Budiardjo, who has long been dedicated to facilitating dialog between the building systems industry and commercial building professionals. 'New Deal' is their metaphor for the Day of the New Sheriff, the Dawn of the Digital Twin, the Pivot Point of the Paradigm Shift. There are already some articles well worth reading on the blog. I highly recommend Building Blocks For The New Deal:

"The New Deal is built on three critically important building blocks: BACnet open standard, model-based analytics, and service transparency."

This new blog is another answer to Milan Milenkovic's call for bridge-building between the industry organizations working on data interoperability, in the interest of finally moving more swiftly toward better buildings and toward all the Internet of Things product categories that rely on the contextual data streaming from them.

 

Posted in MARKET ANALYSIS, NEWS

Will DevOps Culture Come to Smart Buildings?


If software is eating the world, then software development operations should be of consuming interest to everyone—particularly to stakeholders in Smart Buildings and other Smart Systems. We should have already learned the lesson that product development needs to start from the perspective of end-users—building owners, facilities staff and occupants. The last generation of building automation technology offered enough instances of unplugged BMSs sitting in corners to make us wary of not-my-piece-of-cake reactions to new technology. How can the industry do better going forward? Or, as Brad White of SES Consulting asked in April, "How do you spec a good user experience?"

Smart Building product developers could get in sync with enterprise software innovators and embrace the concept and culture of DevOps. Modern DevOps methodology emerged in response to the rise of open-source software, mobile computing and cloud architectures. Tasks like version control, configuration management and compliance testing have become more complex, while pressure to release software products more rapidly, frequently, and reliably is increasing. The one new point of leverage is that collaboration tools are a whole lot better. So, DevOps culture was born out of the idea that software engineers and their IT operations colleagues can manage the greater complexity and shortened schedules by working together more seamlessly.

 

 

One expert on the topic, Coté, describes DevOps as "the term people use when they mean 'doing all that new stuff, in new ways.'" That's helpful? The DevOps Wikipedia page describes it as connecting software engineers and IT staff in a tight DevOps cycle or tool chain.


To add the essential ingredient of customer empathy, user experience (UX) designers are also core participants in DevOps. Kai Brunner, Principal Designer for continuous delivery enterprise software at Electric Cloud, describes the dynamic this way:

“Designers have to not only adapt, but evolve proactively to continue leading with creative decisions and transform the ‘power struggle’ into the ‘power of collaboration,’ the same way development and operation teams are achieving this in the DevOps culture.”

DevOps, User Experience Design (UX Design), Customer Success Management—is all this actually new? Or is DevOps culture just a bunch of new names for classic software development and marketing principles and roles? TBD. But it's worth considering the fit for our industry.

DevOps & Analytics

DevOps methodology may find its opening in Smart Buildings with the deployment of analytics and visualization programs. In recent years, some building retrofit project teams tried to make an end-run around IT when deploying analytics. For the most part, that strategy has proven short-sighted. Typically, somewhere up the line a decision-maker has a not-my-piece-of-cake reaction, and the analytics project is stymied. In his March article "Analytics Creates Transparency," Jim Lee of Cimetrics makes a good case for avoiding the end-run. He offers this framing:

“See how analytics brings value to building owners and operators, after all, they are our customers. The scope of value-creating features of analytics for owners is growing: energy management, FDD (Fault Detection and Diagnostics) and continuous commissioning were a few early areas enabled by analytics. More recently, we are crossing into corporate functions such as compliance, forensics, vendor and asset lifecycle management. On the horizon are more valuable areas for enterprises: targeting occupants to be more efficient, productive, cognitively alert and deliver the best for the enterprises which employ them.”

There are more and more companies in our industry that, like Cimetrics, see customer IT departments as their allies and partners in satisfying building owner and occupant end users. You could say that they've embraced the DevOps philosophy of 'turning the power struggle with IT into the power of collaboration.'

Of course, Cimetrics is a 3rd-party analytics software developer; so, it is innovating software development practices all the time. Also, the DevOps concept of a continuous cycle of planning and doing is coded into its DNA. Is a DevOps toolchain so different from a monitoring-based commissioning (MBCx) cycle with analytics software like Cimetrics Analytika at the core?

[Figure: monitoring-based commissioning (MBCx) cycle with analytics at the core]

There are many different types of companies in our Smart Building/Smart System industry including OEMs, control system vendors, controls contractors, master systems integrators, energy management and commissioning consultants and service & maintenance firms. What can DevOps culture offer them?

Deploying 3rd-party analytics to the satisfaction of owner and occupant end-users takes customization of the interface for the unique building or campus under management, and sometimes it means developing a custom app. So, some of the most forward-thinking players in each category are recognizing that they need to evolve and funnel their controls expertise into a software engineering role. Training in the DevOps methodology and tool chain could help them bring structure to this new role.


CMMS & Visualization Support Tools: DevOps and UX Design by Another Name

Again, ‘DevOps culture’ is in many ways just a trendy way to describe practices and tools that are already familiar to Smart Building/Smart System professionals under different names. For instance, computerized maintenance management systems (CMMSs) have been around a while, supporting the same types of activities listed in the DevOps tool chain. A CMMS helps users track who did what to which piece of equipment when, etc. BASSG’s Sparko software package, for example, is a CMMS that works with SkyFoundry’s SkySpark analytics system. Sparko will map specific SkySpark rules to operations & maintenance personnel or departments inside a company or to external team members like an automation contractor or MEP engineer. Basically, it gives users the flexibility to get corrective action in motion sooner. Tying analytics to action in this way fits neatly into the DevOps playbook. BASSG is sometimes described as ‘DevOps partner to Smart Building System developers.’
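
The routing idea is simple enough to sketch in a few lines. The rule names, roles and work-order fields below are hypothetical and are not Sparko's or SkySpark's actual data model; they only illustrate the pattern of mapping analytics findings to the people who can act on them.

```python
# Hypothetical sketch of routing analytics rule "hits" to responsible parties.
# Rule names and assignees are invented, not SkySpark/Sparko identifiers.
ASSIGNMENTS = {
    "simultaneousHeatCool": "controls_contractor",
    "leakingValve":         "onsite_maintenance",
    "meterDataGap":         "mep_engineer",
}

def route(rule_hit):
    """Pick the team responsible for a given analytics finding and open a ticket."""
    assignee = ASSIGNMENTS.get(rule_hit["rule"], "facilities_manager")  # default owner
    ticket = {
        "rule": rule_hit["rule"],
        "equip": rule_hit["equip"],
        "assigned_to": assignee,
    }
    print(f"work order -> {assignee}: {rule_hit['rule']} on {rule_hit['equip']}")
    return ticket

route({"rule": "simultaneousHeatCool", "equip": "AHU-3"})
```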

This year's Control Trends Awards included the category '3rd Party Visualization Vendor of the Year,' which was won in a tie by BASSG and DG Logik. These companies provide HTML5 platforms for the development of custom user interfaces for smart system software. Good design principles are baked into their templates, component libraries and menus. Every small firm creating an energy or comfort performance optimization app for its building-owner customers might not have a User Experience (UX) designer on the team, but with these 3rd-party visualization platforms they are still empowered to make information as easy and compelling to interpret as possible.

Impact of Continuous SW Delivery Schedules on Marketing

I have spent my career in enterprise software marketing and, for me, what is new about DevOps culture is the idea of letting go of strict product release schedules. Back in the day of print and paper, we timed the release of various customer and market communications with the phases of product development. White papers and trade press articles that described the upcoming market need were released when development was just getting underway; product description brochures and software manuals had to be ready in time for beta release; advertisements and website copy were needed as part of the general-availability launch, and so on. Today, not only is software being developed and released on a continual basis, communications with the end customer are on-demand too. Digital publishing, social media and search tools have evolved to help marketers cope with the new realities. Webinars and podcasts are part of the new mix too. And soon there will be chatbots and other AI-enhanced aids to help reach customers with messages when they are ready to hear them.

As with DevOps, the tools to communicate about a new product introduction get better, but the noise, distractions and pressures increase too. I think we are left with the same problem: how to make the customer a partner in the cake baking, so that everyone has a sense of anticipation up to the moment it is delivered, and a sense of satisfaction when they try it.

Posted in ANALYTICS, MARKET ANALYSIS

Appreciating How Metadata Makes AI Possible


One of the most re-watched episodes of the comedy series Seinfeld is 'The Label Maker,' in which Elaine's Christmas gift to a friend is re-gifted to Jerry before the Super Bowl. True, a thing for tagging other things was not a fun or romantic gift in 1995. And perhaps Bryan Cranston's fictional dentist didn't get it. But Julia Louis-Dreyfus's Elaine was way out ahead in her thinking. Label making is important! If you are a data wrangler today, you should appreciate any gift that helps you tag things with metadata labels.

Frank Chen of Silicon Valley venture capital firm Andreessen Horowitz (a16z) presents a timeline in his AI and deep learning mini-course that happens to plot label-making's journey from white-elephant gift to tech's newest cool thing. Released to all interested students in June 2016, it is a fantastic history lesson and primer on what is happening in artificial intelligence (AI) today. He writes:

“One person, in a literal garage, building a self-driving car.” That happened in 2015. Now to put that fact in context, compare this to 2004, when DARPA sponsored the very first driverless car Grand Challenge. Of the 20 entries they received then, the winning entry went 7.2 miles; in 2007, in the Urban Challenge, the winning entries went 60 miles under city-like constraints. Things are clearly progressing rapidly when it comes to machine intelligence. But how did we get here, after not one but multiple “A.I. winters”? What’s the breakthrough? And why is Silicon Valley buzzing about artificial intelligence again?

The same AI entering cars is impacting buildings too. Listen to Ken Sinclair discuss the surprising rate of innovation in his latest ControlTrends interview. Chen answers his own question this way: more compute power, more data, better algorithms and more investment. His research colleague at a16z, Benedict Evans, explores the topic of labeling in more depth in the blog post AI, Apple and Google. Here are some key excerpts:

  • So you can say to your phone ‘show me pictures of my dog at the beach’ and a speech recognition system turns the audio into text, natural language processing takes the text, works out that this is a photo query and hands it off to your photo app, and your photo app, which has used ML systems to tag your photos with ‘dog’ and ‘beach’, runs a database query and shows you the tagged images. Magic.
  • Try it without labels (‘unsupervised’ rather than ‘supervised’ learning).  Today you would spend hours or weeks in data analysis tools looking for the right criteria to find these, and you’d need people doing that work – sorting and resorting an Excel table with a million rows and a thousand columns, metaphorically speaking.
  • The eye-catching speech interfaces or image recognition are just the most visible demos of the underlying techniques.
  • The important part is not that the computer can find them, but that the computer has worked out, itself, how to find them.

Did you catch that? The speech and image recognition technology may be superficial eye-candy compared to the feat of putting together the underlying knowledge graph. In other words, how you classify and label objects is at the core of how well your AI works. Knowledge graphs for the World Wide Web are the domain of semantic web researchers. Three leading professors in the field from the University of Zurich, Rensselaer Polytechnic Institute, and Stanford University collaborated on the September 2016 article, A New Look at the Semantic Web. Here are some key excerpts from this long-form editorial:

  • Bringing a new kind of semantics to the Web is becoming an important aspect of making Web data smarter and getting it to work for us. Achieving this objective will require research that provides more meaningful services and that relies less on logic-based approaches and more on evidence-based ones.
  • Crowdsourcing approaches allow us to capture semantics that may be less precise but more reflective of the collective wisdom
  • We believe our fellow computer scientists can both benefit from the additional semantics and structure of the data available on the Web and contribute to building and using these structures, creating a virtuous circle.

Labeling, edge computing, artificial intelligence—these are three pieces of the same puzzle, and the puzzle seems to be coming together very fast right now. (Don't miss the recent slideshow from another a16z thinker, Peter Levine, on how edge computing will soon eclipse the cloud.) The concepts and timing that Silicon Valley's a16z thought leaders describe are as applicable to buildings as they are to cars, dogs and beaches. And the academics leading the semantic web conversation are saying that the mark-up languages and metadata schemas are coming from all corners of the web, not just ivory towers. Frank Chen points out that the latest Google image recognition algorithms can chow down on the entire collection of videos on YouTube. But when they do, they get a graph that skews in favor of cats doing funny things, which doesn't reflect the real world. Whatever you want to call the undergirding ML labeling technology that does the classifying—knowledge graphs, metadata schemas, neural nets—the versions that work best reflect the collective wisdom and first-hand evidence of people with physical-world experience.

This brings us to Project Haystack, the open-source organization launched in 2011 and devoted to developing a standard mark-up language and tagging schema for devices in commercial buildings. Given how important it is to AI to get standardized labeling right the first time, it is no surprise that academia and big IT picked up on the Haystack schema when they launched the Brick schema. One way to look at it is that more industry, academic and government energy, focus and money is being invested in label-making than ever before—what a gift! Seinfeld's Elaine Benes would be such a supporter if she were here today. And even the dentist who became Walter White of Breaking Bad would not under-appreciate it. I hope more of those who hold the evidence and wisdom to contribute get involved. Silo-ing data was the way business was conducted in the last innovation cycle, but it won't work going forward in the age of AI and machine learning (ML).

Another reason to do your part: tomorrow, there may not be chief marketing officers and chief technology officers, but rather chief labelers of marketing things and chief labelers of technology things, etc. The labeling of training data for machine-learning algorithms is about to consume us all—at least everyone that works with computers, mobile phones, and Internet-of-Things devices. So, best to get ahead of the game.

Posted in ANALYTICS, MARKET ANALYSIS

Vendor Lock-in of Your Data Could Happen Again: Protocol Wars are Moving from Cable to Cloud

Today’s Smart Building Data Exhaust May Be Tomorrow’s Machine-Learning Gold

Smart Building Owners should remain vigilant in the battle against vendor lock-in of their building data.

"Take back the Internet!" is the Net Neutrality movement's rallying cry against big telecom and cable companies. "Take back your life!" say digital privacy advocates in warning against the online advertising industry. "Take back your enterprise!" is how the Open Source software movement positioned itself vis-a-vis big-brand business software. And "Take back your building!" is how the open-protocol building automation community has framed its alternative to protocol lock-in by big OEMs.

None of these battles started yesterday, and none of them will end tomorrow. Caveat Emptor, a principle of business so infernal and eternal that you have to say it in Latin, applies: If the people using data and investing in data tools don’t get educated, vendors are going to act in their own self-interest. The battle against building data lock-in could be moving to a new arena, from serial cabling protocols to the Application Program Interfaces (APIs) defined by Cloud-hosted applications. Let’s not let that happen.

Yesterday’s BMS Protocol Wars

The history of the building controls industry is marked by stages of vendor lock-in. When the buildings industry first went digital decades ago, BMS manufacturers wrote their own protocols to transport data over serial cable. They made these protocols proprietary, providing keys only to technicians within their own sales channels, with the result that they held a near monopoly on future service and upgrades. Those service revenue streams were worth more than the original price of the systems, so it became standard practice to price low during the initial bidding process with the intention of charging captive customers more later. This led to complacent, slow-to-innovate vendors and unhappy customers who felt they were being milked with every service call. These are all good lessons that no one wants to relearn.

BACnet and other open communications standards were meant to put an end to this syndrome and bring market forces back. With open standards, controls contractors could work on equipment from multiple brands, and customers could again run competitive bids for their business. The lock-in practice died slowly though, as Darren Wright, a Director at Arup, explains here. For a while, OEMs could claim they were BACnet-compliant while still maintaining proprietary protocols at the BMS controller level. The era of the Internet of Things is expected to finally bring the serial-cable protocol wars to their conclusion. But a similar approach to vendor lock-in, or an even worse one, could take hold with closed APIs within cloud services.

An Open Future Demands Open SDKs

In the interest of Caveat Emptor, any project team evaluating cloud solutions should ask vendors, "Will near-realtime and historical trend data be available to us when we want it?" If the answer is "We have APIs," consider that a red flag. APIs are not enough! If a company is holding your building's HVAC, lighting, power meter or other operational data in its cloud, the chance that you will be able to access that data in a practical way via its APIs is quite slim. What you want to ask cloud service vendors is, "Do you supply free Software Developer Kits (SDKs) in open source languages?" If you opt only for cloud services that provide this assurance, your data won't be held for ransom when the inevitable day comes that you need it for your company's own internal purposes or to supply as data streams to other vendors. Here are a few more guidelines:

  • Ask about SDKs for the cloud server side as well as for the client or edge device side.
  • The SDKs should be available in Java, C++, client-side JavaScript and any other open languages popular for the particular service.
  • The file format should not be cryptic. Definitions of the file format are needed, even if the format is open source. Such definitions are widely available; for example, open-source parsers exist for YAML and JSON.
  • If the cloud application uses a custom file format, the source code to parse and compile it should be supplied in every language the SDKs cover.
  • Ask if any other company is using the particular cloud vendor’s APIs, and confer with the references provided.
The Project Haystack open source community provides a good example of the type of Software Development Kit support needed to ease data interoperability in a full range of use cases.
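
As a sketch of what practical access can look like, the Haystack REST API defines operations such as read and hisRead. The snippet below assumes a hypothetical Haystack-compatible endpoint and an already-issued auth token (real servers typically require an authentication handshake first), and pulls yesterday's history for a filtered set of points.

```python
import requests

BASE = "https://example-bms-cloud.com/api/demo"  # hypothetical Haystack-compatible endpoint
HEADERS = {
    "Accept": "application/json",             # many servers default to Zinc (text/zinc)
    "Authorization": "BEARER authToken=xyz",   # placeholder credential for the example
}

def to_ref(json_ref):
    """Convert a JSON-encoded Haystack ref ('r:abc-123 Display Name') to '@abc-123'."""
    return "@" + json_ref.split(" ", 1)[0][2:]

# 1. Find points of interest with a Haystack filter expression
pts = requests.get(f"{BASE}/read",
                   params={"filter": "point and discharge and temp and sensor"},
                   headers=HEADERS, timeout=30).json()

# 2. Pull yesterday's history for each matching point
for row in pts.get("rows", []):
    his = requests.get(f"{BASE}/hisRead",
                       params={"id": to_ref(row["id"]), "range": "yesterday"},
                       headers=HEADERS, timeout=30).json()
    print(row.get("dis", row["id"]), len(his.get("rows", [])), "samples")
```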

Predictive Value in Data Exhaust

As more and more sensors and digital devices are introduced into buildings, digitizing more physical processes at tighter intervals, it is a natural impulse to look for ways to hand off the burden of managing all that data. Except for the tip-of-the-iceberg analytics results, new data just seems like new problems—so much data exhaust with no real value. But when a cloud service takes your data and starts observing operations and applying algorithms, new value is being created. The vendor has a right to its portion of that value, but the newly calculated data should also be readily accessible to the customer, i.e., the building owner. Software companies may be a step ahead of facilities managers and building owners when it comes to machine learning and artificial intelligence concepts; however, customers too want to get all the predictive value possible from their buildings' digital streams. As we covered in our September article, it is imperative that building owners take the reins of their operational data strategy today. That is how they can position themselves for success as we move into an era of more autonomous buildings.

Deciding among platforms for energy management and building operations management is challenging. There are so many. But, the lessons that the buildings industry has learned from the past should inform next steps. The general software industry offers its lessons as well. All the critical plumbing components in enterprise business stacks have moved toward open source software.

The home automation industry offers its own parable illustrating why you shouldn't entrust your building data to a single vendor with closed APIs and data feeds. When Google Nest shut down, or 'bricked,' the Revolv home hub in April 2016, it rattled the tech industry at large and the emerging IoT industry in particular. It's notable that Google parent company Alphabet's next step was to open source the Thread protocol that Nest uses for communications.

The takeaway lesson: don't be lured into a walled garden by a shiny interface, a low initial cost, and promises that your data will be well taken care of by a cloud vendor. Ask questions to ensure your data will be open to your future uses. Weigh the pros and cons of on-premise, in-the-cloud and at-the-edge computing and storage, and negotiate a fair price for keeping as much of the resulting data as is practical and affordable. It may be the cache you need for machine-learning training in the future. And the motivations of your in-house IT department and/or professional hosting services are better aligned with your own than a cloud vendor's when it comes to how your building data is best stored and secured.

The IoT era promises a new beginning and faster innovation cycles, but not necessarily simplicity — at least not in this early transition time. The buildings industry should arm itself with the right people and resources to navigate the complexity and not fall prey again to vendor lock-in as it did in the early 1990s.

Posted in ANALYTICS, NEWS

Climate Action, Digitalization and the Opportunity to Reenergize the Buildings Industry


Climate laws and agreements are being enacted, and big building technology companies like Siemens are positioning to come out heroes

It has been an eventful few weeks for Climate Action watchers. From tree-huggers to eager capitalists, there are a lot of people happy to see moves that will curtail fossil fuel burning. California signed SB 32 into law, which commits the state to reduce its greenhouse gas emissions to 40 percent below 1990 levels by 2030. A critical mass of countries formally committed to the Paris climate agreement at the UN General Assembly in New York (India may sign this week). And voters in the USA pressed presidential candidates to detail their positions on the issue of Clean Energy leadership.

Amory Lovins, Chief Scientist, Rocky Mountain Institute, holds that climate protection can be profitable. But, for which countries, regions and companies, and in what time frames? There will be winners and losers along the way as these laws and agreements are translated into actions. Certainly today’s biggest building technology companies want to be among the winners.

Lovins spoke at VERGE 16 in Santa Clara, CA, on Sept 20th. On stage with him was Rainer Baake, State Secretary at the German Federal Ministry for Economic Affairs and Energy. Baake reported that Germany is on the right trajectory to meet its target of cutting carbon emissions by more than 80 percent by 2050. A theme of the plenary session was that, if you want to see tomorrow’s clean energy system, look at Germany today. Complementing Secretary Baake’s presence at VERGE 16 were break-out sessions featuring Siemens Building Technologies (BT) Division.  Their message was ‘If you want to see what BEMS (building energy management systems) will look like tomorrow, look at what Siemens is doing today.’ 

Like Germany, San Francisco has a target of reducing CO2 emissions by 80% by 2050 (against a 1990 baseline). The city collaborated with Siemens on a holistic study released during the conference that applied its City Performance Tool (CyPT). The analysis included recommendations on how to allocate resources toward energy-efficient buildings, clean energy, and a multi-modal transport network. Cities and utilities that want to scale building efficiency programs fast look to partner with companies that have the breadth of businesses, the financial resources and the long-standing customer relationships of a Siemens. As shown in the CyPT graphic below, the company has product and service offerings to fill almost every box in the chart.

A growing business inside Siemens BT is the Building Performance and Sustainability Group, serving Infrastructure and City customers. Director of Energy & Sustainability Solutions Ari Kobb led a session on Siemens Energy Data Management services. He noted that Siemens has been moving customers to 'digitalization' on both the demand side and the supply side of the energy equation for years. His group is now able to quickly bring that digital information together as data services. Of course, today there are numerous big industrial companies and IT companies striving to compete for the same enterprise data management business. With competition among the giants fierce, the pressure is on to deliver easy access to data, a positive user experience and reduced complexity. The big building equipment OEMs now have extensive experience working with the data streamed from each other's systems, which is neutralizing protocol lock-in and making the practice more and more a thing of the past.

Siemens is also learning from its customers in these large-scale enterprise engagements. Another Siemens BT Managing Director, Kimberly Brisley, presented on her work with a life sciences company. Leaders from tightly-regulated process industries like this customer have data science at the core of their culture. They have goals and suggestions for new ways to derive value from energy and building operational data. Siemens is helping them achieve these, and then taking back the learning, translating it into new digital solutions to add to its suite.

While the sensing, software and networking technologies needed to build the digital solutions listed by Siemens in the San Francisco City Performance study have all been generally available for years, the business models, real estate practices and laws have not encouraged fast adoption. It’s notable that researchers for this analysis found 0% of buildings doing monitoring, only 4% doing performance optimization, and 0% doing remote monitoring. In other words, the market for such building data services is truly in its infancy.

Among building types, retail stores have been early adopters of Remote Monitoring. The Siemens VERGE expo featured demonstrations of its Site Controls Energy Management System (EMS) led by Carl Anello, National Accounts Director, and Aaron Moore, National Accounts Executive, of the Retail Group. EMS reduces energy usage at stores by monitoring and/or controlling key energy-consuming devices such as HVAC (heating, ventilation and air conditioning) units, indoor lighting, store signage, indoor and outdoor temperature sensors, and refrigeration units. The EMS also monitors energy output and performance of on-site solar panels. A cloud-based data analytics platform provides dashboards, KPIs and outlier reporting to quickly pinpoint and resolve issues that would otherwise drive excessive consumption or negatively impact customer comfort.
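
Outlier reporting of that kind boils down to comparing each store against its peers. The sketch below flags stores whose daily energy use sits more than two standard deviations above the fleet mean; the numbers and threshold are invented for illustration and say nothing about how Siemens actually implements it.

```python
import statistics

# Illustrative daily kWh per store; not real data
daily_kwh = {"store_101": 820, "store_102": 790, "store_103": 1460, "store_104": 805,
             "store_105": 840, "store_106": 812, "store_107": 798, "store_108": 835}

mean = statistics.mean(daily_kwh.values())
stdev = statistics.stdev(daily_kwh.values())

# Flag stores running more than 2 standard deviations above the fleet mean
outliers = {s: kwh for s, kwh in daily_kwh.items() if kwh > mean + 2 * stdev}
print(f"fleet mean {mean:.0f} kWh, stdev {stdev:.0f} kWh")
print("review for stuck equipment or schedule overrides:", outliers)
```
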
[Figure: Siemens City Performance Tool (CyPT) study for San Francisco]
Siemens itself has committed to being Carbon Neutral as a company by 2030, and is preparing to lead in distributed energy products and services. For all these reasons, it is in a unique position to support customers as they take steps toward greater energy efficiency, resilience and eventually energy independence by running their buildings as microgrids. It is going to take such global powerhouse companies scaling energy efficiency at a fast pace, if the world is to meet recently legislated Climate Action goals.

The big companies are defining their terrain in this new era of Climate Action like snow grooming machines on a ski slope in a blizzard.  But, they are leaving plenty of space for innovative start-ups to roll their own snowballs through this landscape to amass a growing business of their own. …I’m sure that metaphor came from me hoping that Northern California will get sufficient snow in the winter again, once we start taking action to reduce the earth’s GHG blanket.

Posted in MARKET ANALYSIS, NEWS

Recapping Realcomm/IBcon 2016


As a Silicon Valley local, I was asked multiple times during last week's Realcomm/IBcon conference in San Jose whether I was a fan of HBO's Silicon Valley TV series. Of course I am! Anyone following the show knows that the latter part of the Season 3 storyline hinges on the right metric to gauge the success of a software app. For the fictional Pied Piper app, the choices were number of New Downloads (NDs) versus Daily Active Users (DAUs). That got me thinking about the right metric for measuring momentum in Intelligent Building awareness. Is it the number of new faces drawn to this 18th annual event, i.e., the NDs? Or the number of exhibiting companies and session speakers that come back every year, i.e., the Smart Building industry's DAUs? In either case, Realcomm/IBcon 2016 Silicon Valley set new records.

I started noting all the new people and some new metrics at the Smart Building Integrator Summit, one of the eight pre-conference events. Jacob Jansen, managing director of HC RT of the Netherlands and a member of the InsideIQ Building Automation Alliance, presented on ‘The Edge’ office building in Amsterdam. ‘The Edge’ is widely recognized as a world-leading project in sustainable design and incorporates a Schneider Electric Smartstruxure backbone to connect systems and people.


According to the MSI of The Edge, even today's most advanced Smart Building is far from the full potential of the embodied concepts. He estimates that, if Smart Building innovation were a journey from Amsterdam to Rome, we're approaching the city of Cologne right now. 'All roads lead to Rome,' whether you are driving toward a building with the greatest comfort, operational efficiency or micro-grid capabilities. For one description of what it will be like once we get there, listen to the interview with Terry Casey of Intellastar. Tune to about minute 4:30, when he says, 'Hey, as a control engineer, this is going to be fun!'

Looking back at The Edge design-build cycle, Jansen offered these six success factors:

  • Include all partners from Day #1 of the project, especially your master system integrator and IT/data specialists
  • Build on a single IP backbone
  • Put the data in the cloud for easier collaboration and highest security
  • Deploy a wireless sensor network that will support a full range of location services
  • Make it easy: A single app for all occupant services for the whole building
  • Aim for continuous performance improvement guided by data analytics that pull in historical data, real-time data and predictive algorithms

That list was a stage-setter for the rest of the Integrator’s Summit, and the breakout sessions over the next two days of the conference.

Another 'new face of the Master Systems Integrator' (MSI) was Paul Maximuk of Ford Land Energy, attending his first Realcomm event. Paul's big challenge at Ford has been converging all of its facilities globally onto one standard app for energy reporting and continuous performance improvement. He combines equal parts Facilities and IT know-how and gave some sage advice when fielding Panel Moderator Scott Cochrane's question, "How are we screwing up the world with the IoT?" Maximuk replied, "We can give a lot of data to a lot of people that don't need it. Plan for the possibility of a malicious insider. Assign access levels to data. As a general rule, share only enough information for each employee or partner to do their job. And no shadow IT! All new solutions need to be run through IT processes."


Smart Building Integrator Summit Panel: Eric Stromquist of Stromquist & Co.; Paul Maximuk of Ford Land Energy; Alex Waibel of BuildingLogix; Scott Cochrane of Cochrane Supply.

Maximuk and his team at Ford, as well as Jansen and The Edge project team, won Digi awards this year. Another winner was the Massachusetts Institute of Technology (MIT) in the category of Most Intelligent College Campus. MIT has relied on Clockworks™ from KGS Buildings for real-time monitoring and automated FDD since 2010. You can see the full video of the award ceremony here.


Pook-Ping Yao of Optigo Networks addresses how to securely integrate building services on an IP backbone network at the Integrator Summit pre-conference.

 


John Petze of SkyFoundry Analytics addresses solution categories in the marketplace – FDD, ADR, CMMS, BIM and more.

In the category of familiar faces, John Petze of SkyFoundry gave an overview of Smart Building and IoT topics and vocabulary, including the Project Haystack metadata tagging methodology. One of his messages is that a comprehensive IoT data strategy needs to support analytics performed at the "edge" as well as in the cloud. This message was underscored on the expo floor, where the EAC Energy Analytic Controller line from BASSG, the Iotium iNode, the Intellastar family of T-Star Controllers, and the Dell Edge Gateway 5000 Series were all demoed. Lynxspring was showing its Edge Data Pump for Haystack as well as its JENEsys Edge controllers.

Better cross-discipline knowledge on the part of the MEP engineering firms that spec IoT was another common topic at the Integrator's Summit. Toward this goal, the Realcomm/IBcon 2016 organizers extended invitations to a number of prominent engineers this year. David Kaneda of the Integral Group presented on the topic of Net Zero design. He owns, and helped design, Silicon Valley's first Net Zero Energy project over a decade ago. Now Integral Group's global network of engineers has provided building system design and energy analysis services for some of the most energy-efficient and sustainably built projects on the planet. Kaneda went to the Intelligent Buildings Boot Camp chaired by the Intelligent Buildings Ltd team on pre-conference Tuesday and spent a lot of time visiting all the booths at the Expo. His question: "Where are all the other engineers?" I think we will see more MEP firms at future Realcomm/IBcon events.


David Kaneda presents on how measurable energy efficiency is designed into buildings. Derek Jones of Navigant moderated this session and Landry Watson of DPR Construction, Kevin Bates of Sharp Development and Carrie Brown of Resource Refocus also presented.

Some of the most active of the Smart Building industry's DAUs were the team from Hepta Systems. Jason Houck and Etrit Demaj of Hepta were on stage moderating and presenting on topics ranging from IoT for Buildings and Smart Sporting Venues to a deeper dive into Data Platforms. They brought the message that systems integration should happen on a single IP network and that the user experience should be accessible under a single pane of glass. You can watch the demo they showcased on the expo floor in this interview with the father-son team.


Terry Casey of Intellastar LLC presents during the session covering Operating and Data Aggregation Platforms. Brian Oswald of CBRE/ESI as well as Michael Marcotte of SAP America and Haritharan Gunasignham of Eutech were also on the panel.


Ken Smyers moderates the panel on HVAC and the Connected Integrated Building with Lindsay Baker of Comfy, Matt Eggers of Yardi, Kevin Facinelli of Daikin, and Gary Kohrt of Iconics.

A motivator that reaches exaggerated levels in Silicon Valley, and at Realcomm/IBcon, is FOMO, the Fear of Missing Out. (FOMO drives the plot line in the HBO series this season.) There is so much going on at once! The organizers have eased the anxiety by incorporating a TV station, so we can catch some of what we missed and review what we liked later on YouTube. Realcomm's Gerry Katzman and the ControlTalk team have done an outstanding job of interviewing. Josh Bradshaw also covers the conference and some of the Friday building tours on his blog worktechwork. You should ease up on the FOMO, watch all of the great recordings from Realcomm/IBcon 2016, and plan to attend in 2017.

Posted in NEWS, Uncategorized

IoT Makers in Legoland

Like kids set loose in a roomful of Legos, IoT product developers around the world are making brand new things to connect to the Internet out of smartphone components and open source software stacks. Makers are taking advantage of the supply chain built by the mobile computing industry. They have low-cost compute devices like Arduino and Raspberry Pi as well as affordable sensors (accelerometers, gyroscopes, magnetometers, GPS, temperature, etc.) at their disposal. Open source organizations and big tech companies are trying to harness Maker Movement power by sponsoring hackathons, contests and test bed sites. Even the US Department of Energy now sponsors a crowd-sourcing initiative with the specific goal of jumpstarting citizen building technology innovation. DOE's JUMP program brings innovators together with national laboratories and private sector partners to help them test their ideas and secure funding. The impact that Internet of Things makers are going to have on Smart Buildings shouldn't be underestimated. Here are a few of the concepts to watch:

Design-thinkers at Arup are experimenting with all sorts of emerging technologies. They’ve created desks with features such as extra low-voltage DC power charging for mobile devices. And they’ve deployed rapid manufacturing techniques including open-hardware 3D printing to realize their concepts.

Why adjust temperature across a whole space when occupants can adjust for their comfort themselves through their furniture? That’s a question that launched a project at the Center for the Built Environment, University of California at Berkeley, which evolved into the PCS Hyperchair from Personal Comfort Systems. This personal heating and cooling chair adjusts to the sitter’s preferences like a car seat. Chairs include Bluetooth and WiFi connectivity to share useful data about occupant preferences with HVAC systems. Chair settings can be controlled via a mobile app or directly.
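
As an illustration of how a connected chair might share its settings, the sketch below publishes an occupant preference over MQTT using the paho-mqtt client. The broker address, topic and payload schema are invented for the example rather than taken from the Hyperchair's actual interface.

```python
import json
import paho.mqtt.client as mqtt  # pip install paho-mqtt

BROKER = "broker.building.example"        # hypothetical building message broker
TOPIC = "building/floor3/zone12/comfort"  # invented topic hierarchy

# paho-mqtt >= 2.0 requires the callback API version; omit the argument on 1.x
client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect(BROKER, 1883)

# Invented payload: the occupant's current preference, which an HVAC
# supervisor could aggregate per zone instead of reacting to hot/cold calls.
preference = {
    "device": "chair-0042",
    "seat_heater": "on",
    "fan": "low",
    "preferred_temp_f": 72,
}
client.publish(TOPIC, json.dumps(preference), qos=1)
client.disconnect()
```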


The PCS Hyperchair is covered in this CityLab article: "To Save Big on Energy, Heat People, Not Air." There are a few other IoT Maker-Movement inventions in that story as well. Providing occupants with connected furniture might seem like a big expense. However, manual thermostat overrides in response to occupant hot/cold calls are often the first step on the slippery slope toward control programming that no longer functions. Simultaneous heating and cooling, full-blast services overnight and on weekends, and other common patterns of energy drift in buildings start here. Compared to the cost of that waste, these chairs could be an economy.

More IoT use cases are showcased in this video about new digital services at The Edge in Amsterdam, touted as the world’s “greenest office.” Check out the occupant mobile app for room booking/environmental control and the QR code-activated system for lockers.

The price of entry into the Maker Movement is low and Do It Yourself (DIY) assembly is easier than it has ever been. VizLore presents its simplified vision of the management of EnOcean sensors and switches in this video. EnOcean devices require no cable and no battery. They can be used to switch lights, control blinds and get sensory information like temperature, humidity or presence detection. Data captured by the IoT devices can be managed through VizLore Cloud Services.

Makers look to open source communities for the connectivity software to integrate their IoT devices with existing systems. To build viable products for commercial building and industrial settings, they need to be concerned about more than just connecting through a standardized communication protocol: they need to design for security, use standard semantics for naming, and accommodate efficient transport of metadata. Connectivity is not data interoperability. VizLore lists the open-source initiative Project Haystack as one of the protocols it supports to streamline semantic data modeling.

Two complementary open-source initiatives aimed at data interoperability among industrial Internet of Things (IIoT) devices announced their intentions to cooperate this month, just in time for Hannover Messe 2016. Both are based on the publish/subscribe (pub-sub) communications familiar to mobile app users; however, the OPC Foundation supports OPC UA and the Object Management Group supports DDS. The agreement between the two organizations clears up some market confusion that was believed to be holding back IoT developers and buyers. In a move that should increase market confidence in a future of more seamless data interoperability, Microsoft just announced extended support of the OPC UA open-source software stack in Windows devices and the Azure cloud platform. This announcement should please the makers.

This IoT wave is more about empowering people than fitting within existing enterprise departments and practices. Its adoption curve could resemble the BYOD (bring your own device) movement that has transformed enterprise IT over the last ten years. Prior to BYOD, IT staff could hold to the policy of only supporting hardware and software that they selected, procured and distributed. But when everyone from the CEO on down joined BYOD, IT departments eventually had to acquiesce and work with their vendors to figure out how to license applications and provide data security for a much larger range of devices and more mobile workforces. Building operations managers could be under similar pressure to change as occupants demand more personalized control over their temperature, lighting and other digital services made possible by this Maker Movement.

Posted in MARKET ANALYSIS, NEWS