Pepco’s Monitoring-Based Commissioning Incentive Program

FAQs

What are Pepco Incentive Programs?

  • Pepco Incentive Programs are funded through the EmPOWER Maryland program and pay commercial and industrial customers for measures that reduce a building’s annual energy consumption, such as equipment upgrades, LED lighting retrofits, and tracking adjustments to a building’s operations via Monitoring-Based Commissioning by Datakwip.

What is the EmPOWER Maryland Program?

  • The EmPOWER Maryland program is funded through a surcharge on your monthly utility bill that is based on your building’s total energy use each month. The more energy you use, the more you pay. The charge appears on the second page of your bill, under a line item titled “Empower MD Chg”. The funds are then redistributed to customers that elect to participate in programs such as Monitoring-Based Commissioning (MBCx) by Datakwip.

Do we have to spend any money to get the Monitoring-Based Commissioning rebates?

  • In some cases, there may be up-front costs. With the MBCx program, these are relatively small, and the customer is always provided with the information they need to choose how they would like to proceed at each stage of the program. The goal of the MBCx program is to find the most valuable opportunities that require little or no capital investment. At each stage of the program, customers are given the option to cancel without penalty.

What if we’ve already taken advantage of other incentive programs like lighting upgrades?

  • The MBCx program can be performed before, after, or concurrently with other incentive programs. Pepco encourages its customers to participate in as many programs as possible.  Even if you have done lighting and HVAC retrofits in the past, you are still eligible for the MBCx program.

Will this affect our tenants?

  • Not in a negative way. One of the additional benefits of the MBCx program is improved tenant comfort monitoring. No measure would ever be implemented that conflicts with your tenants’ leasing requirements.

Will we have to implement every energy saving measure found?

  • Building owners are never forced or obligated to implement any particular energy conservation measure. Additionally, if at any point after a measure is implemented a customer decides it is not feasible or is negatively impacting their building’s operations, they can opt out and simply forgo the incentive payment.

How do I get started?

  • It’s easy! Send us your last thirteen utility bills and your building’s total square footage and we’ll tell you if your building qualifies for MBCx incentives!

Click Here To Get Started!

Additional Details Available on Pepco’s Website

www.pepco.com/monitoring-based-commissioning

Openness is Key to the Future of Energy Efficiency

I attended a reverse trade mission event recently and had the opportunity to speak with foreign delegates representing nations in South America and the Caribbean about energy efficiency. Everyone attending was asked to introduce themselves to the delegates, speak a little about their company, and offer some advice. We were each given three minutes to speak; most took that or more. I only took 45 seconds.

I thanked them and the event host for the opportunity, introduced myself, and gave my standard explanation of what Datakwip is and does. Fifteen seconds tops.

Then my advice…

My advice to the visiting delegates was this: “When evaluating any technology to advance your nation’s goal of achieving smart buildings, smart cities, or any type of smart technology, openness should be priority number one. If proprietary systems are installed, not only will you be committed to that particular brand or manufacturer for years or decades to come, you will also risk missing out on the benefits of new technologies that could enhance or benefit from those proprietary systems, due to an inability to integrate.”

After the event, one delegate approached me and expressed how happy he was to hear what I had to say. He said that the group had recently met with representatives from large US-based corporations, who showcased their products for energy management systems, high-efficiency equipment, and general “Smart Building” products. He told me that in all of those conversations, not a single minute was dedicated to the topic of “openness”.

To successfully commit to any smart technology initiative, numerous technologies must be implemented, working seamlessly together to achieve the “smart building effect”. The only way that’s possible is with open, non-proprietary communication protocols, which will allow the many technologies from numerous manufacturers to work together.

I’m extremely grateful for the opportunity to offer my humble advice, and I hope that every decision maker and stakeholder in the commercial and industrial sectors, from a single building owner to the Chief Technology Officer of a nation, considers openness a top priority when evaluating “smart” technologies.

Resource for understanding and implementing open protocol Energy Management Systems

Contact us to learn how your existing systems can be leveraged to gain greater efficiency, in a single building or across an entire portfolio.

Sam Wilson, LEED GA, FMC
Director, Business Development
Datakwip, Inc.
www.datakwip.com
sam.wilson@datakwip.com
1.866.278.7915

Successful Analytics Start with Controls

Building controls system design is a critical element of successful analytics deployment. A single building can easily have more than 10,000 data points collected at 5 to 15 minute intervals, generating as many as 3 million readings each day. Over a year, that adds up to roughly 1 billion readings.
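As a rough sanity check on those figures, here is the arithmetic, assuming 10,000 points sampled at the faster 5-minute end of that range (illustrative numbers, not a specific deployment):

```python
# Quick sanity check on the data volumes quoted above (illustrative figures).
points = 10_000                # monitored data points in the building
interval_minutes = 5           # faster end of the 5-15 minute sampling range

samples_per_point_per_day = 24 * 60 // interval_minutes   # 288 samples per day
readings_per_day = points * samples_per_point_per_day     # 2,880,000 (~3 million)
readings_per_year = readings_per_day * 365                # ~1.05 billion

print(f"{readings_per_day:,} readings/day, {readings_per_year:,} readings/year")
```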

Understanding the Methods of Integration

There are numerous options for connecting analytics platforms to building control systems. Key items to consider include:

  • Analytics server location, which includes local and cloud servers
  • Data collection strategy, which includes live data polling, history syncing, and direct database connection

Best practices in controls architecture and specification that support analytics include physical-layer and data collection strategies, open communications protocols, point naming conventions, data modeling and tagging, and IT and security considerations.
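To make the naming and tagging piece concrete, here is a minimal sketch of a consistently named, tagged point, loosely following the Project Haystack convention (the site, equipment, and point names are hypothetical, not a Datakwip or vendor standard):

```python
# Hypothetical example of a consistently named, tagged control point.
# Tags loosely follow the Project Haystack convention; names are illustrative.
supply_air_temp_point = {
    "id": "BLDG-A.AHU-1.SA-T",           # naming convention: site.equipment.point
    "dis": "Building A AHU-1 Supply Air Temp",
    "siteRef": "BLDG-A",
    "equipRef": "AHU-1",
    "point": True,                        # Haystack marker tags
    "sensor": True,
    "air": True,
    "temp": True,
    "discharge": True,
    "unit": "°F",
}
```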

When you understand the potential integration methods between analytics and controls systems, and you apply these best practices in control system design, you can leverage the analytics platform to support new, best-in-class control sequences.

You might also consider analytics tools to support the traditional commissioning and retro-commissioning processes.

Taking Control of Analytics

We know that analytics is the discovery, interpretation, and communication of meaningful patterns in data. It is a structured approach to data collection, cleansing, statistical analysis, presentation, and conclusions.

When it comes to building analytics, we have to look at three key components: the overall number of buildings, the scope of the equipment being monitored, and the data analysis itself. Consider, for example, the frequency of data collection. It must be high enough to support useful interpretation, given how quickly the measured conditions are likely to change and how the data will be used.

Keep It Simple and Useful

With push-based notifications, you need to be as specific as possible. For example, the immediate problem might be that you suspect the valve or damper in zone 3 is affecting tenant comfort, while the future problem might be, based on your observations, that pump #3 has fewer than 1,000 operating hours left before it needs to be replaced.
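As an illustration of that level of specificity, a push-notification rule might look something like the following sketch (the equipment name, service-life figure, and threshold are hypothetical):

```python
# Hypothetical push-notification rule: a specific piece of equipment, a
# specific condition, and a specific recommended action.
def check_pump_runtime(pump_id, operating_hours, service_life_hours=40_000):
    """Return an alert message if the pump is within 1,000 hours of its
    assumed service life, otherwise None."""
    remaining = service_life_hours - operating_hours
    if remaining < 1_000:
        return (f"{pump_id}: roughly {remaining:,.0f} operating hours remaining; "
                "schedule replacement before it affects tenant comfort.")
    return None

alert = check_pump_runtime("Pump #3", operating_hours=39_200)
if alert:
    print(alert)   # in practice, pushed to the operator's phone or email
```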

Remember, all of these things provide an opportunity to save energy – and money – for the building owners.

In 2015, a sample of 16 central chilled water plants (chillers, fans, and pumps) showed an estimated 27 percent overall energy waste across all equipment in the second full year after commissioning. It’s always important to know that there is a way to save energy and money – you just need the right people and the right building automation to find it.

NOI Vs. Asset Valuation Increase

Whether you own a commercial office building, run a data center, manage a hospital, or oversee the operations of a school district, you have a budget and you have operating expenses. Chances are you’ve heard the phrase “low hanging fruit” used to describe savings such as upgrading to more efficient lighting fixtures, installing low-flow toilets, and other “swap & save” solutions that lower operating expenses. But what about the added benefits that come with lowering the cost to run your facility?

That’s where Net Operating Income (NOI) and Asset Valuation come into play. By taking measures to lower the cost of running your facility, you are, in fact, increasing its value.

ENERGY STAR® calculates that a 10% decrease in energy use could lead to a 1.5% increase in NOI — with even more impressive figures as the energy savings grow. In light of the current compression of capitalization (cap) rates (NOI divided by the sales price or value of a property expressed as a percentage), it is possible to turn pennies into millions.

For example, in a 600,000-square foot office building that pays $2 per square foot in energy costs, a 10% reduction in energy consumption can translate into an additional $120,000 of NOI. At a cap rate of 8%, this could mean a potential asset value boost of $1,500,000.
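The arithmetic behind that example, written out:

```latex
\begin{aligned}
\text{Annual energy cost} &= 600{,}000\ \text{ft}^2 \times \$2/\text{ft}^2 = \$1{,}200{,}000 \\
\text{Added NOI from a 10\% reduction} &= 0.10 \times \$1{,}200{,}000 = \$120{,}000 \\
\text{Asset value boost at an 8\% cap rate} &= \$120{,}000 \div 0.08 = \$1{,}500{,}000
\end{aligned}
```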

Ironically, that 10% reduction is the minimum energy savings estimate for an energy analytics software solution. For roughly the annual cost of one building engineer, a building using an energy analytics service can leverage its existing staff to take low- or no-cost action on high-cost operational issues, which decreases the strain on its operating budget and provides a healthy return should the facility be sold.

Office buildings with pass-through leases benefit as well, in the form of more competitive leasing rates. Suppose Building A leases for $51/ft² with $9/ft² in pass-through expenses, while Building B leases for $50/ft² with $10/ft² in pass-through expenses. The burden on the tenants is equal at a total of $60/ft², but Building B is taking in $1 less per square foot due to its higher operating expenses and must keep its rate lower to stay competitive. If Building B decreases the cost to operate its building by 10%, it can charge the same base rate as Building A and increase its profits.
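Written out, the comparison looks like this:

```latex
\begin{aligned}
\text{Building A:}\quad & \$51/\text{ft}^2\ \text{base rent} + \$9/\text{ft}^2\ \text{pass-through} = \$60/\text{ft}^2\ \text{tenant cost} \\
\text{Building B:}\quad & \$50/\text{ft}^2\ \text{base rent} + \$10/\text{ft}^2\ \text{pass-through} = \$60/\text{ft}^2\ \text{tenant cost} \\
\text{B after a 10\% opex cut:}\quad & \$51/\text{ft}^2\ \text{base rent} + \$9/\text{ft}^2\ \text{pass-through} = \$60/\text{ft}^2\ \text{tenant cost}
\end{aligned}
```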

By decreasing the cost to operate a facility, you make it more competitive in the marketplace for leasing, lease renewals, and a potential sale. Ultimately, a reduction in energy use, or in the cost to operate your facility, benefits everyone: the building owner, the leasing agent, the facility manager, the tenants (or hospital patients, data center clients, school district taxpayers, tuition-paying university students – we could keep going), and the environment.

The energy analytics experts at Datakwip are ready to show you how easy it can be to achieve greater efficiency in your facility. Visit us at www.datakwip.com or call 1-866-278-7915 to get started.

Efficient Inefficiencies: The Disparity between Investment and Performance

LEED, ENERGY STAR, utility rebates, market perception, lower operating costs, higher net operating income: there are many reasons to invest in energy efficiency. For most buildings, this comes in the form of major capital investments in infrastructure improvements. We operate mostly on the assumption that buildings are expensive to run because their equipment is old, and that something in recent history has made a newer option better. To be clear, this isn’t entirely wrong. Newer equipment does have the potential to operate more efficiently and improve our bottom line, but how direct is that relationship? Does investment always lead to return?

The chart below was compiled by the U.S. Energy Information Administration using information from the Commercial Buildings Energy Consumption Survey (CBECS). CBECS is performed roughly every four years and includes details such as what investments have been made in a building and how it performs from an energy usage standpoint.

Figure 1. Source: Bill Von Neida and Tom Hicks (U.S. Environmental Protection Agency), “Building Performance Defined: the ENERGY STAR National Energy Performance Rating System,” https://www.energystar.gov/ia/business/tools_resources/aesp.pdf

How to Read this Chart

I promise this chart is valuable, but first, a note on how to interpret it. The top of the chart is divided into four categories: ENERGY STAR Offices, CBECS Average, CBECS Upper Quartile, and CBECS Lower Quartile. These categories denote a building’s energy performance in terms of kWh per square foot. Working from right to left, the CBECS Lower Quartile buildings are the worst performers, the CBECS Upper Quartile buildings are the best performers, and the CBECS Average is the middle of the pack. ENERGY STAR Offices are special: these are buildings that have earned the ENERGY STAR designation. To receive the designation, a building must be among the best performers and apply to the ENERGY STAR program. In this way, ENERGY STAR buildings can be thought of as buildings from the CBECS Upper Quartile that made a commitment to track and publish their energy performance.

Within each category (column), each row shows the percentage of buildings in that category that have the feature denoted in that row. For example, looking at the first row, under “Construction,” “Concrete,” we see that 30 percent of ENERGY STAR buildings have concrete construction, 16 percent of CBECS Average buildings do, 10 percent of the CBECS Upper Quartile do, and 23 percent of the CBECS Lower Quartile do. One more example: if we want to know what percentage of the CBECS Upper Quartile have VSDs (variable speed drives), we simply find the intersection of the CBECS Upper Quartile column and the VSDs row, which gives us 19 percent.

Revelations

Let’s start by looking at energy management systems (EMS). An EMS is considered by many to be the de facto first step in achieving energy efficiency, and is pretty much standard in modern commercial buildings. However, if we look at the CBECS Upper Quartile (our best performers), only 23 percent have an EMS, while 56 percent of the Lower Quartile (our worst performers) do. Doesn’t this relationship seem inverted? Shouldn’t our best performers be equipped with the systems that we expect to help them get there? The picture is further blurred by the ENERGY STAR offices, 84 percent of which have an EMS.

Let’s move down the rows and examine economizers, VSDs, and motion sensors. None of these features categorized as “energy efficiency” is positively correlated with the performance of the buildings in which they are installed. Furthermore, we see that the features (and, as a result, the investment) installed in the ENERGY STAR offices are similar to those in the Lower Quartile (the worst performers), yet we know, by the definition of ENERGY STAR, that these buildings perform like those from the Upper Quartile (the best performers).

Confused? Good, that means you’re not crazy. Long story short, we see evidence here that investing in energy efficiency doesn’t guarantee performance (CBECS Upper Quartile vs. CBECS Lower Quartile). However, we also see that you can invest in energy efficiency and become an ENERGY STAR building.

Don’t take my word for it

I really wish I could take credit for seeing this myself, but I owe this article to the following quote, taken from the same report as the table above:

“This paradox – the apparent similarity of efficient equipment between the lowest and highest performing buildings – challenges a longstanding misconception that building efficiency can be defined by the presence of efficient equipment.” The report goes on to say, “… problems with energy efficient equipment is frequently a primary source of energy inefficiency.”

What does this all boil down to? What are we looking at here? In the end, this all adds up to a vicious cycle where we make investments in energy efficient equipment, but we don’t take the necessary steps to ensure that we achieve our expected return in the form of performance improvements.

At Datakwip, we call this Performance Drift, and it is one of the key enemies that we try to eliminate. Using a combination of machine learning algorithms, rules, and alerts, we monitor all the equipment in a given building and verify that it performs to the baselines needed for building owners to achieve the return on investment they expect. Visit us at datakwip.com, or email us at info@datakwip.com to learn more.
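As a simplified illustration of what a baseline check can look like, here is a minimal sketch (not Datakwip’s actual algorithm; the equipment names, baselines, and 10% tolerance are hypothetical):

```python
# Minimal sketch of a performance-drift check (illustrative only). Flags
# equipment whose measured energy use has drifted above its commissioned
# baseline by more than a tolerance.
def flag_performance_drift(baseline_kwh, measured_kwh, tolerance=0.10):
    drifted = []
    for equip, baseline in baseline_kwh.items():
        actual = measured_kwh.get(equip)
        if actual is None or baseline <= 0:
            continue
        drift = (actual - baseline) / baseline
        if drift > tolerance:
            drifted.append((equip, drift))
    return drifted

# Hypothetical weekly kWh figures
baseline = {"AHU-1": 1_200.0, "Chiller-1": 5_400.0}   # at commissioning
measured = {"AHU-1": 1_450.0, "Chiller-1": 5_500.0}   # this week
for equip, drift in flag_performance_drift(baseline, measured):
    print(f"{equip} is running {drift:.0%} above its commissioned baseline")
```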

Technological Convergence Enabling Real-Time Facility Analytics

In “The Sociology of Science,” Robert K. Merton concludes that “discoveries become virtually inevitable when prerequisite kinds of knowledge and tools accumulate in man’s cultural store and when the attention of an appreciable number of investigators becomes focused on a problem, by emerging social needs, by developments internal to the science, or by both.” Merton shared this insight in 1973, now summarized as the theory of “multiple discoveries,” with no knowledge of how technology would evolve. However, this idea, that discovery follows from the convergence of demand for a technology with our ability to implement it, explains many of our fastest growing industries today. Chief among those is energy and facilities analytics.

A long time ago, yesterday…

At first blush, “multiple discoveries” seems simple enough: more than one person is aware of a need, so more than one person attempts to build a solution. However, what is remarkable is the multiplicative effect that one need and its solution have on our ability to identify and satisfy other needs. Some say the first control systems were found in Egypt over 2,000 years ago in the form of water clocks (https://en.wikipedia.org/wiki/Control_engineering). Throughout the industrial revolution, the capabilities of control systems were further improved by the convergence of enabling developments in science and mathematics. The demand for and advantages of such systems were as obvious then as they are now: improved control, more efficient processes, better profits, increased reliability, and user acceptance. However, after the advent of pneumatic controls, the technology remained relatively stagnant even as demand, largely driven by the success of such systems, increased. It wasn’t until the advent of the transistor, integrated circuits, and ultimately direct digital controls (DDC) that we saw another huge leap in capabilities. Of note is that the transistor and its subsequent implementations were not designed for the explicit purpose of building automation or energy management. It was the convergence of demand and the availability of the necessary technologies that brought about the DDC revolution.

Fast forward to the early 1990s and we see the demand for remote management and 24/7 awareness converge with the enabling technologies of the internet. Today, remote connectivity, or at least the IP-based technologies that enable it, is considered a standard component of a modern building automation system. What’s more, despite its many varied forms (tenant leases, federal energy mandates, energy ratings and standards organizations, etc.), the original demand, to implement more efficient processes, drive profits, increase reliability, and win user acceptance, hasn’t changed in the last 2,000 years.

Today, and hopefully tomorrow…

So, we all know where this is going… I’m going to say we are at another amazing convergence of Big Data, IoT, open-protocol BAS, the internet and mobile applications, and machine learning, driving applications that we have yet to even imagine! (Oh, and by the way, Datakwip is the one who can do it all! Our marketing team refers to that as a “call to action,” and they get a little sour if I fail to include it. Box checked. Scowl averted.) Here’s the thing: all of that is true, but there are a couple of demands and enabling factors that are both critical and overlooked.

But first, a note on cost. Developing new technologies is expensive. Making sure they work right is even more expensive. Making sure they work right even when everything else is going wrong is REALLY expensive. The next great convergence is not just a convergence of possibilities, but a convergence of feasibilities. The technologies we see today not only enable us to do things we couldn’t do before; by lowering the financial barrier to entry, they enable more people to participate. Borrowing from Merton, “an [even greater] number of investigators [will] become focused on [this] problem.”

Serverless architectures

Services like AWS Lambda, AWS Kinesis, AWS Redshift, AWS S3, Azure Streams, Azure Blob, and Google Dataflow all have two things in common: they are data processing and storage tools, and they allow people to write applications instead of standing up and maintaining server farms. This is a really big deal. OK, it’s 2017, so you’re not going to go out and buy a server and put it on a rack in the back of your office. You’re going to go with offsite infrastructure; maybe it’s a private data center, maybe it’s a public cloud like AWS or Azure, but more than likely you’re still just turning on a Windows or Linux machine over the internet and logging into it just as if it were sitting right behind you. The problem is that you still, for the most part, must manage it. This means updating the operating system, configuring and maintaining security patches, and upgrading hardware, not to mention monitoring the actual application you brought online to serve your customers.

In a serverless architecture, however, you write your code once and then (literally) turn up the volume as needed. This means lower maintenance costs for the deployment and a shorter time to market, leaving more time for solution providers to talk to domain experts about the products they’re building. Even better, on many of these services, developers are charged based only on what they use. Imagine you simply want to check a data stream for temperatures above 75 degrees every five minutes: you pay fractions of a penny every five minutes, with no multi-thousand-dollar upfront investment in onsite hardware and no paying for unused compute time as in traditional cloud deployments. If that weren’t enough, these services have built-in mechanisms to increase the security and resiliency of your applications, no extra coding required. Imagine building the latest generation of energy analytics applications without having to seek major corporate funding, without having to hire 24/7 devops staff, and while still being able to leave your office more than once a month to interact with your end users.
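To make the temperature-check example concrete, here is a minimal sketch of what it might look like as an AWS Lambda function fed by a Kinesis stream of sensor readings (the payload schema, 75-degree threshold, and SNS topic ARN are assumptions for illustration, not a production design):

```python
import base64
import json

import boto3

# Hypothetical serverless temperature check: a Lambda function triggered by a
# Kinesis stream of sensor readings. Schema, threshold, and topic are assumed.
THRESHOLD_F = 75.0
ALERT_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:temperature-alerts"

sns = boto3.client("sns")

def handler(event, context):
    """Check each reading in the incoming batch and publish an alert for any
    temperature above the threshold."""
    records = event.get("Records", [])
    for record in records:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        temp_f = payload.get("temp_f")
        if temp_f is not None and temp_f > THRESHOLD_F:
            sns.publish(
                TopicArn=ALERT_TOPIC_ARN,
                Subject="High temperature reading",
                Message=(f"{payload.get('point', 'unknown point')}: "
                         f"{temp_f} F exceeds {THRESHOLD_F} F"),
            )
    return {"records_checked": len(records)}
```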

(Good) Open Source Software

Software licenses suck almost as badly as multi-thousand-dollar servers. There’s nothing worse than visiting a website, getting excited about how a new piece of software is going to help you or your customers, only to find out that you are looking at a six-figure investment just to kick the tires. Enter open-source software. The open source movement has been around for a long time, but over the last 5 to 10 years we have seen a deluge of extremely high-quality big data and IoT-oriented technologies become available. What’s even better is that many of these technologies, while open source, are also backed by full-service software companies. So, after you play around with these solutions, if you need to talk to the experts, you can sign up for a support agreement at a lower rate than “buying” a traditional software license from a major software company.

The IoT Movement

While I referred to the IoT movement somewhat sarcastically above, it is helping drive a multiple-discoveries convergence, just not in the way some people think. IoT represents the idea that devices will provide real-time data updates over the internet to various services, which will then derive additional value from that data. This is a critical concept; however, it’s not that much different from an internet-based DDC system. In fact, I may have used the phrase “I was doing IoT before it was cool” at least one time too many. Personal biases aside, the IoT movement and the corresponding media frenzy, industry hype, and customer hoopla serve a critical purpose: they raise awareness and generate need. As our friend Mr. Merton stated, we don’t just need technology, we need problems to solve with it. The IoT revolution, while in some ways more a marketing revolution than a technology revolution, makes people aware of our technological capabilities and gives them a context in which they can share the problems they believe technology may solve.

Wondering what technologies will converge and help your organization operate more efficiently?  Talk to Datakwip about our real-time machine learning powered energy and facility analytics today (scowl averted, check).
