MatrikonOPC OPC Exchange

Archive for the 'OPC Specifications' Category

OPC, Smart Grid and I2G

Friday, February 5th, 2010


The National Institute of Standards and Technology (NIST) recently issued an initial list of standards and other elements needed to support an interoperable smart grid. The NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 1.0, concentrates on standards that will help achieve interoperability among devices and systems. OPC Unified Architecture (UA) is one of the few non-industry-specific standards to make the list.

The whole process of choosing which standards make the cut requires a lot of effort to ensure that those chosen meet industry-specific needs and have sufficient vendor support to encourage a market of compatible products. To coordinate work within the SmartGrid community towards achieving interoperability, NIST has coordinated the formation of Domain Expert Working Groups (DEWGs). The DEWG members are subject matter experts representing utilities, vendors, academia, industry, standards groups, and federal agencies. The working group most relevant to OPC UA is the Industry-to-Grid (I2G) collaboration. As an OPC UA vendor, MatrikonOPC is an active part of the I2G Working Group.

I’ll be attending the upcoming ARC forum in Orlando, where Keith Stouffer, the co-chair of the I2G Working Group, will be presenting on topics related to SmartGrid initiatives. If you’re going to be at the conference and are looking to learn more about OPC applications for the SmartGrid or about the I2G Working Group, let me know. Hope to see you there.

If not, you can always read more on our OPC and the SmartGrid page.

Express Interface Xi

Friday, October 30th, 2009

Happy Halloween to all the ghouls and goblins out there.  Halloween is really a time of excitement, expectations and a little uncertainty.  What’s in the treat bag? Candy or apples?  Who’s that behind the mask?  It’s that combination of anticipation and the unknown that makes Halloween fun.  The same can be said for new interfaces. The recent news about the .NET based Express Interface (Xi) and the inclusion of Xi as part of the OPC Foundation portfolio is bound to raise feelings of hopeful anticipation about new connectivity options, and probably some questions as to how everything will fit together.

The coming weeks will bring information on how OPC Xi is moving forward and where it best fits in users’ architectures.  For now, I’m most concerned with what I’m going to wear to the Halloween costume party, and what effect copious amounts of junk food will have on my kids.  What about you?


Have a Safe and Spooktacular Week-end!

P.S.  As a Halloween treat there are a few more questions added to the Ask The Experts pages. 

Is OPC UA as Simple as OPC DA?

Wednesday, July 22nd, 2009

I’ve been away on vacation for the last few weeks, and will be on the road again for the next few weeks.  In the meantime a few more questions have been added to the “Ask The Experts” section.

Speaking of questions, I had a query from Gary Mintchell regarding comments he’s heard about how OPC UA should be ‘simple’ like DA. (Look for some upcoming OPC discussions from Gary at Automation World).

I thought I’d share some of my thoughts on the topic, since I’ve come across this more than once myself.


The first thing to consider is your perspective on the matter. There is a difference between being “simple to use like DA” and “simple to develop like DA”.

The end-user experience of starting an OPC UA client application, discovering the list of available servers, connecting to the OPC UA server, browsing for available points and subscribing to value updates does not change very much from what we do today with classic OPC.  What does change is that this process now has more built-in security, reliability and integration of other data models like history, alarms and conditions, and programs.  Also, the infrastructure is no longer tightly dependent on Microsoft operating systems and the challenges of DCOM.  Of course, these additions mean that OPC UA product developers now have some more work to do.
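To make that client workflow concrete, here is a minimal, self-contained Python sketch. To be clear, this is not real OPC UA (or classic DA) API code: `DemoServer`, its methods and the point names are hypothetical stand-ins that only model the discover → connect → browse → subscribe sequence described above.

```python
# Illustrative model only -- not an actual OPC UA SDK. All names here
# are invented to show the discover -> connect -> browse -> subscribe
# sequence a client walks through.

class DemoServer:
    """A toy 'server' exposing a flat address space of named points."""
    def __init__(self, name, points):
        self.name = name
        self._points = dict(points)      # point name -> current value
        self._subscribers = {}           # point name -> list of callbacks

    def browse(self):
        """Return the available point names (the 'address space')."""
        return sorted(self._points)

    def subscribe(self, point, callback):
        """Register a callback for value updates on one point."""
        self._subscribers.setdefault(point, []).append(callback)

    def write(self, point, value):
        """Simulate a process value change and notify subscribers."""
        self._points[point] = value
        for cb in self._subscribers.get(point, []):
            cb(point, value)

# 'Discovery': in real OPC UA this would query a discovery endpoint.
available = [DemoServer("Sim.PLC1", {"Pump1.Flow": 0.0, "Tank1.Level": 12.5})]

server = available[0]                    # 'connect' to the first server
print(server.browse())                   # browse -> ['Pump1.Flow', 'Tank1.Level']

updates = []
server.subscribe("Pump1.Flow", lambda p, v: updates.append((p, v)))
server.write("Pump1.Flow", 42.7)         # simulated process change
print(updates)                           # -> [('Pump1.Flow', 42.7)]
```

The point of the sketch is that the client-side sequence is the same for classic OPC and OPC UA; what changes underneath is the security, transport and data-model machinery.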


To put it in everyday terms: the mechanics of driving an ’82 Dodge K Car and a 2009 electric Tesla Roadster are the same, but how they are designed, manufactured, maintained and work under the hood is VERY different.  The same applies to OPC DA and OPC UA.


With OPC DA, a C++ programmer with a good understanding of COM could download the 200-page OPC DA 3.0 specification and basically start coding.  A bit of a simplification, but that one document bounded what the programmer had to implement.  A programmer sitting down to develop an OPC UA server opens a layered set of specifications broken into thirteen Parts. These documents are purposely described in abstract terms, and in later Parts are married to existing technology on which software can be built. The programmer also has to consider options such as programming language, security, information model, etc.  (Not saying that’s a good or bad thing, just stating some facts.)  For many people, their first reaction is ‘this is complex’. The discussion of how ‘simple’ or ‘complex’ OPC UA is, is really a reflection of the difference in scope between OPC DA and OPC UA.


The classic OPC specifications were COM implementations, so the constraints of COM dictated many implementation details, including target operating system, discovery mechanism, wire protocol, security, etc. Ten years ago, developers were mostly concerned with solving the interoperability problem, so they accepted these constraints in order to achieve an acceptable standard.  As the OPC Foundation website states, “The existing OPC COM based specifications have served the OPC Community well over the past 10 years, but as technology moves on so must our interoperability standards.”


Users and developers now require more. Several factors influenced the decision to create a new architecture:

  • Microsoft has deemphasized COM in favor of cross-platform capable Web Services and SOA (Service Oriented Architecture)
  • OPC Vendors want a single set of services to expose the OPC data models (DA, A&E, HDA …)
  • OPC Vendors want to implement OPC on non-Microsoft systems, including embedded devices

Let’s look at each of these factors and the impact each has on the scope of OPC UA.


Choosing Microsoft COM as the basis for classic OPC meant that many decisions were already made for the developer, but it also brought with it all the configuration pains of DCOM, close reliance on Microsoft platforms and limited ‘web’ application integration.  Selecting a service-based model for OPC UA provides cross-platform functionality and removes the reliance on any one vendor or technology.  Ten years from now, when the protocols used by Microsoft, IBM or Linux change (and they will), OPC UA applications will not need to be rewritten; only the underlying mappings will need to change.  This abstraction adds scope that OPC DA did not have, but on the other hand, not being bound to any particular technology means that the OPC UA specifications will be timeless.


OPC UA stands for ‘Unified Architecture’, which encompasses all the classic OPC specifications: DA, HDA, A&E, Commands and Complex Data.  So comparing OPC UA to OPC DA is a bit apples to oranges.  The base OPC UA specifications contain the common components to integrate all these features.  Again, this is a larger scope than OPC DA, and developers need to understand what is included in the base and what is Access Type specific.  That said, not every OPC UA server will be required to implement all thirteen Parts.  OPC UA provides multiple ‘Profiles’ that allow developers to choose the right level of functionality for their application, yet still ensure that a base level of interoperability exists with all OPC UA products.


OPC UA has been designed to be cross-platform and scalable from embedded devices all the way to Enterprise-spanning applications.  Offering this level of flexibility while at the same time guaranteeing a usable degree of interoperability means developers must make some decisions on the target programming language (C, .NET, Java) and communication stack (Binary, TCP, XML) their OPC UA products will support.  In classic OPC, COM dictated these things, but with OPC UA developers have more choices. The OPC Foundation provides multiple SDKs, communication stacks and sample code to accelerate adoption, but some vendors may choose to implement these lower layers on their own.


Put all these factors together, and the result is this: since OPC UA offers all the functionality of the classic OPC specifications plus new features, and removes many existing constraints, the structure and depth of material to absorb in learning OPC UA is greater than for the OPC COM specifications. Or, as some people say, “OPC UA is not as simple as DA”.


The focus of the OPC UA Working Group over the last few years has been to ensure that the specifications and supporting deliverables meet all the criteria discussed above, while ensuring certifiable interoperability and backwards compatibility and providing increased reliability and security.  Producing a ‘simple OPC UA quick-start guide for the new developer’ was not a main priority.  Now that the specifications are nearing completion, the Early Adopter team is validating that things work as expected when the ‘paper becomes code’, and OPC UA vendors are developing their own products, the priorities are changing.


The next phase of OPC UA is ensuring that developers have what they need to successfully implement and adopt OPC UA. There is a large segment of the OPC community saying, “As a first step, we just want to provide our existing OPC functionality on the OPC UA infrastructure.  What do we need to know to do that?”  It’s not really a matter of ‘changing’ the OPC UA specifications to ‘make them simpler’; rather, it’s presenting the specifications, documentation and code deliverables in a form that meets this important first-step use case.


That is the focus of the newly formed “Accelerated Adoption Working Group”.  This group is working to create the documentation, OPC UA Profiles and jump-start code kits that allow product developers to quickly understand what aspects of OPC UA are required to duplicate their existing classic OPC functionality.  These implementations will still have all the core components needed for interoperability and for adding extended functionality in the future.


Under the hood, OPC UA is still a powerful ‘Swiss Army Knife’, but if all you want to do is cut something with the big blade, those are the only steps you need to follow. You don’t need to know how the corkscrew works or where it is.  And if you do want to use it in the future, you don’t need to build a new knife; the functionality is there waiting to be opened.


My $0.02


Those interested in learning more about OPC UA should check out “OPC UA: 5 Things Everyone Needs To Know”.

OPC Ask the Experts – New Resource Section

Wednesday, June 17th, 2009

Based on the feedback I get both on and off-line, the audience of the blog really consists of two main groups of OPC users.  One group knows OPC and is interested in how others are using OPC in their systems and in keeping abreast of what’s happening with OPC and OPC UA.  The other large segment is those new to OPC who are looking for general information or an answer to a specific question.  To better address the needs of the latter group, we’ve added a new resource section to the blog: “Ask The Experts”.

Here’s the list of the first ten questions, and others will be added over the next little while.

As always if you have any OPC questions, drop me a comment or an e-mail.

Introducing Coffee Break OPC

Thursday, April 16th, 2009

This is the 200th posting for the OPC Exchange blog, so I wanted to do something a little different.  We’ve been kicking around the idea for ‘Coffee Break OPC’ for a while now.  Everyone is very busy, and few people have much time to spend on learning new things, or even the time to find out what they should be making time to learn!  Everyone takes a few minutes a day for a coffee break, so why not use those five minutes to learn a little bit?  The animation format lets you read along if you want, or just sit back and listen, and hopefully enjoy it a bit.  I did my best to keep things more interesting than a PowerPoint bullet list.

Let me know what you think, and what other topics you’d like to hear 5 minutes on.

01. Coffee Break OPC by MatrikonOPC

 If you do have more time to spend and would like to get more in depth information, here’s a list of resources:

One MILLION dollars…

Wednesday, April 8th, 2009

I just can’t say that without quirking one eyebrow and bringing my pinky to the corner of my mouth.  I blame Mike Myers.  It’s been a busy few weeks, with lots of stuff happening on the OPC front.  Since I’ve been negligent in getting blog posts out, I’ll list them all now…

  1. The blog post refers to MatrikonOPC donating $1 million in kind to the NAIT Johnson Controls Centre for Building Environment Technology.  MatrikonOPC has always had a close relationship with NAIT, and it just makes sense that the technologists of the future are well versed in real-world tools like OPC that are used in industry today.  Could you consider this a stimulus package that helps students become successful HVAC specialists?  I assume this includes free copies of Extending Building Automation Data Visibility Using OPC for every student. :)
  2. The Live Multi-Vendor OPC UA Demonstration at the MatrikonOPC Houston User Group. This demonstration will showcase OPC UA and recently released UA components. Don’t miss it.
  3. The OPC Foundation TAC has approved the Charter for the “OPC UA Accelerated Adoption Working Group”.  This is the ‘working group for the second generation of the OPC Unified Architecture’ that was mentioned in the latest OPConnect newsletter.  More details on this very soon.
  4. OPC UA Java Software Development Kit now available:  The first beta for the OPC UA Java SDK is now available for download.  A lot of Corporate Members have been waiting on this SDK for some time now.  Let the Java fun begin.

That about brings everyone up to date for now.  On a personal note, this counts as the 199th blog post for OPC Exchange!  I’ll have to do something special for number 200.

Top O’ The Mornin’ To Ya!

Monday, March 16th, 2009

…Or Afternoon or Evening or Night.  Whenever and wherever you are reading this, anytime on or around St Patrick’s Day is a good time for Irish salutations and libations (like a Guinness or Murphy’s of course). In the spirit of the day, here’s an Irish toast to live by… 

May you have the hindsight to know where you’ve been,

The foresight to know where you are going,

And the insight to know when you have gone too far.


To best apply that to OPC, you might want to read about reducing risk in complex OPC projects. :)


May your St Patrick’s Day be safe and happy and may the Guinness flow free. Speaking of free… For those that haven’t heard, MatrikonOPC now offers a free OPC HDA Explorer test client.  Feel free to try it out (preferably before the free flowing Guinness).


Beannachtam na Feile Padraig!

OPC HDA Questions and Tools

Thursday, March 5th, 2009

I’ve been getting a lot of questions about OPC HDA this week, and there are a few on the OPC Foundation forums as well.  Maybe it has to do with the OPC Interoperability conference going on right now, or maybe people are working on their OPC UA servers.  In any case, if you have an HDA question you’re looking to get answered, drop me a line.

If you’re looking for an HDA test client, check out the new MatrikonOPC HDA Explorer.  Our developers have been using it as an internal test tool for years, and it’s finally been given a user-friendly overhaul.  Check it out.

Critical Thinking in Engineering OPC

Thursday, May 22nd, 2008

I often follow the forums, and this latest thread asking “Where is the Critical Thinking in Engineering” caught my eye.  One of the posters follows up with several good questions to ponder…

  • Where is the critical thinking?
  • What is the role of critical thinking in Engineering as a profession?
  • Where does it come from in the development of a competent engineer or technical specialist? Is it taught? Demonstrated, or merely stumbled upon?

This particular topic refers to Engineering as a whole, and sprouted from an originating topic on a dubious perpetual motion machine patent.  (Let’s not talk patents, shall we.)  The subject matter got me thinking about the role of ‘critical thinking’ in OPC architectures.  There has been a lot of news lately focusing on OPC certification, the independent test labs and interoperability testing.  That’s all great stuff, since you can’t build a good OPC network without robust building blocks, but a good network also demands good thinking.  I’ve said it before, and I’ll say it again: a solid, interoperable OPC network requires informed input from those that know and work with OPC.  With more OPC UA products hitting the market every day, this is becoming even more important.  Regardless of what OPC flavor you are using, you need to be working with a trusted vendor who understands your requirements and has the products and services to meet your needs.  OPC has done amazing things in leveling the playing field for system interoperability.  However, no protocol, technology or product can remove the planning and understanding needed to create industrial-strength connections between different systems.

The OPC Foundation and its members know this to be true, and are working on ways to make it easier for end users to find these knowledgeable vendors.  On the OPC Foundation front, the creation of the ‘SI&D’ (System Integrators and Distributors) category is a first step.  On the vendor side, initiatives like the MatrikonOPC Integrator Program are designed to ensure system integrators have access to the necessary OPC architecture and design experts, products, training and support for successful project implementation.  It’s all about education and communication on what works and how OPC fits best in your system.  (Without this, you have people getting the wrong impression, like those Carl recently posted on.)

Where is the critical thinking in terms of OPC?  It’s with those that know and work with OPC every day.  Where does it come from?  Is it taught?  Demonstrated or stumbled upon?  In a word…  Yes.

Open Standards and Paying-to-Play

Sunday, May 4th, 2008

The ongoing discussion on ‘Is OPC really considered an open standard if you have to pay for the specifications?’ has recently gathered steam again on the forums and other sites.  I posted on this topic last summer, and the same arguments and counter-arguments are still going around.  Most center on two points: the definition of an open standard and the definition of a nominal fee.  As I’ve said in the past, these are really the wrong things to be focusing on, since there is no single definition accepted by everyone.  Wikipedia has a good explanation of the various open-standard definitions.  The one thing most agree on is that open means available to everyone without discrimination, without a ‘pay-to-use’ or royalty model.

Where things are a bit gray is that some definitions of open do allow ‘pay-to-produce’ fees, and in what fees are acceptable to charge.  Some folks argue that not everyone can afford the set fees, and are therefore being discriminated against.  A grad student or a one-man integrator shop has a different view of ‘nominal’ than a multi-million-dollar global vendor.  No method is perfect.  Even the W3C, which is ‘open to everyone’ and ‘free to access and use’, has its critics.  So I say these arguments can’t really be won, since everyone finds a definition that supports their view.  The reason the OPC Foundation now requires membership to access the specs is an attempt at improving the overall quality and interoperability of OPC products.  I’m all for a debate on the topic.  But let’s talk about the real topic.  The question that should be asked is: ‘Does a pay-to-produce model improve overall quality?’

Personally, I don’t know.  What I do know is that the OPC specifications used to be available for free download, and there were many interoperability problems.  The message coming from a lot of end users was:  “We don’t want more OPC products, we want better OPC products.”  The stance the OPC Foundation has taken is that developing OPC products requires membership, which comes with access to the specifications and the associated code base and tools to help ensure a solid quality baseline.

In the early days of ‘classic’ OPC, there were many client applications built on OPC technology by non-members that did not measure up to expected quality and had many interoperability problems.  The argument that ‘more eyes are better’ and that an Open Source model will create a more reliable infrastructure doesn’t seem to be supported by OPC history.  Take the Automation wrapper and server sample code, for example.  The code was freely available, yet it didn’t evolve over time.  Many people just took the sample code and released products; good, bad or ugly.  This became one of the biggest contributing factors to OPC interoperability problems.  It was only after the OPC Foundation took back ownership and provided dedicated support for the wrappers that all the fixes got integrated.  Why?  Maybe it’s because the majority of the OPC community is vendors, users and integrators whose core competence is Industrial Automation. Does it work for something like Linux because of the sheer volume of developers in that community?  Now that the scope of OPC UA extends beyond just the Microsoft platform, does that mean there is a much larger pool of developers waiting to contribute?  It doesn’t seem to me that the team of volunteers working on the specifications and code has suddenly grown exponentially.

The OPC Foundation’s solution to the problem was to set the bar so only those individuals and companies that are committed to developing OPC products have access to the tools to do so.   Membership signifies commitment.  Adoption without verified interoperability is not adoption at all.  I suppose that OPC could move to a ‘Brand Licensing’ model such as Bluetooth uses.  The specifications are free to view by anyone, however only the Adopter Members are allowed to develop commercial products and use the brand name.  (Of course the other side of that is, what good is looking at a specification if you aren’t developing a product with it?  If I am a company developing a new headset or hand held peripheral am I really going to say “Well, we need a communication protocol to connect with all the PCs, phones and Blackberries on the market.  Let’s take a good look at the technical details of the Bluetooth protocol and see if we want to go with that or roll our own?”  Just my $0.02)

Is demanding membership to access the OPC specifications the right solution?  Someone like the IEC would say yes, whereas the W3C would say no.  The important thing is: what do you say?  As Randy pointed out on the forum, “OPC is a member driven organization and if members believe that the specs should be made available for free then these members need to make their feelings known”.  I’ll make it easy for everyone.  Post a comment that answers these questions:

  • Are you an OPC member?
  • Does ‘pay-to-play’ mean better quality products? (i.e. member products are built on reference code, interoperability tested and certified.)
  • Do you agree with the current policy of ‘pay-to-view’?
  • Why/Why not?