This is the second installment in our 3-part series reviewing InfoComm 2015. Please check out the other installments, Recapping the Trends and Keynote at InfoComm 2015, and InfoComm 2015: Products and Technologies That Captivated.
The UCC Solutions Summit at InfoComm 2015 was a big success. Panels were well attended, the presenters were extremely knowledgeable, and audiences were very engaged. While it would be impossible to share everything I learned at the show, I hope the following will give you a good feel for the general tone and content of the presentations and discussions.
We all know what UC is. Or do we? As the very definition of the term continues to shift, it was much appreciated to have Dr. S. Ann Earon pull it all together at the start of the day to give us a good sense of what the IMCCA was hoping to accomplish through this year’s sessions. After laying the groundwork, Dr. Ann invited users from AOL, Bristol-Myers Squibb, Citi, and Walt Disney up on the stage for the first panel discussion of the day.
End User Panel Discussion
The story I heard at this panel was a mix of bad news, good news, and best news. The bad news is that we still have challenges, a big one being what I am calling “fragmented communications.” We use different tools to communicate with different people. Each vendor says you can use their app to talk to anyone, but everyone wants to use their own app of choice, so we wind up using a handful of disparate tools on a daily basis.
The good news is that despite the remaining challenges, things are much better than they were in the past and continuing to improve. Even with the fragmentation problem, we are using these tools and enjoying massive benefits as a result.
The best news, in my opinion, was the fact that we no longer need to spend a lot of time in these sessions discussing the benefits of the empowered, flexible, connected worker. There were no presentations about the classic benefits of videoconferencing. No discussion about travel savings. It is a given that we benefit from providing our teams with the right set of productivity tools, especially as part of a fully integrated environment designed particularly to accommodate their unique workflow. In other words, panels of the past discussed why we should empower our workers, while today’s panels focused on how to make it happen.
Workspaces of Tomorrow
This panel was hosted by David Danto (please click here for Danto’s take on the session), and included reps from Acano, AVI-SPL, Kramer, Pexip, and SMART. The discussion covered the expected topics, including huddle rooms, wireless share, persistent work spaces, UC integration, etc. I appreciated the fact that the panel spent as much time on workflow and uses as it did on products and technology. My favorite take-away from this session was the fact that everything we are talking about is available today. It is just a matter of implementation and adoption.
Anyone attending this session should have a very clear understanding about how David Danto feels about the Microsoft Surface Hub. Rather than attempt to restate his deep and wide critique of this product, I will simply link you to his own words. Personally, I think the AD support is a pretty big deal, but I still wouldn’t count this product out just yet. I think there is interest, and some of the other concerns can be addressed through updates, proper integration, and training. I think these will get sold and hung up on some walls. But it is a “wait and see” game as to whether they will enjoy proper integration, adoption and ROI, or be another lesson to learn from.
On the other hand, there is nothing but near universal love from our industry analysts for simpler and more affordable collaboration tools, with the SMART kapp iQ being a prime example. Please be sure to check out the next, and final, part of this series for my thoughts on this product after being briefed by SMART CEO Neil Gaydon.
UC as a Service
This session, hosted by Simon Dudley, had a rock star panel representing a range of vendors and integrators. It included execs who have been directly involved in some of the events which have shaped our industry over the last decade or more. Obviously, these guys get it, as there was almost no talk of speeds and feeds, and plenty of discussion about how to make things simpler for their customers. I was not surprised that things got heated over the very definition of the term UC, but it really got me thinking about the term itself and whether it still applies to our industry or should be replaced. Now, if these experts can’t agree on what is wrong with the term, I couldn’t have the audacity to think I can figure it out, right? Well, it just so happens that I am feeling a bit audacious, so please check back soon to read the article where I will take off the kid gloves and bash the term “UC”.
Touch and Collaborate
Putting aside the controversy over specific products, there is no doubt that users want these tools. How fortunate that giving them what they want also results in more productivity for our organizations. Case Murphy, Senior Manager of AOL’s Global Comm & Collab Group, shared a presentation and hosted a panel providing an overview of the space, an explanation of the old “failure cycle” of implementing these products, and guidelines for how to get it right in the future.
The State of Collaboration Tools – Lunch and Learn
This was a bit of an unusual session, and I was pleased to be given a unique role. The idea was to include vendors and end users on the same panel and try to mix things up a bit, with me acting as “referee” if things got out of hand. There is certainly some frustration on both sides of the fence, as adoption of collab products is still far behind where it could and should be. I was pleased to see that while some “blame” can be placed on both vendors and users, at the end of the day we agree that these are problems we should be working together to solve. However, despite the amicable nature of the discussion, I did feel the need on a few occasions to step in as referee and defend one side or the other.
The first issue I was compelled to jump in on was the issue of backwards compatibility and support for older offerings. We all appreciate new versions of our favorite products, but we don’t want to be forced into them by the failure of our existing offerings. If the new stuff is good enough, we will buy it. That being said, in general most vendors do a reasonable job of supporting older products and services, or helping users transition to the new stuff. The examples given for things no longer supported were, in general, pretty old products. But at the end of the day, the right thing to do is to support products for as long as customers are putting them to real use.
I next found myself jumping in on the other side of the fence, defending the vendors on the issue of software bugs. The users are frustrated that in order to enjoy new features, they have to risk software updates which can introduce bugs that “break” existing features. This is, unfortunately, just the nature of software development. Users need to understand that in the old hardware days, we basically had 5 appliances that dominated the market. When a new feature was introduced in any of the five, it was a simple matter to make a few test calls to the other four to make sure everything still worked.
Today we have countless software offerings on a huge variety of platforms. If a vendor has 1000 beta testers working on the latest version, they will find the 1 in 1,000 problem. But they won’t find the 1 in 10,000 problem, the 1 in 100,000, the 1 in 1,000,000, etc. Once you release the software to the users, they will find those rarer problems. The users have to understand there is no way to perform millions of hours of testing on every possible configuration before release. On the other hand, one of the users had a great point that it would be nice if vendors put out more “bug fix” patches in between the “new feature” patches. The next feature release will introduce new bugs, so if we wait for it to fix today’s bugs, we are basically ensuring a cycle of constant bugs.
John Antanaitis from Polycom made a great point about the difference between using open standards, and using standards openly. One pain point for users has always been, and continues to be, interop. While the use of open standards is a great step in the right direction to providing interop, it is not the end of the story as vendors can implement these standards in proprietary ways. If vendors are open with how they use these standards, it helps other vendors to ensure interop, even with advanced features.
At the end of the session, I took advantage of the mix of vendors and users to ask a question that is always fun in these situations. I asked the Kleenex question. Kleenex was originally sold as a makeup removal tool. Customers preferred to use it as a disposable handkerchief. When Kleenex found out, they surveyed their users, started marketing it for this new use, and made a lot of money. So I asked the panel if they had any stories about products being used in completely unintended ways, and I was rewarded with a great one. Stratus Video and Cisco were both represented on the panel and explained how Cisco’s executive desktop appliance was re-purposed by Stratus for use by sign language interpreters in a contact center configuration. A perfect example of how users can add more value and ROI to a given product than the vendors can ever imagine, and a great reminder to pay attention to what the users are doing, even after the sale has been made.
Clearing Video Collaboration Adoption Hurdles
The bane of our industry has always been adoption. The technology is comparatively easy to sell as the benefits are well documented and demos tend to have that “wow” impact. But adoption is elusive. People are just incredibly slow and reluctant to change their existing workflows, particularly communication workflows, to accommodate new technology. While I expect resellers and integrators to offer tools to help with adoption, I was really impressed with the depth of the adoption program described by James Koniecki from Dimension Data. As Jim explained, there is no typical user, as everyone has different roles, responsibilities, and workflow preferences. This makes things even more complicated as with no typical user, there can be no perfect adoption program that will work for everyone.
Each program must be tailored on a case-by-case basis for each implementation. This requires a lot more than the expected user surveys. In fact, Jim described a multi-phase process involving focus groups, a feedback loop, policy reviews, training, adoption plan checklists, a 12-week adoption management program, and much more. It is almost disheartening that such an extensive program is required to ensure we achieve ROI from our wonderful collab tools, but it is good to know that the integrator community is well aware of the issue.
The IMCCA is providing a much needed forum between vendors and end users. In fact, some of the most interesting moments from the sessions came from audience comments and questions, rather than the presentations themselves. This isn’t a one-way learning opportunity, and it isn’t about getting certified in UC; it is a place to discuss our current direction and even have an impact. I look forward to continuing the discussion with the IMCCA and its user community as UC continues on its path to ubiquity.