
Interop 2007 Buzz: Security to Web 2.0


Interop is back. My last few visits to the show left me wondering whether it was worth coming out. News was stale, crowds were subdued and vendors seemed to be in a holding pattern.

This year, there is real buzz once again – and by buzz I don’t mean hype lacking substance. This year’s show features 475 vendors, 100 more than last year, and Interop organizers estimate that there have been 20,000+ attendees, also well up from recent years.

NAC is making news at the show in a big way. Enterprises are waking up to the fact that perimeter security isn’t enough. Employees, partners, and customers enter the network from a variety of locations on any number of endpoints. To prevent intrusions and maintain network integrity, endpoints must be vetted as stringently as users. Indeed, more stringently, since user authentication is only now moving beyond single-factor.

Simply put, the goal of NAC is to put IT back in charge of the network. Users and their devices are checked for identity and system health, and rights can be determined beyond the simple all-in or all-out methods of the past. Access to network assets can be based on roles, location, and context.
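The role-, location-, and context-based decisions described above can be sketched in a few lines. This is a toy illustration of the NAC decision model, not any vendor's actual product or API; the roles, locations, and access levels are invented for the example.

```python
# Toy sketch of NAC-style access decisions: rights depend on who the
# user is, whether the endpoint is healthy, and where the connection
# comes from -- not just a binary all-in/all-out check.

def nac_decision(user_role, device_healthy, location):
    """Return an access level based on role, endpoint health, and context."""
    if not device_healthy:
        return "quarantine"      # unhealthy endpoints get remediation only
    if location == "corporate-lan" and user_role == "employee":
        return "full-access"
    if user_role in ("partner", "customer"):
        return "guest-vlan"      # restricted segment for external users
    return "deny"

print(nac_decision("employee", True, "corporate-lan"))   # full-access
print(nac_decision("partner", True, "vpn"))              # guest-vlan
print(nac_decision("employee", False, "corporate-lan"))  # quarantine
```

The point of the sketch is the shape of the decision: endpoint health is checked first, and only then do role and location refine the grant beyond in-or-out.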

Even though NAC makes sense and provides an obvious security benefit, many potential customers have adopted a wait-and-see attitude. Interoperability, or the lack thereof, has kept IT decision makers on the sidelines, with various vendors slicing and dicing NAC differently. With few of the major vendors playing well together, customers rightly feared vendor lock-in.

The big NAC news is that Microsoft and the Trusted Computing Group (TCG) have reached an interoperability agreement. The two announced that they will work together on NAC interoperability. Microsoft’s NAP (Network Access Protection) will be supported by TCG’s Trusted Network Connect (TNC) architecture.

For IT, this means that two of the three major NAC players will work together, with only Cisco remaining a question mark. While this news is only a single step forward along the path to standards and interoperability, it’s still a big step.

Should Vendors or Customers Drive Interoperability?

Identity Engines, a NAC startup, believes that the push toward vendor-initiated interoperability is important, but even if the major vendors drag their feet, other solutions can fill the void.

Along with five other networking and security vendors (Extreme Networks, Infoblox, Symantec, TippingPoint, and Trapeze Networks) and the UKERNA (United Kingdom Education and Research Networking Association), Identity Engines has formed the OpenSEA Alliance. The open-source group intends to tackle NAC from the client side, with an open-source 802.1X “supplicant” (the piece of software that communicates with an authentication server).

While 802.1X is an IEEE standard that provides port authentication in LANs, it has largely been adopted only in wireless environments, although it can function perfectly well in wired environments. By focusing on a client-side supplicant, the alliance believes interoperability can be driven from the client side. Extending support from the open-source client to various NAC servers is much less challenging than the reverse.
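The supplicant's job can be pictured as a simple request/response exchange. The sketch below is a much-simplified simulation of the 802.1X roles: the supplicant (client software) answers an identity request and a credential challenge from the authenticator (the switch or access point), which consults an authentication server. Real 802.1X carries EAP over LAN frames with a RADIUS back end; the message names and flow here are illustrative only.

```python
# Stand-in for the authentication server's credential store (invented data).
CREDENTIALS = {"alice": "s3cret"}

def supplicant_respond(request, identity, password):
    """The supplicant answers each EAP-style request from the authenticator."""
    if request == "EAP-Request/Identity":
        return ("EAP-Response/Identity", identity)
    if request == "EAP-Request/Challenge":
        return ("EAP-Response/Challenge", password)
    raise ValueError("unexpected request")

def authenticator_run(identity, password):
    """Drive the toy exchange; the port stays blocked until EAP-Success."""
    _, ident = supplicant_respond("EAP-Request/Identity", identity, password)
    _, secret = supplicant_respond("EAP-Request/Challenge", identity, password)
    return "EAP-Success" if CREDENTIALS.get(ident) == secret else "EAP-Failure"

print(authenticator_run("alice", "s3cret"))  # EAP-Success
print(authenticator_run("alice", "wrong"))   # EAP-Failure
```

Because the conversation is driven from the client side, an open-source supplicant only has to speak this exchange correctly to interoperate with many NAC servers, which is the OpenSEA Alliance's bet.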

“Strong identity is the only way security will keep up with intruders,” said Pete Selda, Identity Engines’ CEO. “An open-source, client-side approach helps push interoperability forward.” The hope is that client-side interoperability removes the lock-in fears that have kept NAC from wide adoption.

What about Broader Policy Enforcement?

Vetting users and devices is only half the battle, though. Once NAC has granted certain privileges, how do you go about enforcing enterprise policies on an application-by-application and event-by-event basis? Data leakage, IP theft, insider attacks, and threat response are issues even with NAC in place.

Another security startup at the show, ExaProtect, previewed its security appliance that combines a security information and event management (SIEM) solution and a network security policy solution (NSPS). The appliance monitors security events, which is nothing new, but beyond monitoring, it gives IT the ability to make immediate, system-wide changes based on those events – even in multi-vendor, heterogeneous networks.

Think of it as security event and system interoperability. According to Jason Holloway, VP of marketing for ExaProtect, many SIEM products view events from a single point on the network, from a firewall perhaps, without correlating that event with, say, an IPS to understand the nature of the event. For security to be effective, IT must understand the context of the event.

“A good analogy is a car alarm,” Holloway said. “If you live in the country and have your car garaged, when the alarm sounds, it is unusual and worth investigating. If you live in the city and have a neighbor whose over-sensitive car alarm goes off every time the wind blows, you’ll do your best to ignore it. It’s the same event, but it requires two very different actions.”

With contextual awareness, security managers are freed from spending their time studying event logs. They don’t react to each and every alarm, responding only to those that represent a real threat. Not only is the threat understood, but this broad view of the event also enables real-time remediation. Policy and network changes can be made directly through the ExaProtect console, so security staff is further spared the cumbersome task of making changes manually on a device-by-device basis.
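The correlation idea behind Holloway's car-alarm analogy can be sketched simply: the same event type is scored differently depending on whether a second device corroborates it. This is a minimal illustration of context-aware SIEM correlation, not ExaProtect's actual logic; the event fields and decision rule are invented.

```python
def assess(firewall_event, ips_alerts):
    """Escalate a firewall event only when an IPS saw related activity
    from the same source -- the 'garaged car' case. Uncorroborated
    events are the noisy city car alarm: logged, not acted on."""
    related = [a for a in ips_alerts
               if a["src_ip"] == firewall_event["src_ip"]]
    return "escalate" if related else "log-only"

fw = {"type": "port-scan", "src_ip": "203.0.113.7"}
print(assess(fw, [{"src_ip": "203.0.113.7", "sig": "exploit-attempt"}]))  # escalate
print(assess(fw, []))                                                     # log-only
```

It is the same firewall event in both calls; only the surrounding context changes the response, which is exactly the point Holloway makes.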

What Do IT Managers Really Want?

Since so much trade show buzz is generated by marketing and PR, rather than by the true concerns of IT managers, NetQoS and Network Instruments representatives sought out engineers and IT pros to ask them what they were really focusing on at the show.

The show-floor survey of 101 IT pros found that VoIP, MPLS, and WAN optimization were generating the most interest. NetQoS and Network Instruments are both in the network performance monitoring game, so they see these trends as having an underlying performance commonality. “All require an understanding of network and application performance before and after deployment to determine the impact,” said Steve Harriman, NetQoS VP of marketing. “End-to-end performance monitoring, namely application response time, is the first step to effective network performance management, because it is the best indication of how well the network is delivering applications to the end-user.”

A full 68% of those surveyed have implemented or will implement VoIP in the next 12 months. About 40% have deployed MPLS or will do so in the next year, and 50% have implemented or plan to implement a WAN optimization technology in the next 12 months.

Wrapping up with Web 2.0

As I wrote about in a companion story for our sister site, CIOUpdate, Web 2.0 generated a good deal of buzz at the show, sparked by Cisco CEO John Chambers’ keynote address.

Chambers touted Web 2.0 tools as the next frontier in enterprise communications. He noted that he felt he communicated better over video than through other media and predicted that future enterprise communications will be about choice, flexibility, and more intimate forms of interaction.

With so many vendors claiming some sort of Web 2.0 angle, this keynote was well received. A few of the Web 2.0 (or Enterprise 2.0) offerings that caught my eye included those from LifeSize, /n Software, and FireScope.

LifeSize’s HD videoconferencing suite is a low-cost solution that seeks to bring “telepresence” to the masses.

/n Software’s RSSbus product takes the RSS concept and applies it to corporate data. Users can use RSSbus to create and manage feeds from databases, spreadsheets, directories, and other corporate applications.
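The underlying idea, exposing rows of corporate data as a syndication feed, is easy to picture. The sketch below hand-rolls an RSS 2.0 document from database-style rows using only the Python standard library; it illustrates the concept, not RSSbus's actual product or API, and the order data is invented.

```python
import xml.etree.ElementTree as ET

def rows_to_rss(title, rows):
    """Turn (item_title, description) tuples into an RSS 2.0 document."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    for item_title, desc in rows:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = item_title
        ET.SubElement(item, "description").text = desc
    return ET.tostring(rss, encoding="unicode")

# Hypothetical rows pulled from, say, an order-status table.
orders = [("Order 1042", "Status: shipped"), ("Order 1043", "Status: pending")]
feed = rows_to_rss("Order status feed", orders)
print(feed)
```

Any feed reader a user already runs can then subscribe to the result, which is what makes the "RSS for corporate data" pitch appealing.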

FireScope, meanwhile, intends to bring the mashup concept to IT management. While general, non-techie consumers have more and more powerful and easy-to-use technology tools at their fingertips, IT is stuck with cumbersome, disparate management consoles. FireScope believes that the same sorts of tools that allow users to create media-rich blogs can be leveraged to create simplified and unified IT management portals.

When I talked to Andi Mann, a senior analyst at EMA, about Web 2.0 in the enterprise, he noted that adoption is happening whether organizations like it or not. “The use of online content technologies is significant. It is also growing at a fairly significant rate,” he said.

Over the next 12 months, EMA expects blog usage to triple, wiki adoption to double, and the use of RSS and Atom feeds to also double. Meanwhile, the Web 2.0 tool with the widest adoption, IM, will grow at an 11% clip. All of this is “official” adoption, meaning the organization has somewhat sanctioned the use, but unofficial adoption rates will be as high or even higher.

The message here is that your employees are using IM, wikis and blogs, so it’s time to stop wondering if and when the enterprise will embrace these new forms of communications and start figuring out how to manage them.
