
Cloud Field Day 2 Preview: Gigamon

July 21st, 2017

Cloud Field Day 2, part of the Tech Field Day family of events, is happening in San Francisco and Silicon Valley from 26-28 July, and I'm super excited to be invited as a delegate.

We are hearing from a number of companies about how they cloud!

Gigamon is an established vendor providing network traffic visibility. In its simplest form its product is a large network tap: you choose which traffic you want to inspect more closely and run it through Gigamon's devices. Gigamon can then hand the data off to other vendors' products for analysis. That could be security scanning with an intrusion detection system, watching traffic for data loss prevention, or checking whether you have a botnet running internally.

In terms of virtualisation inspection, Gigamon already has its GigaVUE solutions, which provide visibility into virtual workloads running on VMware networking with ESXi and NSX, as well as OpenStack KVM-powered clouds. It's Cloud Field Day, so of course Gigamon is heading to the clouds and has recently announced the Gigamon Visibility Platform for AWS.

Enterprises love the simplicity of cloud networking: create a VPC with pretty much all the address space you need, connect via an API, and easily wire servers and clouds together. Nothing can communicate unless you specifically say it can, so some of your firewalling is already taken care of, and all the config can be managed as code. Amazon looks after the underlying compute, network and storage so you don't have to. Sounds great. It can be easy to think you then don't have to worry any more about security at the network level. Well, you may have permissioned a web server to talk to an app server, but how do you know what is actually running across port 443? What if the web server is in AWS but your app server is on-prem?
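That "nothing communicates unless you say so" model can be sketched in a few lines. This is a toy model of security-group evaluation, not the real AWS API; the rule structure and names are my own illustration:

```python
# Toy model of AWS security-group behaviour: traffic is denied unless an
# ingress rule explicitly allows it (default deny). Structure and names are
# illustrative only, not the actual AWS API.
from dataclasses import dataclass

@dataclass(frozen=True)
class IngressRule:
    protocol: str   # e.g. "tcp"
    port: int       # single port, for simplicity
    source: str     # CIDR or security-group id allowed to connect

def is_allowed(rules, protocol, port, source):
    """Default deny: only traffic matching an explicit rule gets through."""
    return any(r.protocol == protocol and r.port == port and r.source == source
               for r in rules)

web_sg = [IngressRule("tcp", 443, "0.0.0.0/0")]    # web tier: HTTPS from anywhere
app_sg = [IngressRule("tcp", 8080, "sg-webtier")]  # app tier: only from the web tier

print(is_allowed(web_sg, "tcp", 443, "0.0.0.0/0"))   # True
print(is_allowed(app_sg, "tcp", 8080, "0.0.0.0/0"))  # False: no rule, so denied
```

Note the catch the article raises: the rule says the web tier may talk to the app tier on a port, but nothing here inspects what actually flows over that port.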

Visibility Platform for AWS

This is where Gigamon can come in. You can now use the Visibility Platform for AWS to provide deep packet inspection capabilities: peek inside network flows within AWS and find out exactly what data is flowing where. This has now been extended to the AWS GovCloud (US) Region as well. You can now do forensics and DLP within AWS, which you probably couldn't before. As Gigamon has a huge set of technology partners, you can have the network packets opened up for handing off to any number of additional solutions.

This can also connect to your on-prem Gigamon installation to provide end-to-end visibility between your on-prem workloads and what you have running in AWS, so your security people can sleep a little better at night.

You install a single user agent within your EC2 instances, which mirrors the traffic you want and sends it to virtual GigaVUE visibility nodes, which are the gatherers. The single agent is an important part. Network scanning isn't a single thing: it's forensics, DLP, performance monitoring, AV, anomaly detection, etc. Without Gigamon you could land up having to manage a number of vendor agents within each VM. These agents would take resource and potentially conflict. Having a single-agent model which then hands off to other vendor products lightens the VM resource requirements and provides a far simpler system to manage. You can also swap out your analysis product without having to change the agent on every VM.
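The single-agent pattern described above can be sketched like this. The class names and callbacks are hypothetical stand-ins, not Gigamon's actual software; the point is that the in-VM agent only mirrors, while the fan-out to analysis tools happens centrally:

```python
# Sketch of the single-agent model: one mirroring agent per VM forwards copies
# of traffic to a central visibility node, which fans out to whatever analysis
# tools are registered. All names are hypothetical, not Gigamon's real APIs.
class VisibilityNode:
    def __init__(self):
        self.tools = {}          # tool name -> callback, e.g. "ids", "dlp"

    def register(self, name, callback):
        self.tools[name] = callback

    def ingest(self, packet):
        # Hand one copy of the packet to every registered tool.
        for callback in self.tools.values():
            callback(packet)

class MirrorAgent:
    """The single in-VM agent: it only mirrors; all analysis lives elsewhere."""
    def __init__(self, node):
        self.node = node

    def observe(self, packet):
        self.node.ingest(packet)  # unchanged when analysis tools are swapped

seen = []
node = VisibilityNode()
node.register("ids", lambda p: seen.append(("ids", p)))
node.register("dlp", lambda p: seen.append(("dlp", p)))

agent = MirrorAgent(node)
agent.observe("GET /login HTTP/1.1")
# Both tools saw the packet without any per-tool agent running in the VM.
```

Swapping the DLP product means re-registering one callback at the node; the agent in every VM stays untouched, which is exactly the management win the article describes.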

All the traffic from the various agents is then aggregated at a Visibility Node, which is delivered as an AWS AMI. This can filter and optimise the traffic to save on bandwidth. It can even sample packets selectively, grabbing only 1 in 10 to save bandwidth yet still get a decent idea of network performance. You can also mask data as it flows through Gigamon, so you can ensure sensitive production customer data doesn't land up in the wrong place.
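Those two optimisations, 1-in-10 sampling and masking, are easy to illustrate. This is a simplistic stand-in for the idea, not Gigamon's actual filtering logic, and the card-number regex is deliberately naive:

```python
# Toy illustration of two Visibility Node optimisations: sampling 1 packet in
# 10 to save bandwidth, and masking sensitive fields before forwarding traffic
# downstream. Simplistic stand-ins, not Gigamon's real implementation.
import re

def sample_one_in_ten(packets):
    """Keep every 10th packet: enough for a rough view of performance."""
    return [p for i, p in enumerate(packets) if i % 10 == 0]

CARD_RE = re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b")

def mask(payload):
    """Replace anything that looks like a card number before forwarding."""
    return CARD_RE.sub("****-****-****-****", payload)

packets = [f"pkt-{i}" for i in range(100)]
print(len(sample_one_in_ten(packets)))              # 10
print(mask("order paid with 4111 1111 1111 1111"))  # card number is masked
```

Masking at the aggregation point means the downstream analysis tools, whichever vendor they come from, never see the raw sensitive data at all.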

The Future?

I’d like to hear Gigamon’s view of the future, particularly regarding public clouds. Public cloud use is moving on from individual VMs, and now containers, to more of a collection of services. You don’t manage your own database running on an EC2 VM but rather store data in DynamoDB. You store data natively in S3 and write serverless functions to move it around. This removes the infrastructure visibility even further. Does this hamper Gigamon somewhat? Is there a role for Gigamon to trap traffic moving between S3 buckets and check that it isn’t customer credit card information landing up in a public S3 bucket? Could Gigamon ensure that data going into DynamoDB is secure?

If the future is very much public cloud, and it isn’t VMs with an OS with an agent, where does Gigamon fit? Looking forward to hearing more.

Gestalt IT is paying for travel, accommodation and things to eat to attend Cloud Field Day but isn’t paying a penny for me to write anything good or bad about anyone.
