A few weeks ago I posted a blog on what I have experienced over the past four years at Plexxi. That post led the Packet Pushers team of Ethan and Greg to reach out, and we recorded a podcast about the changing role of the network engineer and IT silos. In preparation for the podcast, my colleagues Mat Mathews, Mike Welts and I collaborated on the following, which I edited a final time after the podcast. This post started as a dialog about what we are seeing in the market, what our customers and prospects want to engage on, how we position Plexxi to the network engineer, and where we see this all going now that market clarity has begun to emerge. Continue reading
Today is my four-year anniversary at Plexxi. I was in New York the week before Christmas to attend an investor conference focused on security and networking. It was a two-day trip that I expected to go by quickly, as it was full of meetings and dinners. A colleague and I met with a number of crossover investors and analysts, as well as colleagues at compatriot companies. In our very first meeting an investor asked “four years in, how has it turned out compared to how you thought it would go when you started?”
The best quote in this article is “Everything made sense except that nobody gives a shit.” When I think about trends in the networking space over the past five years, that is how I would summarize most of the efforts labeled “disruptive” or “revolutionary.” When I can, I attend various local Meetups, which are like quasi-sales calls. I get to hear end-users talking about what they are working on, the issues they are facing, etc. Meetups are kind of like fishing: some days they are a complete waste of time, and other days you catch a lot of fish, and in my world information is fish. I like to hear what end-users are saying, what they are working on, and what keeps them up at night.
Most consumers are familiar with the availability of over-the-top (OTT) content. Examples include Netflix, Amazon Prime, and Hulu, and we could even include gaming services in the description. The model for an OTT content provider like Netflix is to ride over a user’s data plan, whether that plan is DSL, FTTH, or wireless, to deliver content. The consumption model is disaggregated between the data plan (i.e. internet) provider and the content (i.e. service) provider. This is also the point at which there is tension between the two parties over the cost to deploy bandwidth and which party profits from the services that ride over it. That is not a topic for this post.
35,000 feet over Utah, one glass of scotch down, ear buds in and my internal notes sent; it is time to write some VMworld 2015 impressions for the blog. In no particular order, here they are: Continue reading
A few weeks ago I spent the morning in New York City presenting to a room full of people about networking. Networking is typically not a really big draw on a Friday in NYC during August, but the turnout was great and the morning was quite pleasant. Continue reading
It is Sunday morning and I am on a 7am flight to SFO from Boston. When I left the house, no person was stirring; not even the dogs. Sipping a morning mimosa or two on the flight to SFO, I read this article that I saw tweeted. The article is about work-life balance in the eyes of Pat Gelsinger and how tech companies overwork their employees. I found one quote very applicable to myself. Continue reading
Note to readers: this is a self-promotional post. On August 14, Plexxi will be hosting a morning discussion in New York City at The Cornell Club, located at 6 East 44th Street. I will be the speaker for Plexxi. We will serve some food and talk networking for a couple hours.
The primary agenda will cover how to transition legacy networks to hyperconverged rack-scale systems using a controller architecture, which is often referred to as SDN.
If you are interested in attending, please register here.
I took a day in May to spend on the corporate development function at Plexxi, which means I spent a day in Boston with a sell-side analyst, meeting with buy-side clients of his firm. It was a really fun day talking networking with new acquaintances and old colleagues alike. In one meeting, I was greeted with “I read your blog,” which reminded me that I had not written a blog post in a few months. My writing time is correlated to the pace of work at Plexxi. When the pace is fast and the activity levels are high, I need a break from work and the blog suffers. Continue reading
A couple of weeks ago, I had an exchange over email with a sales prospect. I had initiated the conversation, thinking that this would be a good place to sell Plexxi. Below is the thread, which I edited for anonymity: Continue reading
I was listening to an episode of Planet Money last week about the first spreadsheet program, VisiCalc. If you listen to the podcast, there is a discussion of the accounting profession before and after the creation of the spreadsheet. Before the creation of VisiCalc, a spreadsheet was really a spreadsheet. “If you ran a business, your accountant would put in all your expenses, all your revenues, and you’d get this really detailed picture of how the business worked. But even making a tiny tweak was a huge hassle.” Teams of accountants and bookkeepers would spend days reworking sheets of paper to maintain the accuracy of the books. Continue reading
When should the incumbent pay attention to an upstart competitor?
A couple of weeks ago Arista Networks (ANET) reported their quarterly numbers and they were fantastic. No need to sugarcoat the numbers; they were excellent. A decade ago I worked at Ciena, and I remember when we started to see a new competitor in our market called Infinera. At the time Infinera came to market, Ciena was recording ~$100-120M quarters. Within a year, Infinera was recording $8-20M quarters, and that had grown to $40M a quarter by mid-2006. Running a $500M business is vastly different than running a $150M business, which were the annual revenue run rates for CIEN and INFN in 2006. Looking at the Arista results, the thought occurred to me to look back a decade. Continue reading
- Bandwidth is the Software of the Network
- Regulating the Single Network Pipe is Driving Forward while looking in the Rearview Mirror
With age and experience, time provides the ability to clearly spot irony. In 1976, Bill Gates sent an open letter to computer hobbyists expressing his displeasure with software piracy. The letter even has a Wikipedia page. When I read the FCC proposals regarding net neutrality, I feel like we have been over this ground before. Thirty-nine years ago Bill Gates wrote his letter to hobbyists, and the majority of it is worth reading in the context of the net-neutrality debate: Continue reading
For me, the last several weeks of 2014 had been running to stand still. I made one last sales call before Christmas Eve and then eased into a long break until the New Year. I had some interesting sales calls over the past year. I wrote about the perfect Clayton Christensen, hydraulics-versus-steam-shovels moment here. I learned a lot from that sales call and went back to using a framing meme we had developed a couple years earlier. That meme I posted in this blog here, seven months ago. In this post I am refreshing that meme and highlighting a few insights I read and thought were meaningful. Most if not all of the mainstream tech media is some technology company’s marketing message in disguise; hence it might be entertaining, but it is not informative or thought provoking. Continue reading
There is an enjoyable Sydney Pollack movie called Absence of Malice. The plot of the movie is about the investigation of a murder and how press leaks are used to manipulate people via public opinion. As I watched the Deflategate drama unfold over the past few weeks, the whole affair reminded me of this movie. No one has died and we are not talking about Federal crimes, but from the coverage of the affair, a person in another country not immersed in our football culture would think we were discussing high crimes against the state. Continue reading
Do you have that annoying friend who absolutely hates your sports teams? I am describing the person who sends a weekly barrage of emails full of hate and over-indulges in schadenfreude when your team loses. I have that friend and he is a Miami Dolphins fan. I am a Patriots fan and have been a season ticket holder for more than twenty years. The Tom Brady era has been toxic for the Miami Dolphins and the AFC East in general. This toxicity manifests itself in a weekly barrage about Patriot cheating, film crews, playbook theft, hometown refs, video recording innuendo and general hatred towards Bill Belichick. Continue reading
The seminal achievement of SNA in the late 1970s to mid 1980s was to make minicomputers viable from an enterprise market perspective. Enterprise computer networks were completely dependent on the mainframe computers supplied by IBM or one of the minor mainframe suppliers. SNA was a proprietary solution implemented by IBM, but it was an open source solution. This enabled the suppliers of minicomputers such as DEC, Wang, Prime, Data General, Apollo, and others to use SNA technology to deploy their systems into the network. Open source meant that competitors as well as providers of non-competitive systems had access to the technical implementation of SNA and thus could use SNA to add their computers to an SNA network. The minicomputer vendors implemented a PU_Type 2 node capability on their computers, which enabled these machines to seamlessly interact with mainframe computers as well as each other. This was the genesis of distributed computing (Platform 1.0). It was a seminal moment that gave birth to the commercial network within the enterprise market and started the progression towards the client/server network, which is Platform 2.0. This occurrence may not have had the dramatic overtones of Gary Kildall flying his plane while IBM waited in his lobby to license CP/M for the personal computer – but it is significant because networking of computers started with IBM. Continue reading
In order to understand Boyd’s model for operations, we must understand his premise that there is a fundamental need for decisions. He states, “Against such a background, actions and decisions become critically important. Actions must be taken over and over again and in many different ways. Decisions must be rendered to monitor and determine the precise nature of the actions needed that will be compatible with the goal. To make these timely decisions implies that we must be able to form mental concepts of observed reality, as we perceive it, and be able to change these concepts as reality itself appears to change. The concepts can then be used as decision-models for improving our capacity for independent action. Such a demand for decisions that literally impact our survival causes one to wonder: How do we generate or create the mental concepts to support this decision-making activity?” [see Boyd, Destruction and Creation]. This quote highlights the basic contribution that Boyd provided. He developed a model that can be extrapolated into a process for decision-making. Boyd called the model he developed the O-O-D-A loop. Continue reading
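Boyd's cycle of observing reality, revising mental concepts, deciding, and acting can be sketched as a control loop. This is purely illustrative: the toy "world" below is just a target number an agent converges on, and all names (`run_ooda`, the `model` labels) are hypothetical, not anything Boyd specified; his model describes a mental process, not a software API.

```python
# An illustrative sketch of Boyd's O-O-D-A loop as a control loop.
# The toy world is a hidden target number; the agent's "mental concept"
# is a running estimate it revises each cycle until it matches reality.

def run_ooda(target, start=0, max_cycles=50):
    """Observe-Orient-Decide-Act until the estimate matches the target."""
    estimate = start
    for cycle in range(1, max_cycles + 1):
        observation = target - estimate                    # Observe: sample reality
        # Orient: revise the mental concept based on the observation
        model = "low" if observation > 0 else "high" if observation < 0 else "on-goal"
        if model == "on-goal":                             # goal reached: stop cycling
            return estimate, cycle
        step = 1 if model == "low" else -1                 # Decide: pick the next action
        estimate += step                                   # Act: change the environment
    return estimate, max_cycles

print(run_ooda(7))   # estimate converges on the target after repeated cycles
```

The point of the sketch is that each pass through the loop re-observes reality rather than trusting the prior concept, which is the iterative quality Boyd emphasized.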
After the end of the Vietnam War and America's entrance into the 1970s, it seemed that America had lost its way. Mathematics and the hard sciences were again in decline, and the philosophical and social sciences came to the fore. It would be in the 1990s that hard science and mathematics would once again gain a dominant position in thought leadership. Out of the counterforce debacle in Vietnam and the dominance of defense and public policy by neoclassical theories emerged John Boyd, who developed a philosophy and process that formulates strategy based on all data and uses dynamic analytics to continuously evolve strategy to achieve objectives. This is not a game-theory strategy that mathematically outlines various outcomes based on the strategies employed. It is not a precise mathematical formula that defines risk. Boyd believed that strategy is an ever-evolving and highly iterative process designed to achieve victory. It requires assumptions of risk with constant analysis of the operating environment. Boyd believed that the real target was your enemy’s perception, for it is the enemy who decides when they are defeated – not you. Keynes would describe this as the participants in the financial market deciding when you have won or lost – it is not the companies. In the business market, it is the companies competing for market share who decide when they have lost. Continue reading
In 1928 a young, brilliant mathematician named John von Neumann devised a theory that would affect economic and military thought for many years to come. The theory von Neumann developed was based upon several observations he made during a game of poker. The first observation was that winning and losing were interdependent across all players. A winning strategy was not solely the product of a single player’s strategy – but rather the product of all the players’ strategies. In order to devise a winning strategy, von Neumann had to account for the other players’ strategies, assuming that each player’s objective was to win the game. From these observations von Neumann developed what became known as game theory, and he applied the theory to economic markets. Previous economic models used traditional neoclassical assumptions that a seller and buyer acted solely on the mission to maximize their gains. A seller is looking to maximize profit and a buyer wants the maximum value for capital spent. The contribution that von Neumann made was to define the seller and buyer as a transactional unit, dependent upon each other, but not necessarily one that achieves maximum profit and value for capital. Continue reading
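The observation that a winning strategy depends on all players' strategies can be made concrete with the simplest case von Neumann analyzed: a two-player zero-sum game given as a payoff matrix. The payoff matrix below is a made-up example, not one from von Neumann's work; the point is that each player's best choice is defined only relative to the opponent's choices.

```python
# Illustrative only: a two-player zero-sum game as a payoff matrix,
# rows = player 1's strategies, columns = player 2's. Entries are
# payoffs to player 1 (player 2 receives the negative).

def maximin(payoff):
    """Player 1 picks the row whose worst-case payoff is largest."""
    return max(min(row) for row in payoff)

def minimax(payoff):
    """Player 2 picks the column whose best payoff to player 1 is smallest."""
    return min(max(col) for col in zip(*payoff))

# A game with a saddle point: both players' guaranteed values coincide,
# so neither can improve by deviating unilaterally.
game = [[3, 2, 4],
        [1, 0, 5]]
print(maximin(game))  # 2
print(minimax(game))  # 2
```

Neither value can be computed from one player's options alone, which is exactly the interdependence the post attributes to von Neumann's poker observations.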