Sunday, April 26, 2009

Earth Day: The Day After

"We're not dropping out... we're infiltraing and taking over."

So the weekend has passed and news of Earth Day has quickly died off. No new press releases, fewer bicyclists on the road, and the general rabble from pseudo-environmentalists claiming everyone can save the earth by driving a hybrid resumes in the blogosphere.

Let me just come out and say it: All the hoopla answering "What can I do?" with "ride a bike" and "plant a tree" is absolutely useless drivel. The correct answer is: "Go out and get a degree in engineering, math, or science."

Bear with me for a moment... The last time I went to India was 17 years ago, in 1992. In two weeks we hit four cities: Mumbai (Bombay), Ahmedabad, Rajkot, and Bhavnagar. In all four cities, pollution was becoming a real problem. Cars, trucks, motorcycles, scooters, and rickshaws filled the roads, burning everything from diesel to kerosene. Amongst the very poor, the question wasn't whether it was the right fuel so much as whether the fuel was cheap enough. The resulting layer of hydrocarbons was imposing. As a kid living in Southern California, I used to joke "you can't trust air you can't see." Forget being able to see the air... I felt like I had to carve my way through the pollution.

Everyone wanted cleaner air in India, but the reality was that legislation wasn't going to fix the problem. What were they going to do? Fine someone who can barely earn enough to eat? The reality was that a significant part of the population was constrained not by its desire to act and improve its living situation, but by the economics of it. Until doing the right thing cost the same or less than the current options, a change big enough to make a marked impact on the environment simply wasn't going to happen.

The answer was in Compressed Natural Gas (CNG). (Nice summary on CNG here.) In 1998, the Indian national government forced the city of Delhi to move its entire rickshaw fleet to CNG based on the availability and cost of the fuel. With an entire fleet moved over, the cost would come down to a point that would make burning anything else pointless. The initial pilot, after some hiccups along the way, was eventually successful, and the entire country has been making the plunge one city at a time.

The impact was significant. It was the difference between people needing to wear masks to walk outside and people feeling like they could step outside and take a deep breath.

Now here is the key: The impact was not achieved by (forgive me, my peace-loving Bay Area friends) crystal-gripping hippie freaks advocating unattainable lifestyle changes. It was achieved by environmental scientists dissecting the problem (some of whom I had the privilege of working with), chemists and materials scientists figuring out the fuel and its container, automotive engineers figuring out how to burn it in cheap/hacked/modified engines, and so on. You get my drift.

There is an endless series of similar projects that need to be done. After all, there is a worldwide economy that needs to be rebuilt around sustainable engineering. And to make that happen we need more graduates in math, science, and engineering.

So the next time someone asks "What can I do to help?", don't advocate dropping out of consumerism -- a change that simply won't take hold across a large enough number of people to matter. Instead, advocate infiltrating and taking over the way we build and consume. Only when we make changes at the root causes do we succeed at making an impact.

Tuesday, April 21, 2009

MySQL's best days are yet to come

Just because I'm running a fever and it feels like slightly under a billion degrees outside doesn't mean I'm losing my mind. I really do think MySQL has a lot more good than bad coming in its future with Oracle at the helm.

The difference between Sun and Oracle is that while Sun did grok Open Source (mostly), MySQL was never a particularly good strategic fit for them. One of the great values of MySQL is that it's very easy to get started with and runs well on a moderately powered x86 box -- you don't need to buy Sun gear running Solaris and ZFS to do that. Thus the synergies that Sun hoped for never emerged. Instead, the heavy-handed approach to software management and release necessary for extremely large projects like Solaris and Java was applied to a moderately sized project (by comparison), and the results were not good. Developers coming from the startup environment and its transparent development cycle slowly moved on, and the project started languishing, with the poor acceptance of 5.1 highlighting this failure.

Oracle, by comparison, does get MySQL's value.

Oracle has never been known for low end products. They do big enterprise databases and big enterprise applications. Their idea of a small CRM deployment easily runs into the hundreds of thousands of dollars. For a long time, there was simply a void at the low end that was never filled by upwardly mobile software.

That is, until Microsoft Access joined hands with Microsoft SQL Server and sang happy songs of tight integration. If you're building a small Microsoft application, using Access and MSDE is a quick way to do it, with great links to the entire Office suite. Your upward migration path as the application grows is clear: SQL Server. The other players don't have a real chance. (Yes, yes... Access does ODBC with connectors to Oracle, MySQL, and others, but come on... let's be realistic here...) And the cost of entry to this game? $175 on Amazon for a single-user license, no additional discounts. If you're even a moderately large company, you probably already got it as part of a site-wide license.

Meanwhile, Oracle priced itself out of the low end and essentially handed the .com boom to MySQL, which has bitten them ever since. What started as a small, simple database has grown considerably, with large, noteworthy deployments in the ecommerce space. This is showing the enterprise that MySQL can scale to support large, complex applications at a fraction of the cost. As a result, Oracle has had the bottom end of its market threatened by MySQL. Between clouds, SaaS, and an increasing number of applications that leverage MySQL, the market impact is starting to creep into Oracle's home turf.

While the bulk of Sun's value to Oracle comes from Sun's hardware, operating system, and Java holdings, I doubt that the ownership of MySQL is lost on them. Whereas MySQL lacked solid direction within Sun, Oracle will likely lay out a clear vision and weave a compelling story that will put Microsoft and IBM on edge. MySQL will not be a second-class citizen, because it will evolve into Oracle's entry market with a compelling transition path to the enterprise-class offerings that Oracle already sells. The good news for MySQL is that there is a lot of headroom for growth and improvement in this model, since Oracle's current products start near the stratosphere and go up.

MySQL's best days are yet to come.

Tuesday, April 22, 2008

I posted a small entry on polish over at BitCurrent.

Friday, October 19, 2007

Out with Norton Antivirus.

Time for another example of "It takes ten atta-boys to get rid of one awww-sh*t," for us product types...

Usability is something that I take to heart, and for a very long time Norton products generally got a nod from me. I especially recommended Norton security products (anti-virus, personal firewall, etc.) to friends and family since they did a good job of protecting machines while being easy to use. The latter was especially crucial for family members who would otherwise call me for help.

Over the last few years I've progressively seen the Norton family of security products get bulkier, slower, and increasingly tedious to use. Most recently I purchased and installed Norton Internet Security 2006 and used it for 10 months. This took the cake for miserable.

The most visible problem was performance. It was excruciatingly slow and made boot/wakeup times downright pathetic. My 1GHz Pentium III, which also runs Windows XP SP2, was more responsive than my 2GHz Centrino Duo laptop! My grumble slowly turned into a roar - unless the new version that I planned to test drive was significantly better, it was the end of the road for Norton. But even with performance being sluggish for me, I hadn't completely changed my mind about my family yet. Did they feel the same pain?

Turns out they did. Both my sister and my dad, who have relatively modern machines with ample memory and similar Windows XP configurations, complained about the performance of their machines. I initially wrote it off as yet another application they installed, but after checking their machines over I realized that they were feeling the same pain around Norton that I was. Slow boot times and slow scan times topped the list. My dad felt the Outlook-integration pain that I felt as well. Basically, any application that had Norton integrated with it would slow to a crawl when trying to start up.

But isn't performance orthogonal to ease of use? No, it isn't. Watch anyone dealing with a slow application (whatever the platform happens to be) and the frustration they feel is very similar to the frustration of wrestling with a difficult-to-use application. The only way to make the situation worse is to make the application unreliable.

Norton, thankfully, wasn't unreliable. With the exception of bad drivers or bad hardware, I haven't had a crashing problem with Windows for years. (Literally, since I started using Windows 2000.) There are times when Windows gets cranky, but no lost data unless the error was self-inflicted.

As for the user interface itself, I'd qualify it as humdrum. If you left the default settings alone, you were fine. If you needed to alter any of the settings, you quickly found yourself in a somewhat complex set of menus that required close attention. Obviously, the menus were nowhere near acceptable for my family of users.

So with 10 months of usage under my belt, it was time to take a look at the upcoming 2008 release and give it a shot. If they fixed the performance problems, I'd do the upgrade. As if the software were reading my mind, it popped up a little dialog box warning me that I was down to 60 days on my virus definition updates license. This is good -- it's one of the features that I like, since it reminds me well in advance rather than surprising me when it runs out. Good for the family too. Even better? There was an offer to try Norton Internet Security 2008 for the remainder of my current license for free! Perfect -- a 60-day trial would be plenty long enough to really give it a test drive. I even mentioned to the Smarter Half that offering the upgrade as part of the remainder of my license was a great idea and made me optimistic.

Unfortunately, I went from optimistic to stunned in about 15 minutes. The upgrade went smoothly enough, but ended with a note saying that the software wasn't licensed and I needed to cough up $50 for the upgrade -- immediately. I poked around for the UI element that would let me use my old license, and for any mention on the web site of how to enable it. No luck.

So without fanfare, Norton Internet Security 2008 was removed. With license renewal coming up for my dad too, I'll be telling him to skip it. The bait-and-switch move was out of line, and I have neither the time nor the energy to fight Symantec on the issue. I'm sufficiently annoyed that I've ceased recommending their products as well.

At casual glance, this may seem like an over-the-top tantrum. After all, I was headed toward paying for the upgrade shortly anyway. But it's a little more detailed than that:

1. The product has become incredibly heavy and slow. Since removing the product, my laptop literally feels like it has been reborn. Everything responds much faster, and sleep/hibernate recovery times are much shorter.

2. The product's security coverage is getting increasingly segmented. Symantec appears to have a different product for each kind of threat (malware, spyware, rootkits, viruses, etc.). The product integration is tedious, and keeping track of which products do what is not worth the effort anymore. How is the casual user supposed to keep up with this mess?

3. Symantec marketing has been tedious for a while, but the glitch in the upgrade is inexcusable. Whether it was intentional or a bug is moot - I don't care and have no interest in fighting the matter. If I wanted to battle with my software, I'd run FreeBSD and compile everything by hand.

So if you're keeping score, Symantec is going to have to come up with some significant atta-boys to recover from this awww-sh*t. It's also a good reminder for us product types - these details matter. When there are competitors in the field and the core value isn't highly differentiated, polish and performance matter. For anti-virus, the core value isn't different from any other anti-virus product out there, so the differences fall immediately onto the soft touches.

I'm evaluating other anti-virus products now. Kaspersky is at the top of the list and is currently installed. So far so good. Polished, lightweight, some nice features. A little more expensive, but for an extra $10, I'll keep my system performing well, thank you. Look for a review/decision soon.

Wednesday, September 26, 2007

Marketing Must Reads

One of my more interesting strengths centers around being a geek. I still get excited about nuanced technical things like operating systems, compilers, and algorithms for the same reason I get worked up about what Alan Greenspan has to say: there be potential in those words. Being fluent in the lingua franca of both technology and product marketing means I tend to meet interesting people with interesting technology who are baffled why "nobody gets it". Thus my opportunity: translate geek to marketing, marketing to geek, strategize, and drive execution.

Somewhere along the way, I find myself talking to a frustrated geek. We've translated his work into English, which is essentially saying we've gone from logical to logical. However, translating the nuances of marketing isn't so easy, because marketing isn't based on logic; rather, it is based on the expected responses of humans, and humans aren't always logical.

The essence of my conversation with the frustrated geek revolves around "Give me an algorithm! Give me a table! Please - wrap this stuff up in a pretty layer of logic for me to digest!" Unfortunately, it isn't that easy, and I end up giving two pieces of homework: read Positioning: The Battle for Your Mind and Made to Stick. While Positioning can be a bit nauseating at times, Made to Stick offers that touch of cynicism that makes it credible. Made to Stick is also written by two mildly geeky guys, which makes their use of language comfortable to the geeky reader. E.g., Gregorian vs. Julian calendars in date/time functions come up in chapter two.

Of course, neither achieves the goal of wrapping logic around human response, but both provide critical elements to understanding where we're headed and a lot of common stories for us to reference as needed. Both also give a series of concrete grips on an otherwise squishy topic.

After geeks have read both books, conversations on critical points that involve them almost always become more productive. Words like "simplicity" start to resonate much better, and they don't get nearly as frustrated about the fact that their buyer neither knows nor cares why most things work.

If you're dealing with a geek of your own or you are a geek looking to better understand the apparent lunacy of marketing, both books are recommended reading.

Sunday, August 26, 2007

Random Aside: Burning Man

I'm usually not a follower of ValleyWag, but sometimes you gotta appreciate the gems they post... Like this one:

"We've said it before, and we'll say it again: The only green Burner is a dead Burner. This year's Burning Man arts festival in the Nevada desert has an environmental theme. But an environmental analysis has shown that more than 90 percent of the carbon dioxide spewed by Burning Man participants comes merely in getting to and from Black Rock City, the festival's temporary site. So by all means, pack up your RVs, buy that planet-destroying bottled water, and run your stereos and air conditioning all week off of diesel generators as you celebrate the greening of Burning Man. Go ahead, claim that you're raising "awareness" -- at the same time that you're raising the planet's temperature. You're not fooling anyone -- least of all Mother Nature."

Virtual Machines vs. Bigger SMP

Okay, I admit it. SMP is cool. More processors in the machine make coolness rise exponentially, not just additively.

So why don't we have a bunch of big SMP machines running the enterprise? Well, actually, we do. While 90% of servers are at most 2-way x86 machines, they represent only about 50% of server revenue as of Q4 2006. The other 50% of the revenue comes from big SMP beasts that cost quite a bit more but represent far fewer machines. So big SMP is not only there, it is quite alive and kicking.

Why is it though, that we have come to the point where the ratio of physical machines is 9:1 in favor of small servers?

Ahhh... here is where a little IT experience goes a long way. :-)

IT, like any organization, has all of the pros and cons of being run by humans. One of the cons is that humans, for the most part, prefer small, succinct solutions to point problems. This makes the problem easier to comprehend and the solution more malleable. A server dedicated to DNS, for instance, is easy to digest. I know exactly what expertise I need to run the server, and the interaction between the environment and the application (DNS) is clear and well defined. There aren't other applications creating complexity. It is this same logic that has created the market for network appliances - one application with one hardware/OS combination and one owner who understands the whole system inside out. Thus, there is only one finger to point when things go amok, and that finger cannot be passed along to others.

Big integrated "God boxes" by comparison are a bit of an uphill climb. They require that the administrator truly grok all of the elements and their interactions. The adoption of a God box happens in one of three situations: (1) The interaction between the elements is so complex that it is clear to the administrator he will never comprehend it, and thus an integrated solution with one vendor is preferred, (2) The interaction between the elements is so well understood that the benefits of a highly integrated solution can be seen, or (3) There is a cost benefit so astounding that the administrator would be foolish to overlook it.

Integrated fax/printer/scanners are an example of this - the elements and their interactions are well understood, and there is a significant cost benefit to using them. This makes an administrator overlook the risk of losing all three functions if one of them goes bad. Integrated networking systems are in a similar boat, with a combination of cost benefit and simplification of an otherwise intensely complex system. By abstracting the complexity to within one system, there is now one vendor to point at.

In the world of systems administration, such integrations from vendors never really emerged. A support contract with Sun for instance guarantees that they'll replace/fix hardware and provide software bug fixes, but configuration hiccups are my own problem. If I choose to run many services on a server at once with a complex interaction, I'm responsible for their configuration and management.

In the early 90s, with no appliance vendors to speak of, the solution to de-risking complex configurations on Unix servers was simple - buy a few small servers instead of one large one. Each small server could then provide a single function, thus reducing complexity. If there was a problem with the software, there was no question of configurations interacting poorly with other applications. The rise of x86-based servers running Linux and Windows drove the transition home. Today, only those applications which mandate large SMP systems get them.

With servers doing fewer things and servers getting more powerful, the market for virtualization was created. We want to keep the compartmentalization of the application but gain the benefit of running multiple applications on a single CPU. As a result, the non-SMP to SMP ratio is likely to remain high and possibly get higher.

So amongst the SMP crowd, is there opportunity to eat into that market? Possibly... There are two approaches to further removing the need for large SMP systems: (1) SOA-ification of applications, and (2) creating virtual SMP clusters with commodity x86 hardware.

Let's start with item 2 first. Historically, creating virtual SMP machines has been a tough sell. The technology has been around since the early 80s in the form of MOSIX. Efforts around distributed shared memory in the early 90s furthered the process. Unfortunately, these efforts largely stayed with the academics. That is, until Qlusters came around in the early 2000s. Part of the team that started openMosix created a company around their effort so that they could sell a commercially supported implementation of MOSIX. Early adopters were in the HPC crowd, who needed the massive scalability of x86 clusters but didn't want to adapt their applications to clustering software like MPI or PVM. With MOSIX, they just wrote their programs as if they had infinite processor and memory space - a much easier proposition, especially for people who weren't programmers by nature (scientists, mathematicians, etc.). However, none of these projects ever really took off in the enterprise. This is most likely due to the fact that large ISVs never supported the configurations.
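The programming-model difference is worth a quick illustration. With MPI or PVM, the programmer partitions data and passes messages by hand; the single-system-image pitch is that you just spawn ordinary processes as if the machine were one big SMP box and let the system place them. Here's a rough sketch of that style in Python, using multiprocessing as a local stand-in for a MOSIX-style cluster that would transparently migrate the processes (the workload is made up for illustration):

```python
from multiprocessing import Pool

def simulate(seed):
    """A stand-in compute kernel -- e.g. one run of a scientist's model."""
    total = 0
    for i in range(1, 1000):
        total += (seed * i) % 7
    return total

def run_batch(seeds):
    # The programmer just asks for workers as if processors were infinite.
    # Under a single-system-image OS like MOSIX, these processes could be
    # migrated to other nodes in the cluster transparently -- no explicit
    # scatter/gather or message passing in the application code.
    with Pool() as pool:
        return pool.map(simulate, seeds)

if __name__ == "__main__":
    print(run_batch(range(8)))
```

The appeal for non-programmers is exactly this: the code is identical to what they'd write for a single big machine, and the clustering is someone else's problem.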

Another issue with virtual SMP clusters is host-to-host latency. If you have an application that needs to do a lot of random memory accesses, fetching memory from another host becomes a very expensive part of the equation. Low-latency fabrics like LLE and InfiniBand help with this, but add significantly to the cost of the overall solution.

Item 1, by comparison, is gaining serious momentum, especially with the ISVs. I've written about this before -- the use of SOA is partially the creation of Microsoft with their push on the .Net side, as well as the overall industry making a move to XML for everything. The great thing about SOA is that it compartmentalizes things in a way that classic system administrators like - one server doing one thing really well.

Which wins in the end? Well, SOA is without a doubt going to be a big part of the solution, but I'm not ready to write off virtual SMP yet. I think there are some startups doing neat work here, and there is the VMware card as well, as they are well poised to extend their virtual SMP model across multiple physical hosts.

Let the games continue...

Friday, August 03, 2007

Social bookmarking in business

I've been using social bookmarking for a few weeks now, and I have to say that I'm pleasantly surprised. Social bookmarking is a far more efficient way to pass links around to associates, friends, family, etc. than popping open another email, since the links can be pulled off the web site or taken from an RSS feed. Especially nice is the fact that readers can pull feeds by tag. For example, if I tag a link as "Sangeet" (meaning that the link is for my niece), my sister simply subscribes to the feed for that tag. Now my sister has the link bookmarked, and she can skip all of the other links that are unlikely to interest her.
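The tag-based filtering doing the work here is simple on the reader's side. As a sketch, assuming a standard RSS 2.0 feed where each item carries its tags as <category> elements (the feed content, titles, and URLs below are made up for illustration), pulling out just the items for one tag looks like this:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 snippet standing in for a real bookmarking feed.
# Tags ride along as <category> elements on each item.
SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>my bookmarks</title>
  <item><title>Mesh network QoS paper</title>
        <link>http://example.com/mesh-qos</link>
        <category>wireless</category><category>work</category></item>
  <item><title>Photos from the trip</title>
        <link>http://example.com/photos</link>
        <category>Sangeet</category></item>
</channel></rss>"""

def links_for_tag(feed_xml, tag):
    """Return (title, link) pairs for feed items carrying the given tag."""
    root = ET.fromstring(feed_xml)
    hits = []
    for item in root.iter("item"):
        tags = {c.text for c in item.findall("category")}
        if tag in tags:
            hits.append((item.findtext("title"), item.findtext("link")))
    return hits

# My sister's reader would surface only the items tagged for her:
print(links_for_tag(SAMPLE_FEED, "Sangeet"))
# → [('Photos from the trip', 'http://example.com/photos')]
```

In practice the bookmarking service does this server-side and hands out a per-tag feed URL, so the reader never even sees the unrelated items.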

Unless of course she is interested in a pragmatic QoS solution for wireless mesh networks.

Hint: she isn't.

But a business associate I'm working with is. He has to make a significant business decision and needed to understand some details around wireless mesh networks in order to make that decision.

This started me thinking about social bookmarking's use within business. We pass links around every day to coworkers and associates. We pass around useful things like articles related to our business. Occasionally we even pass around slightly less serious stuff, like how math nerds have solved checkers. But each time we pass something around, we add to the email noise pollution. Even worse, readers on their BlackBerrys and Treos are unlikely to follow through on links - even if the links are relevant and important. Bottom line: if the reader can't click on a link and immediately browse there on their desktop machine, they are unlikely to follow through. Plain and simple.

With the use of RSS feeds growing amongst the ranks of business users, the ability to plug my links into someone's RSS reader means they see the links when it's convenient for them - not when they're hopping cell stations on a fast-moving train that dives into and out of tunnels on a moment's notice. It's easier to get back to. It's easier to be reminded of the next time they pull up their news reader/bookmarks. It's easier because the link is no longer competing with 200 other emails that arrived the same day. It's easier because by subscribing to the feed, they're implicitly saying that they care about those links, since they control who they subscribe to and who they don't. Something they can't do with email.

Of course, the proof is in the pudding. This is a change in how people work, and I don't expect it to take off immediately. The fundamental step is accepting the use of RSS feeds - something that, ironically, the business side of technology companies has been slow to adopt. However, the notion of feeds is here to stay. Heck, some folks are even using them to deal with... *gasp* email overload.

I'll be trying it with some of my cohorts... stay tuned...

(if you subscribe to my blog's RSS feed that is...)