Today is a good day to code

The Biggest Trick that Jeff Bezos Ever Pulled

Posted: April 19th, 2012 | Filed under: Amazon, Apple, Companies, Facebook, Google

[Image: Boa Constrictor]

For reasons unknown, the tech media completely fails to give Jeff Bezos and Amazon the recognition they deserve. I believe this is due to a deliberate strategy executed by Amazon to quietly grab as much mind and market share as they can. If they continue on their trajectory, they may become unassailable; in fact, they may be already.

There are blogs and podcasts called things like Apple Insider, This Week in Google, MacBreak Weekly, etc. I have yet to hear about any blogs or podcasts about what Amazon is doing week in and week out, but in many ways it is much more interesting. Amazon now handles 1% of consumer internet traffic, pushing all of it through its near-ubiquitous compute cloud infrastructure. They are rapidly and efficiently dismantling existing retail. Amazon is probably on their way to completely owning web commerce. Amazon has massive amounts of data on what people have, want, and will want based on what they own and buy. Through their mobile applications they are gathering pricing signals from competitors, so that they can use their own cluster computing prowess to change prices on the spot.

What is shocking is that, despite this proficiency, no one discusses how absurdly dominant Amazon has become. Everyone just treats Amazon running all internet commerce, and large swaths of its infrastructure, as “the way it is.” Amazon is more a force of nature at this point than a company.

It isn’t just the tech media that doesn’t give them the credit they deserve; major tech companies don’t either. Google and Apple seem ready to laugh off the Kindle Fire while Amazon soaks up more signals. Microsoft doesn’t even try to match them. Google’s commerce efforts look half-baked compared to what Amazon does, and they show no signs of trying to do better.

With the bitter rivalries we constantly hear about between Apple and Google, Microsoft and Google, Microsoft and Apple, etc., it seems absurd that anyone would start a podcast about Amazon. Fifty years from now, technology changes will have toppled Apple, Google, Facebook, and Microsoft, but I’d bet that Amazon will still be around.

Jeff Bezos and his company wield algorithms and data more effectively than anyone else in the industry, despite all the credit we give Google for search. Their suggestion and comment-filtering algorithms are, bar none, the best around. Amazon is integrated into the fabric of our lives to a degree that no other tech company has achieved.

Amazon will keep doing what Amazon does best: being ruthless, being efficient, executing better than anyone else, and staying ahead of the curve. As long as we keep ignoring them, they are doing their job. The greatest trick Amazon ever pulled was convincing the world that they didn’t exist. They have convinced the world that they are just retail.


Google’s Vision of the Future is Correct… But They May Not Be The Ones Who Implement It

Posted: January 8th, 2011 | Filed under: Companies, Google, Lifestyle

On a drive from Colorado to Las Vegas this past week, my daughter and son were in the back seat of our car using my daughter’s netbook. She recently turned 7 years old, so I bought her a netbook and am starting to teach her how to code. My son wanted my daughter to change the video they were watching, and she began to explain to him how the internet works.

She told him that all of her stuff was on the internet (emphasis mine) and that the movie they were watching was the only one on her netbook. She explained that her computer was barely useful without the internet, that the internet came from the sky, and that her computer needed a clear view of the sky to receive it. Since we were in the car and the roof was obscuring said view, they couldn’t get the internet, and couldn’t change the movie.

Listening to this conversation gave me a bit of pause as I realized that to my children, the internet is an ethereal cloud that is always around them. To me it is a mess of wires, switches, and routers with an endpoint that has limited wireless capabilities. When I thought it through, however, I realized that my kids have never seen a time when someone had to plug in their computer to get to the web. Plugging in an ethernet cable is as old school as dial-up.

Once that sank in, I understood that the Cr-48, Google’s Chrome OS netbook, is a step in the right direction. But while I am very enthusiastic about several aspects of Google’s (and, in all fairness, others’) vision of a web-based future, I do not feel that the current approach will work.

A centralized system where all of a user’s data lives, and through which all communications pass, is not an architecturally sound approach. As the number of devices each user has goes up, the number, size, and types of connections are going to stress the servers exponentially.

It is already incredibly difficult to keep servers running at internet scale; we need entire redundant data centers to keep even small and simple web-scale endeavors running. When you take a step back, you realize that a system like Facebook is barely working; it takes constant vigilance and hands-on maintenance to keep it running. It isn’t like a body, where each additional bit adds structural soundness to the overall system; instead, each additional bit makes the system more unwieldy and pushes it closer to breaking.

Google is another example of a system near the breaking point. They are obviously struggling to keep their physical plant serving their users, and like Facebook they are so clever that they have always been able to meet each challenge and keep it running to date. But looking at the economics of it, the only reason this approach has been endorsed is that mining usage patterns and the data generated by users has been so wildly lucrative.

I don’t think this will continue to be the case as the web reaches ever larger groups of people. I don’t think any particular centralized infrastructure can scale to every person on the globe, with each individual generating and sharing petabytes of data each year, which is where we are going.

From a security and annoyance perspective, spam, malware, and spyware are going to be an ever-increasing, and more dangerous, threat. With so much data centralized in so few companies with such targeted reach, it is pretty easy to send viruses to specific people, or to gain access to specific individuals’ data. If an advertising company can use a platform to show an ad to you, why can’t a hacker or virus writer?

The other problem currently affecting Google severely, with Facebook next, is content spam. These are the parking pages that you come across when you mistype something in Google. Google should have removed these pages ages ago, but their policy allows them to exist. Look at all of the Stack Overflow clones out there: they add no real value, existing only to serve Google AdSense against Creative Commons content. What is annoying is that because of the ads, they take forever to load. With a search engine like DuckDuckGo things are better, but that is likely only because it is still small. DDG also says that it will not track its users; that is awesome, but how long will that last?

It is possible for a single altruistic person to algorithmically remove the crap from the web in their search engine, but eventually, it seems, everyone bows to commercial pressure and lets it in in one fashion or another.

Concentrating all of the advertising, content aggregation, and content in a couple of places seems nearsighted as well. The best way to make data robust is to distribute it; making Facebook the only place where you keep your pictures, or Google, or Apple for that matter, is probably a bad idea. Maybe it makes sense to use all three, but that is a nuisance, and these companies are not likely to ever really cooperate.

It seems to me that something more akin to Diaspora, with a little bit of Google Wave, XMPP, the iTunes App Store, and BitTorrent, is a better approach. Simply put, content needs to be pushed out to the edges, with small private clouds that are federated.

This destroys most of the value concentrated by the incumbents based on advertising, but creates the opportunity for the free market to bring its forces to bear on the web. If a particular user has content that is valuable, they can make it available for a fee. As long as a directory service can be created that allows people to find that content, and the ACLs for that content exist on, and are under the control of, the creator’s own node, that individual’s creation cannot be stolen.
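To make this less abstract, here is a tiny, purely hypothetical sketch in JavaScript of the shape I have in mind. Every name in it is invented for illustration; nothing here is an existing API. The one property that matters: the descriptor is published so people can find the content, while the ACL stays on, and under the control of, the creator’s node.

    // Hypothetical sketch: a federated node publishes a findable descriptor,
    // but keeps the content and its ACL on the creator's own machine.
    var myNode = {
        acls: {},
        setAcl: function (id, acl) { this.acls[id] = acl; } // never leaves the node
    };

    var directory = {
        entries: [],
        publish: function (d) { this.entries.push(d); }     // shared, searchable index
    };

    var descriptor = {
        id: 'sha1:9f2c0a',          // content hash, so copies are verifiable
        title: 'Vacation photos, 2010',
        owner: 'irv@example.com',
        price: 0.99,                // optional fee set by the creator
        endpoint: 'https://my-private-cloud.example.com/content/9f2c0a'
    };

    directory.publish(descriptor);                                  // findable by anyone
    myNode.setAcl(descriptor.id, { allow: ['family', 'friends'] }); // enforced at the edge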

Once the web is truly pervasive, this sort of system can be built. It will, however, require new runtimes, new languages, protocols, and operating systems. This approach is so disruptive that none of the existing large internet companies are likely to pursue it. I intend to work on it, but I’m so busy that it is difficult. Fortunately, however, my current endeavor has aspects that are helping me build skills that will be useful for this later, such as working with the BEAM (Erlang/OTP) VM.

The benefit is to individuals more than it is to companies. It is similar to the concept of a decentralized power grid: each node is a self-sufficient generator, and the system is nearly impossible to destroy as long as there is more than one node.


Stephen Wolfram’s Computational Knowledge Engine

Posted: March 9th, 2009 | Filed under: artificial intelligence

Ars Technica has no faith. They are already saying that Wolfram’s knowledge engine will fail, based, I’d imagine, on the complete and utter disaster that Cuil and other would-be Google challengers have been. Here’s why I think the computational knowledge engine can be a success.

First of all, it’s Stephen Wolfram, who truly shouldn’t be underestimated. He is also not claiming that it can cure cancer; really, he isn’t saying what it can do, or what its ultimate goal is, except that it is going to answer simple questions. I don’t understand why this is impossible. Technology is clearly accelerating at a near-exponential rate. The same improvement in technology and science that happened between 1997 and 2000 was probably accomplished again by June 2002, and so on. If you accept that, then you have to believe that at some point soon we should get to an intelligent system that can answer a simple question like what color the sky is. Not by looking it up in a database, but by actually reasoning out the answer.

I think that Ars isn’t giving these guys enough credit.  I can’t wait to see what they have cooked up.


A Response To: “The CSS Corner: Using Filters In IE8”

Posted: February 23rd, 2009 | Filed under: Companies, CSS, Microsoft

Well, the IE team has posted an excuse for why IE 8 will not handle widely used CSS 3 extensions. The reason: it’s hard, and it was a stretch goal. Instead we are left with a slightly more standard implementation of the filter CSS property, -ms-filter, as opposed to filter.
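To make the duplication concrete, here is what an author targeting every version of IE plus the standards-based browsers ends up writing. The class name is hypothetical, but the quoted -ms-filter form is the IE8 syntax the IE team’s post refers to:

    .translucent {
        opacity: 0.5;               /* standards-based browsers */
        filter: alpha(opacity=50);  /* legacy IE filter, IE 4-7 */
        -ms-filter: "progid:DXImageTransform.Microsoft.Alpha(Opacity=50)"; /* IE8 */
    }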

Furthermore, the IE team claims that they are doing this so that “web authors do not have to rewrite their stylesheets”. 

OK. Let’s look at this objectively. It is indeed hard; building a web browser from scratch is no joke. I have tried several times, and I am still trying to build a web browser. I have tried this in C++, Java, even Ruby. It is always hard. Most of the difficulty comes from trying to render pages that aren’t formatted properly. Right or wrong, that is how the web is currently built. However, I have a radical solution, and I apologize in advance for the shout: *USE WEBKIT*. Why is this a problem? It would be easy to use the standard msie7.dll, or whatever it is, for pages that need the *broken* button in IE 8, and then use a new WebKit-based renderer, mswebkit.dll, for pages that are standards compliant or not using that strange IE 7 tag. If Multiple IEs works, this would be completely possible.

Let’s take a quick look at why Microsoft might not want to do this: Google uses WebKit, and Apple uses WebKit. As far as the technical difficulty goes, many lesser organizations have implemented a WebKit-based browser from the WebKit source without hiring a million developers. I think that an organization like Microsoft should be able to handle building a browser using, or based on, WebKit within a few months. I wish Microsoft could occasionally be more like Google and throw out the product managers and just build what the world wants. I don’t understand why they can’t consider this.

Now, about the claim that this is so that “web authors do not have to rewrite their stylesheets”: I am a web author, and I will not rewrite my stylesheet. IE users, I am sorry, you will just have to live with a broken layout. I do not have the time or the interest to rewrite my cool, cutting-edge web applications to work with 10-year-old technology. They said this stuff was originally written for IE 4, which came out for the PC in 1997! Come on, advance! I will not write anything for IE. I will make sure my applications function, and that none of the tasks a user would do in them are blocked, but I am not going to try to give them rounded rects or opacity if IE doesn’t support web standards. That sentence alone indicates Microsoft’s hubris; note the “have.” If it were Mozilla, they would say that web authors don’t *want* to rewrite their stylesheets, not assume they would ever have that problem. Microsoft is still pretending that IE is relevant as far as developer mindshare.

Microsoft does some amazing things, but as far as the web is concerned, it is pretty much off my radar.  Users, please, please upgrade your browser to Chrome, Firefox, or Safari.


NBC Still Doesn’t Get It

Posted: February 23rd, 2009 | Filed under: Hulu, Media, NBC

I was wondering how long it would be before NBC started to bite the hand that feeds it with Hulu.  I still think that eventually NBC will completely kill it, with help from Comcast.

First of all, I was amazed that Hulu was allowed to exist, and once it did, I started to count the days until it was killed. Not that I don’t love Hulu, I do. I think that Hulu, Joost, and other sites that blend big media programs with net content are awesome. They allow me to watch television again.

For several years, I didn’t care what was on TV; I didn’t really watch TV. All I did was watch Netflix and YouTube when I wanted to consume video. I know I am not the only one who doesn’t have time to sit down and watch my favorite TV shows in primetime. I don’t even know when most of the shows I like air. Before Hulu and Joost, I didn’t even care; I just stopped watching.

What I can’t understand is why NBC seems not to grasp that no one wants to watch TV at preset times anymore. Not to mention that if I am going to be advertised to, I don’t want to pay for the “privilege” of watching shows at a time of my own choosing.

As far as Comcast is concerned, I can’t believe that it is a mistake that every time I watch a show on Hulu, now that Comcast has run my local ISP out of business and bought it at bargain-basement prices, all I see are US Military and Comcast ads. Comcast, I am not going to pay more for your cable package, I am not going to pay for your “on demand,” and as soon as there is an alternative that can provide some semblance of decent speed, I am not going to pay for your compromised internet. They claim they are prioritizing packets to ensure network integrity, but Hulu is much slower than it used to be, even while speed tests show something like 14 Mbps burst downloads. That doesn’t make sense. I went from 4.5 Mbps down to 6 Mbps with a 14 Mbps burst, and it is slower? I have all this speed, but I can’t use it for anything… Fail…

By removing its content from Hulu affiliate sites, NBC is proving that they don’t get that consumers want to consume video the way they want to consume it. I am seriously considering just buying this stuff from iTunes and being done with it. CBS gets it and is doing a good job; the only problem is that they just don’t have the content.

I think they must believe that if people can’t get NBC content anywhere except on TV, they will just sit in front of the boob tube and watch it, but they are wrong. People will stop knowing about the shows and will begin to look for alternatives like video games, or short indie programs readily available on online-only networks like ON Networks, Revision3, etc. I already consume way more video podcasts than TV shows anyway; it wouldn’t take much for me to just drop TV entirely. What would that do to NBC’s ad revenues? Comcast needs to get a clue and realize that they are a dumb pipe; they need to forget about the coax business and get into the TV-over-IP business. If they want to compete, how about creating their own quality content to earn the ad business, instead of crapifying my internet connection and spamming me to try to get me to embrace their dying business model?

NBC will never get it. They need to just go away. I like a few of their programs, like Battlestar and Heroes, but I am not sure that they are worth the effort, especially with iTunes and Netflix around. If they take those away, well, I just don’t know; perhaps I’ll have to write and produce my own sci-fi stories.

NBC (Hulu) Removes content from Boxee


Internet Explorer 6 Hangs with Multiple Connections

Filed under: ColdFusion, JavaScript, Programming, Uncategorized


At work we are using the Demis map server, which by itself is an incredible application. We had built a Flash-based client as our application, to allow people to see images overlaid on top of the vector data digested by the map server. One of the issues we had observed was that the application tended to hang, or stop responding, when a user asked for many images to be shown on top of the vector map and then navigated away from the current screen. Since I had seen the code, and it was a mess of JavaScript setting cookies that ColdFusion was supposed to read and pass to Flash, and images for checkboxes, I automatically suspected the code. However, the problem was deeper than that.

The code needs to be rewritten, no doubt; there are many more efficiencies to be had. But that didn't explain the hang. I combed over the server, watching its response while a user was using the application. The map server stresses the machine, because it needs a ton of I/O, and it would spike the CPU frequently, but no process went to 99% CPU utilization, and the server seemed to respond to other clients even when one of them was hung up. It was pretty clear then that the problem wasn't with the server. To take this logic a little further, we built a load test using wget, saving the result to a file. We looped over the calls as fast as we could, and we never caused the map server to hang. It performed as expected.
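The load test was nothing fancy; conceptually it was just a shell loop like the sketch below, where the map server URL is a made-up stand-in for our real one:

    #!/bin/sh
    # Request the same rendered map as fast as possible, forever, saving
    # each response to a file so we could also inspect it for corruption.
    while true; do
        wget -q -O result.png "http://mapserver.example.com/render?layers=vector"
    done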

The next logical step was to look at the possibility of corrupt files. We did notice that we could get the map server to crash when we fed it corrupt files, but we found no evidence that the files we were using in production were corrupt in any way. At this point we were plenty dejected, because we had spent something like 35 hours over a couple of days working on this problem and we had nothing. We performed a new ColdFusion install on a different server, we built a server with better hardware, we reinstalled the map server application multiple times; nothing seemed to affect it. We even improved the network bandwidth available to the client; still nothing. At that point I was down to either the code or the client.

To test this theory, I commented out all of the Flash calls on every page and went through the application trying to cause the system to hang. I couldn't do it, so I had effectively narrowed the possible cause to the Flash movie. I started to go through what the Flash movie was doing, and what could cause it to fail. The Demis people told us that they had seen hangs when the map server wasn't responding and the Flash player was parsing XML. This led me to try the application in Firefox, and lo and behold, it never hung up. It worked like a charm. The only problem was that our client was set on Microsoft Internet Explorer.

I set about the arduous task of removing all XML parsing from the Flash code, then tried it, and it still hung. I was truly disappointed, but I rethought what was happening with the XML. It was making server calls, and I realized that I could have up to 8 concurrent connections going at once. At the time I thought nothing of it, but then I started trying to find out what was different between Internet Explorer and Firefox. I happened upon an article on MSDN about a known bug where Internet Explorer will hang for 5 minutes when there are 2 persistent connections to a server and rich content is being downloaded. I had found my culprit. It turns out that I had to add 2 keys to the registry: MaxConnectionsPerServer and MaxConnectionsPer1_0Server. I set the latter to 8 and the former to 24 hexadecimal (0x24, or 36 decimal). The keys need to be DWORDs.

That would allow 8 connections for HTTP 1.0 and 36 connections for HTTP 1.1. The HTTP 1.1 guidelines recommend that only 2 connections be allowed, but if Firefox wasn't adhering to that, why should I? I added the keys under HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings and it worked like a charm. Everything was perfect. Talk about looking for a needle in a haystack. I'm still amazed that I found it.
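If you want to apply the same fix without editing the registry by hand, the two values can be dropped into a .reg file like the one below. Note that dword values in .reg files are hexadecimal, so 00000024 is the 0x24 (36 decimal) I mentioned above:

    Windows Registry Editor Version 5.00

    [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
    "MaxConnectionsPerServer"=dword:00000024
    "MaxConnectionsPer1_0Server"=dword:00000008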

The purpose of this entry is so that no one else has to go through the week that I just went through. Generally no software should be in front of the client before it is ready, but in this case we already had a client. Hopefully this will help anyone out there who is experiencing hangs in Internet Explorer. Darn Microsoft and not fixing bugs for almost 3 years!

*EDIT Make that 8 years, since IE 8 appears to still suffer from the same problem!*

Here are some helpful links that might be better at explaining than I am…

Wininet Connection Issue

IE Hang Issue


Macromedia / Adobe Flash and AJAX: Companions or Adversaries

Filed under: JavaScript, Programming, Uncategorized


One of the hottest new things in web development right now is pretty old. JavaScript is taking the world by storm through the XMLHttpRequest object. My question is, isn't this exactly what Flash MX was designed to do?

I have only been working with Flash for about three and a half years, and one of the first things that drew me to it was the ability to get and post to other pages without a page refresh. Flash was designed to do this from the beginning. With the ColdFusion Flash gateway, developers can even directly access CFCs and other template pages. The question, then, is do we really need AJAX?

I think so. One of the benefits of using AJAX is that it is possible to create standards-compliant web pages that are more dependent on the resources of the client and less on the server. Back in the nineties, it was much better to rely on the servers because they often had more computing power, but now desktops are very powerful and most can handle the rigors of sorting and validating data. These are probably some of the more banal uses of AJAX, but they are things that should be handled by the client and not the server.
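For anyone who hasn't seen it at the code level, a bare-bones AJAX call looks roughly like the sketch below. The endpoint is a hypothetical ColdFusion template of my own invention, and the ActiveX branch is what Internet Explorer 5 and 6 require in place of the native XMLHttpRequest object:

    // Create the request object, falling back to the ActiveX control
    // that Internet Explorer 5 and 6 use instead of XMLHttpRequest.
    function createRequest() {
        if (window.XMLHttpRequest) {
            return new XMLHttpRequest();
        } else if (window.ActiveXObject) {
            return new ActiveXObject("Microsoft.XMLHTTP");
        }
        return null;
    }

    var request = createRequest();
    request.onreadystatechange = function () {
        // readyState 4 means the response has fully arrived.
        if (request.readyState == 4 && request.status == 200) {
            document.getElementById("results").innerHTML = request.responseText;
        }
    };
    request.open("GET", "/sortData.cfm?field=name", true); // async, no page refresh
    request.send(null);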

There will be some overlap between AJAX and Flash. Many in the AJAX camp will claim that AJAX is much lighter than Flash as far as bandwidth is concerned, and I can see that poorly designed Flash will take more bandwidth than well-designed Flash. It is possible to draw components with ActionScript, which puts the drawing entirely up to the client, with the Flash movie being mostly just compressed script. If AJAX needs to use graphics, it has to send them during the initial download, referenced via CSS; afterwards, those images will be available as long as they are in the browser's cache. It is even possible, as it is in Flash, to have the initial page appear while still downloading components.

I think that for some projects AJAX will be the technology of choice, but for others Flash MX will be optimal. Personally, I believe that for most of the jobs you could do with AJAX, Flash will be the faster solution because of the well-designed nature of the IDE. Flash is now a platform, and the Flash Development Environment is the tool. Macromedia is going to embrace Eclipse to try to get Java developers to see the benefits of creating web applications with Flash. I think that in the long run Flash is a good bet, and that AJAX is sort of a fad that will become less and less of a good choice as bandwidth becomes more available. I like a lot of what is happening with AJAX, and hopefully the developers of Flash will keep working toward accessibility. But in the end, well-designed Flash applications are hard to beat. They don't need screen refreshes, the Macromedia components are well designed and will often take XML as their data source, and the applications allow more interface flexibility than traditional CSS (although this is changing), all of which leads to a better user experience.

So why do I bash Flash constantly? My negativity where Flash is concerned comes from having to endure many, many very poor Flash websites and applications that use Flash just because it moves. The developers often spend little or no time working with the ActionScript, and they don't plan for low-bandwidth users. Many Flash developers believe that dial-up and ISDN / mobile users don't matter, and that is simply bankrupt thinking. Developers should plan and develop for the least common denominator. A light design can still be a good design, and is often, in my opinion, the best design. AJAX lends itself to better developer practices through its complexity, but I don't believe that complexity is ever a good solution to a problem. Perhaps with the introduction of AJAX tools and an IDE this complexity could be improved upon. We are already seeing the beginnings of AJAX in web applications, and they are quite impressive, but most of the impressiveness comes from the fact that they are doing it without Flash, not from the applications themselves.

The fact is that over 90% of browsers on the web have the Flash plugin, and it is a relatively small and fast download. If you want to design really solid applications, take everything you have learned about minimal design and apply it to Flash development. Perhaps then Flash can turn its negative image around and become a real tool for business solutions.

About AJAX
Flash Remoting LiveDocs


I Finally Found a Real-World Use For AJAX

Filed under: JavaScript, Programming, Uncategorized


I'm working on a project now that makes heavy use of standard JavaScript, Flash, ColdFusion, and HTML. The project uses the TCallFrame JavaScript function to control the Flash movie. Because of the nature of the project, the timing between the movie refresh calls makes using a refreshing iFrame impossible for the text information I need to display. Likewise, the text needs to be displayed using complicated styles and colors, controlled by ColdFusion components, that Flash can't handle easily.

So I was faced with a dilemma. I couldn't refresh an iFrame without throwing off the timing of the TCallFrame request, combined with some JavaScript cookie writing and Flash Remoting calls (I didn't write the original application). Basically, nothing should happen until the cookie is written, which, most of the time, it isn't.

It eventually dawned on me not only that the application needed to be strengthened, with some type of cookie listener at the very least and a small-scale rewrite at the most, but also that I only really needed the application to think there was no iFrame refresh; an AJAX-style call using an invisible iFrame could work.

So I began working on it, and it evolved into a two-part system. On one hand, the div that contained the text returned from the ColdFusion components needed to be continuously refreshed, which I found out after struggling with another timing issue. Eventually I realized that the variable that needed to be set in my parent window was empty at the time of the screen refresh because the iFrame hadn't finished loading yet.

So what I had to have happen was for the iFrame to make a JavaScript function call to the parent once the variable was written. Once I had that accomplished, it was easy to have the checkboxes that called TCallFrame also call the function to refresh the iFrame. It worked wonderfully, but unfortunately it still caused a timing issue with the main movie; it delayed the writing of the cookie by a hair. Still, it was cool to see that I had an actual use for AJAX.
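The skeleton of the final arrangement looked something like the sketch below. All of the names are hypothetical, since I can't reproduce the actual application's code here:

    // Parent page: reload the invisible iFrame that fetches freshly
    // styled markup from the ColdFusion component; the timestamp
    // defeats the browser cache.
    function refreshHiddenFrame() {
        document.getElementById("dataFrame").src =
            "/getStyledText.cfm?ts=" + new Date().getTime();
    }

    // Called by the iFrame's own page once its variable is actually set,
    // so the parent never reads the value before the frame finishes loading.
    function onFrameReady(html) {
        document.getElementById("textPanel").innerHTML = html;
    }

    // ...and the last statement inside the iFrame's page is simply:
    // parent.onFrameReady(generatedHtml);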

I have actually found other uses for it, but none as clear-cut as that. Flash just wasn't flexible enough. I'm not reversing myself; I still think that Flash should be most developers' first choice when it comes to remoting, but in a pinch AJAX is all right!


Why Flash is Still Oh, So Wrong for So Many

Filed under: JavaScript, Programming, Uncategorized


I have recently come under some minor pressure from various factions about why, while knowing Flash fairly well, I am always reluctant to design and build a site featuring the technology. My history with Flash is pretty much the same as most other developers'. The first versions of this very site, three or four years ago, were made entirely in Flash, as were many of my customers' sites. Flash seemed like the way to go. It rendered the same in every browser, fonts weren't an issue, and it allowed an incredible amount of freedom to create.

So why, then, were my sites so problematic? The first issue was one of bandwidth. I had music and lots of motion on these sites. They were extremely interactive and eye-catching. The problems came up when users had to come to my site using dial-up. When they hit the site and saw the loading bar, the first thing they did was click back and go on to another site. My WebTrends reports illuminated this for me. My next step was to go more minimal, which is my favorite thing to do, but then I wondered why I was using Flash at all, because now the motion was mostly gone, and so was the majority of the interactivity. I was using Flash simply for the z-index, and I was finding that I could do this with CSS. So, not to be deterred, I did another redesign that kept the motion and interactivity but minimized the huge bitmap graphics that were giving me the long download times. Instead, I used vector graphics. These were much smaller, but now I had a new problem: if my clients didn't have at least a Pentium 4 running at greater than 2 GHz, my site ran slowly, so slowly that it was almost unusable.

The next issue was that in all the time I had my site, I could never find it using search engines. I discovered that search engines couldn't index my site because they couldn't see through the Flash. To the spider, my site looked like a huge GIF in an HTML file with some meta tags. In other words, it looked like nothing. I tried alt tags, noscript tags, etc., but nothing helped. Finally, I decided to design an alternate site for dial-up users using good ol' XHTML and CSS. I found that as soon as I uploaded the file, the search engines had me, and no one ever visited my Flash site anymore.

Suffice it to say that I took my Flash site down. Later, I redesigned my site again so that it would adhere to web standards and could render even faster for all users. That site is this one, and it is the first that I am happy with. I am enjoying some minor success with getting listed on search engines and blog aggregators, and life is good.

I don't hate Flash any more than I hate Allen wrenches or crowbars. It is a tool, and typically you try to use the right tool for the job. It seems to me that many web developers, however, are trying to use a sledgehammer to staple two pages together. It just doesn't work. In some cases Flash is OK. In corporate settings, Flash is an excellent tool for presentations, product demonstrations, and promotional materials delivered through the company intranet or from the presenter's local hard drive, as long as it doesn't have to be delivered over the web.

There are a few cases where it is perfectly reasonable for designer / developers to build Flash-only web sites for people. Art sites, such as photography showcases, can benefit from Flash and its fantastic bitmap compression. Flash photography sites can often download faster than their HTML / CSS counterparts due to smaller image sizes. Some product demonstrations can benefit from Flash and its interactivity; many cellular phone providers have used Flash to great effect in this regard. Simple branding banners contained within standard HTML / CSS pages, with limited motion and interactivity, can be excellent, as long as the text of the page is available for the user to read while the Flash is loading.

Still, designers and developers need to ask themselves: what exactly am I trying to do, and who is my target customer? I have had a very hard time making a solid business case for Flash on most of my ecommerce and business sites. Flash, like ColdFusion and chess, takes only a minute to learn and a lifetime to master. There is a lot to Flash, and a good designer knows how and when to use it to make a site look more professional, or to enhance content that might otherwise appear bland. However, beginners tend to develop only in Flash because it addresses many of the apparent problems with XHTML / CSS: browser incompatibility, having to learn JavaScript, etc. Someone with limited knowledge of ActionScript and no knowledge of HTML is able to open Flash MX 2004 and create a website. Many designers use Flash exclusively for this reason.

It seems that XHTML / CSS / JavaScript is having a renaissance. With the proliferation of blog sites and better browser support for web standards, many Flash sites are starting to look tired. Compared with the relatively quick response of the HTML sites, many users are deciding to click away from the loading screens in favor of a site with similar content or products that is designed in standards-compliant XHTML. Not because they love web standards, but because to the user the XHTML site works better and they don't have to wait. I have actually heard designers say that they don't care if dial-up users can't access the site; it has to be beautiful. This thinking is bankrupt: probably 80% of the country is still using dial-up. Broadband is still frequently ridiculously expensive, and until this changes, Flash will be limited mostly to design and car sites, while the bulk of the web is built using XHTML.

I'd actually like to see that change. I'd like to see 3 Mbps synchronous connections standard in every home across the country, and Flash sites loading instantly, but the reality is that it won't happen within the next 5 to 10 years. At least not until the garbage cable companies decide to charge reasonable rates, build better fiber backbones, and provide adequate DNS resources.

In the meantime, I'm quite happy with CSS / XHTML. It does everything I used to do with Flash, but it does it faster and is more accessible. Hopefully more designers will build standards-compliant sites, and will realize they can be every bit as beautiful as Flash sites. Check out csszengarden.com to see other great CSS designs.


Microsoft IE Developer Toolbar

Filed under: JavaScript, Programming, Uncategorized


I didn't even know about this, and it has been out for about a month. Microsoft has heard the cries from web developers used to using Firefox's developer toolbar extension. While it is often pretty easy to validate your pages using Firefox, see how your block-level elements are behaving, and look at the DOM of your page using the Firefox extension, it has been almost impossible with the awful lack of tools for Internet Explorer. They have finally addressed this.

The new IE Developer Toolbar has almost everything that its Firefox counterpart has, except for the strong JavaScript debugger. This is very upsetting, especially considering the lame debugging that is built into IE today, but with the relative dearth of tools for Internet Explorer, anything is welcome.

I have found the toolbar to be extremely useful. The DOM inspector is wonderful in that it highlights the selected item, if visible, to indicate which item's properties you are viewing. If you have to build applications or websites for Internet Explorer at work (I hope you are designing for Firefox at home… no, I guess you always have to design for Internet Explorer), then you will love the new toolbar. I'd suggest that you download it and install it right away.

IE Developer Toolbar