Today is a good day to code

The Future of Scripting

Filed under: ColdFusion, Companies, Microsoft, Programming, Sun Microsystems, Uncategorized | 1 Comment

Initially I wanted to stay away from scripting languages as a developer because they weren't really programming languages at all. For some time I was reluctant to even call myself a programmer, at least until I had built my first Java desktop application. In CNET's open-source blog today, they ask the question: has scripting peaked?

Scripting hasn't peaked yet. The reason is clear: building a web site with C++ or Java is like driving an armored tank to your mailbox. It is that ridiculous. The funny thing is that even Microsoft realizes this, giving their ASP.NET developers two languages to choose from when developing web applications. There are many reasons for enterprises to choose C# over Visual Basic when building a web application, especially if they already have desktop and client-server applications built with the technology; it would be possible to reuse many of the methods from the desktop application in the web application. The frameworks built into J2EE as well as C# allow for robust development, making it less likely that a developer will lose control of their code. Still, using these technologies and frameworks where a scripting language and a light framework would do adds unnecessary overhead to a project and can push deadlines out unreasonably.

Here's what I see. PHP is a fantastic scripting language that has no real back end, and is therefore suitable for light to moderate customer-facing websites and some intranet applications. Use of PHP in this regard will only continue to grow. I think some of the reported 25% decline in worldwide use is a reaction to PHP's early security vulnerabilities. PHP is losing ground quickly to ASP.NET and VB scripting as Microsoft's Server 2003 is more widely adopted. Personally I think that LAMP is superior for many tasks, but ASP.NET is almost ubiquitous now, and hosting and maintenance are cheap. I'll continue to use PHP for light jobs, but at the same time I realize that this is just a preference; performance-wise ASP.NET is better. Speaking of Java: Sun needs to buy ColdFusion from Macromedia / Adobe. It should be THE Java application server. There is no cleaner and easier scripting language, and it has nearly unlimited flexibility and is design-pattern friendly. Why this move hasn't occurred yet is beyond me. It would have made sense for Macromedia to sell it, but I think the issue is that Sun has many proud engineers who love to overdevelop products; the thought of supporting something as business-friendly as ColdFusion probably makes them sick. On Macromedia's side, they probably see the big picture: there are big bucks in ColdFusion, especially now that enterprises are seeing it as a way to get around JSP's notoriously long development cycles.

I see scripting as having a bright future, and I'll side with Zend's guys in saying that, regardless of how the Evans study got its numbers, PHP is increasing in use, not decreasing. I'm not sure if it is true, but if the next version of IIS is going to have PHP support built in, I'll seriously consider going with a Microsoft server in the near future and running it alongside ColdFusion. I like PHP, but I just like ColdFusion better.

news.com – Scripting's demise


What is this Y!Q stuff?

Filed under: JavaScript, Programming, Uncategorized | No Comments

You may have noticed the Y!Q links everywhere on my site. Y!Q is a new beta product from Yahoo! that lets people perform web searches constrained by selected content from the page they are searching from. The content that goes to Yahoo! is selected by the publisher and targeted to return even more relevant results than would be possible going directly to the search engine.

When a user visits a search engine, the system has no background about the person with which to constrain the results, which makes it difficult to perform a good search. For example, if I knew someone was from Washington State and they typed in the word apple, I could assume they might be looking for apple wholesalers, apple growers, or apple trees. If someone from California searched for the word apple, I might return the company. This is only possible if you know something about the person who is searching, which is why personalized search has been receiving more focus of late.

I prefer the context-based approach, because then I don't have to provide any personal information for the search engine to give me what I want. It would know just from the content of the web page I am searching from.

I'll be honing the ColdFusion parsing scripts to give the best possible content to Yahoo! I'll be removing words that are fewer than four characters long from the article, to get rid of word fragments and words that carry little meaning, like 'the.' I hope to have the best, most relevant results, because Yahoo! is offering $5,000 in their contest. Of course there had to be some motive for me to use this beta program!
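As a rough sketch of that filtering step (in JavaScript here rather than the ColdFusion running on the site, and using the four-character threshold described above), the idea looks something like this:

```javascript
// Drop words shorter than four letters before sending page content to Y!Q,
// so only the more meaningful terms reach Yahoo!'s context engine.
function stripShortWords(text) {
  return text
    .split(/\s+/)
    // Count only letter/digit characters, so "the." is measured as "the".
    .filter(function (word) {
      return word.replace(/\W/g, "").length >= 4;
    })
    .join(" ");
}

// stripShortWords("the future of scripting is now") -> "future scripting"
```

A word like "the." loses its punctuation before being measured, so trailing periods and commas don't sneak short words past the filter.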

I suppose that in its final iteration, Yahoo! will create some type of advertising revenue-sharing model similar to Google's AdWords. They seem to be hoping that it will generate more clicks because of its usefulness to the user. It is still kind of buggy. For example, in all browsers other than Safari 2.0, a semi-transparent overlay pops up when the Y!Q link is pressed; on Safari, it takes you to Yahoo!'s relevant-results page. Hopefully they will fix this soon; I'm pretty sure it has something to do with the changes Apple made to Safari's JavaScript engine. Also, since I am trying to automate this, sometimes a stray character gets into the string and causes the Y!Q to return something invalid. I hope this will help with your searching.


Pondering Switching the Other Way

Filed under: Apple, ColdFusion, Programming, Uncategorized | No Comments

While I have been a Mac user almost exclusively for the past five years, I have been thinking lately about switching back to a PC. The reasons truly stem from my need for the ultimate in geekery. I'd really like to get a dual-core Pentium 4. The tremendous advantage is that these cores also employ Hyper-Threading, which to the OS looks like four discrete CPUs. Also, I have the urge to work in several 3D programs, not the least of which is Swift3D, which I have noticed run significantly faster on the newest Intel- and AMD-based machines than on the Macs.

But the Macs are going to go Intel, you say. That is true, but the Mac prices aren't going to change; that is almost guaranteed. There is no way Apple is giving up its hardware margins, nor should it. I have a choice, and I can get more bang for my buck going with a PC. This has always been true, but at one time I was happy with an iMac G3. The iMac has always been competitively priced relative to its PC counterparts, so I was content. When I first bought my G5 I was relatively content too. Now the issue is that G5s cost about $2,000 at the entry level. I can take that money to Dell and get a dual-core P4 that will take the G5's lunch money on any given day, albeit with several crashes along the way.

But you buy a Mac for the software; that is why it is worth it. This is true, and Mac OS X is definitely superior to Windows XP and probably to its successor, Windows Vista. I will miss it, but running Mac OS X does not enhance my productivity in any tangible way; it just looks better, and the OS as a whole crashes less. Application crashes I have had plenty of, and those are about the same on both platforms.

What it comes down to is my current computing needs vs. my wallet, and in that game the Mac is at a severe disadvantage. We won't even talk about gaming. But the ultimate reason is my geekiness. I have a weak spot for Visual Studio 2005. After using several betas of the application via Microsoft's Express beta program, I have to say I am impressed with the ease of developing in C# in this IDE. Their visual web developer software is equally compelling, although unless I had to, I wouldn't use ASP.NET for much of anything. Not because it is bad, but because it takes so much longer to develop anything in than ColdFusion or PHP. Ultimately, my love of new technology and my desire to retain as much of my cash as possible are fueling this internal debate. I will probably not buy another Mac because of the cost, but at the same time I will not give up my iBook. I'll probably carry a Mac laptop for the foreseeable future. My workhorse, the desktop, however, is definitely another story.


JoostBook – Joost to Facebook Interface Widget

Filed under: Java, JavaScript, Programming, Uncategorized | No Comments

Since I'm in love with Joost, I have been thinking about good applications that I could write for the platform. Before I get into the widget / plugin, let me just say that my experience communicating with the Joost engineers, through their joost-dev Google group, as well as their allowing early access to their SDK, has been outstanding. I have rarely come across a more open and generous group. Typically, SDK guardians are very guarded about discussing future features, and are usually quite arrogant about the possibility of a developer finding an undiscovered bug. None of this has been the case with the Joost SDK staff.

If you don't want to read the details about how I built it and just want to use it, you can get it here: JoostBook: Joost / Facebook Interface. You will need Joost and a Facebook account to get started.

Now, about the widget. Firstly, the installation is a little weird because of the level of control Facebook insists on. In order to use the SDK, you have to authenticate; if an unauthenticated request is made, the response is the Facebook login page. This makes for some unique error-catching conditions.

Secondly, we web developers often take for granted that the DOM will have a listener attached to it and will automatically refresh if anything in it changes. I know the Joost engineers are working on it, but currently it doesn't refresh; while you can create new XHTML elements and modify existing ones with JavaScript, the changes may not be rendered. You are best off hardcoding all of your objects up front and changing their contents. Injecting XHTML using innerHTML doesn't really work so well currently either. I'd suspect that much of this is because there is a bridge between the 2D world of XULRunner / Mozilla and the 3D world of the Joost interface; I'm sure there is a lot of complexity between the two.
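The workaround described here, declare your nodes up front and only mutate their contents, can be sketched roughly like this. This is a hedged illustration using stand-in objects: the real widget would grab elements already present in its XHTML (for example with document.getElementById), and the names here are invented.

```javascript
// Stand-ins for elements that would be hardcoded in the widget's markup;
// in a real page these would be references to pre-declared nodes.
var titleLine  = { textContent: "" };
var statusLine = { textContent: "" };

function showNowPlaying(title) {
  // Mutate existing nodes rather than injecting markup with innerHTML,
  // which (per the post) is not rendered reliably in the Joost runtime.
  titleLine.textContent = title;
  statusLine.textContent = "Now playing";
}

showNowPlaying("Example Program");
```

The point is the pattern, not the stubs: every element the widget will ever need exists from the start, and updates only ever change text, never structure.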

So basically, once you have downloaded Joost and installed the plugin, the first thing the widget has to do is check whether you are logged in. If you aren't, it has to show you the Facebook login page in an iframe so that the XULRunner browser can be cookied. After that, the widget should work as you would expect. You may have to log in a lot, and if you aren't logged in, obviously the application can't update the JoostBook Facebook application.

Writing the Joost plugin was the easy part; getting the Facebook side to work was the hard part, mostly because the error handling is terrible. Since Facebook doesn't let you see the 500 errors your server is throwing, and doesn't log them, you have to find other ways to check whether your server is behaving properly. I spent a lot of time in my logs checking for errors.

The install process is a little weird too. For example, in Firefox 2.0.0.8 on Windows XP, when I clicked on the .joda file linked in the page, the browser tried to open it as if it were some kind of markup file; obviously the .joda looked like garbage, and I had to right-click and save. Perhaps if I had used a joost:// link it would have worked OK, but I think more research is in order. I didn't really try it in IE because most of the readers of this blog use Firefox, but it should work the same way.

Then, having to install the application in Facebook can be a little difficult as well. Well, the installation isn't difficult; it's the concept that you have to install two applications that work together that is hard. At least there is no particular order in which you need to install them; worst case, whenever you run the JoostBook plugin in Joost, it'll show you the Facebook login page every time.

At any rate, it was a fun experience, and I still think the guys at Joost are on to something. I'm slightly less psyched about the Facebook platform, but I'm still excited about it.


Internet Explorer 6 Hangs with Multiple Connections

Filed under: ColdFusion, JavaScript, Programming, Uncategorized | 6 Comments

At work we are using the Demis map server, which by itself is an incredible application. We had built a Flash-based client as our application, to let people see images overlaid on top of the vector data digested by the map server. One of the issues we had observed was that the application tended to hang, or stop responding, when a user asked for many images to be shown on top of the vector map and then navigated away from the current screen. Since I had seen the code, and it was a mess of JavaScript setting cookies that ColdFusion was supposed to read and pass to Flash, and images used for checkboxes, I automatically suspected the code. However, the problem was deeper than that.

The code needs to be rewritten, no doubt; there are many more efficiencies to be had. But that didn't explain the hang. I combed over the server, watching its response while a user was using the application. The map server stresses the machine, because it needs a ton of I/O, and it would spike the CPU frequently, but no process went to 99% CPU utilization, and the server seemed to respond to other clients even while one of them was hung. It was pretty clear then that the problem wasn't with the server. To take this logic a little further, we built a load test using wget, saving the result to a file. We looped over the calls as fast as we could, and we never caused the map server to hang. It performed as expected.

The next logical step was to look at the possibility of corrupt files. We did notice that we could get the map server to crash when we fed it corrupt files, but we found no evidence that the files we were using in production were corrupt in any way. At this point we were plenty dejected, because we had spent something like 35 hours over a couple of days working on this problem and had nothing. We performed a fresh ColdFusion install on a different server, we built a server with better hardware, we reinstalled the map server application multiple times; nothing seemed to affect it. We even improved the network bandwidth available to the client; still nothing. At that point I was down to two suspects: the code, or the client.

To test this theory I commented out all of the Flash calls on every page and went through the application trying to cause the system to hang. I couldn't do it, so I had effectively narrowed the possible cause to the Flash movie. I started to go through what the Flash movie was doing and what could cause it to fail. The Demis people told us that they had seen hangs when the map server wasn't responding and the Flash player was parsing XML. This led me to try the application in Firefox, and lo and behold, it never hung. It worked like a charm. The only problem was that our client was set on Microsoft Internet Explorer.

I set about the arduous task of removing all XML parsing from the Flash code; then I tried it, and it still hung. I was truly disappointed, but I rethought what was happening with the XML. It was making server calls, and I realized that I could have up to 8 concurrent connections going. At the time I thought nothing of it, but then I started trying to find out what was different between Internet Explorer and Firefox. I happened upon an article on MSDN about a known bug: Internet Explorer will hang for 5 minutes when there are 2 persistent connections to a server and rich content is downloaded. I had found my culprit. It turns out that I had to add 2 keys to the registry: MaxConnectionsPerServer and MaxConnectionsPer1_0Server. I set the latter to 8 and the former to 24, in hexadecimal. The keys need to be DWORD values.

That allows 8 connections for HTTP 1.0 and 32 or so connections for HTTP 1.1. The HTTP 1.1 guidelines recommend that only 2 connections be allowed, but if Firefox wasn't adhering to that, why should I? I added the keys under HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings, and it worked like a charm. Everything was perfect. Talk about looking for a needle in a haystack; I'm still amazed that I found it.
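For convenience, those two values can be captured in a .reg file. This is a sketch based on the key names, path, and hexadecimal values described above (0x24 is 36 decimal, the "32 or so" HTTP 1.1 connections); as always with registry edits, back up first:

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
"MaxConnectionsPerServer"=dword:00000024
"MaxConnectionsPer1_0Server"=dword:00000008
```

Double-clicking the file merges the DWORD values, which is equivalent to adding them by hand in regedit.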

The purpose of this entry is so that no one else has to go through the week I just went through. Generally no software should be in front of the client before it is ready, but in this case we already had a client. Hopefully this will help anyone out there who is experiencing hangs in Internet Explorer. Darn Microsoft for not fixing bugs for almost 3 years!

*EDIT Make that 8 years, since IE 8 appears to still suffer from the same problem!*

Here are some helpful links that might explain this better than I can…

Wininet Connection Issue

IE Hang Issue


Big Iron (Mainframes) and the World of Tomorrow

Filed under: JavaScript, Programming, Uncategorized | No Comments

There was an article in CNET yesterday espousing the need for developers to pick up mainframe development, and for schools to reinstate their mainframe classes. While I don't think anyone should waste their time learning a mostly dead technology, it makes sense to learn from the applications developed on mainframes and take the lessons with a grain of salt.

Right now I am working on converting a legacy mainframe application, implemented in the 1970s, into a web application. The real issues stem from the current business process around that mainframe. The database, probably some RDBMS variant, is normalized in such a way that it makes more sense to keep that structure than to try to reinvent the wheel. What has been surprising is that it also makes sense to maintain most of the data presentation layer.

The people who use the current system get a ton of data from a very small amount of screen real estate. The mainframe screens were text based and limited in the number of characters that could be stored in a field, and therefore displayed. Much of the business process that resulted from these limitations has evolved around using codes, with cheat sheets to figure out what the codes mean. This also has the effect of shielding somewhat sensitive information from outsiders and customers. For experienced users, the codes act as a shorthand that transfers a large amount of knowledge in a very short time, similar to the way we use compression to zip a text file into a much smaller one to be expanded later. When users input a code, they are compressing their idea into a few characters that the user on the other end can decompress back into meaning.

I have been more fortunate than most, because I have access to one of the original architects of the system, and I believe that understanding the business environment and the system architecture is more important than knowing the actual code. Most people looking to hire individuals who understand the mainframe are really looking for people to disassemble their applications and rebuild them as web applications.

I do intend to maintain the look of the existing mainframe screens, but I intend to replace the current cheat sheets with simple JavaScript hover events that display descriptions of what the codes mean. I like this approach of blending the old with the new, since it creates a sustainable bridge between the legacy users and incoming users who may not have had the same experience.
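The hover idea can be sketched roughly like this. The codes and descriptions here are invented for illustration, not taken from the actual system:

```javascript
// A lookup table standing in for the cheat sheet: code -> plain description.
var codeDescriptions = {
  "A1": "Account active",
  "S9": "Suspended pending review"
};

function describeCode(code) {
  return codeDescriptions[code] || "Unknown code";
}

// In the page, each code cell would wire the lookup to hover events, e.g.:
// cell.onmouseover = function () { tip.textContent = describeCode(cell.textContent); };
// cell.onmouseout  = function () { tip.textContent = ""; };
```

This keeps the terse code-based screens intact for experienced users while giving newcomers the expansion on demand.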

The article in CNET further implies that mainframes still sport some advantages over server-based applications. That may be true to a degree for deployed desktop applications, but mainframes have no advantage when it comes to web applications. Still, people who know COBOL, FORTRAN, and other older languages can command a premium for their knowledge in the few shops that feel maintaining these mainframe applications and hardware is somehow better than replacing them. But it is only a matter of time until those shops agree that paying an ever-increasing amount for maintenance and upgrades is more expensive than bringing someone on board to convert the application to the web. So I see no future in the mainframe; however, some great applications were developed for them, and the applications still running on them are probably more robust than average.

Much of the methodology I tend to follow when constructing a database or organizing code was implemented for the first time on big iron, so I actually feel privileged to be able to work with it. It's almost like looking into a time machine where you can see and feel the environment of the past, which, even though it may seem the same, is vastly different from the business climate today.

Learn COBOL today!

What is a mainframe anyway?


ColdFusion Frameworks

Filed under: ColdFusion, Programming, Uncategorized | 1 Comment

I have recently begun exploring the landscape of potential, mostly object-oriented, controller layers for ColdFusion. The three frameworks I have been working with are a substratum framework that adds OO style to Fusebox 3, Fusebox 4, and Mach-II. In bouncing between these frameworks I have noticed significant differences among all of them. In fact, I have noticed that for some tasks, one framework is better than another.

Taking Fusebox 3 first: it is possible to add your own CFC invocation layer to Fusebox. This works well for small to medium-sized applications, or applications that need performance. The reason Fusebox 3 works so well, even though it is so old, is that even with its many cfincludes it frequently performs better than Fusebox 4 and Mach-II, due to their use of XML. On the other hand, using XML control files, instead of the fbx_switch.cfm file in Fusebox 3, enables developers to port their applications to different languages like PHP with much less difficulty: to port, you just need to recode your objects; you don't have to rethink how the application works. This is only a benefit, however, if you intend to move the application.

The downside to using Fusebox 3 is that it requires much greater discipline from the developer. If you are committed to OO-like development and CFCs, you will have to enforce that yourself; the framework is not going to make you avoid procedural code. In some cases procedural is the way to go, but it is up to the developer to know this.

Fusebox 4 has much-improved support for CFCs and an OO-like programming style. It uses listeners, can implement loop logic in the control files, and also allows the developer freedom, though to a lesser degree than Fusebox 3, to decide how to build the application. The XML control files are read only once if you set the framework to production, then cached, which enhances performance. Fusebox 4 performs almost as well as Fusebox 3 in my experience. If you use CFCs, inheritance, binding, and design patterns, it performs slightly better than Mach-II but significantly slower than the same application coded differently in Fusebox 3, since Fusebox 3 doesn't support CFCs in the same way. It is possible to use design patterns in Fusebox 3, and it still seems just a hair faster. Granted, none of this analysis is scientific; it is just my observation over time.

Mach-II I have talked about in earlier posts, but it is still, I think, the best general-purpose framework. It can perform somewhat slowly due to its use of the application variable scope for almost all variables, and its forced "implicit invocation." The variable issue can be mitigated, however, by using var frequently in your objects to keep variables local. The error catching employed by this framework could use some work, and there should be a built-in way to cache invoked components to enhance performance, but these features will hopefully find their way into Mach-II 2.0 or a later version. In the hands of a skilled developer familiar with the framework, Mach-II can be very quick and scalable; in the hands of a novice, well… OO novices and people really new to ColdFusion probably shouldn't touch Mach-II; it can be really frustrating. Truthfully, beginners, and people who have been using ColdFusion for a while and want to get better at organizing their code, should really look at Fusebox 3; in no time they will be ready to graduate to Mach-II. That is my opinion. Of course, there are those people who can pick up a ColdFusion book and a Mach-II book on a weekend and be ready to code professionally on Monday, but most people aren't like that.

I've been thinking about trying to create my own ColdFusion-specific controller layer, but it is difficult to keep it general enough for any application. Perhaps it would be better to group applications and develop several frameworks, each of which works well for a certain group. Well, I'll keep at it.

The Home of Fusebox
The Home of Mach-II
Tips and Help for Mach-II


Macromedia / Adobe Flash and AJAX: Companions or Adversaries

Filed under: JavaScript, Programming, Uncategorized | No Comments

One of the hottest new things in web development right now is pretty old. JavaScript is taking the world by storm through XMLHttpRequest. My question is: isn't this exactly what Flash MX was designed to do?

I have been working with Flash for only about three and a half years, and one of the first things that drew me to it was the ability to get and post to other pages without a page refresh. Flash was designed to do this from the beginning. With the ColdFusion Flash gateway, developers can even directly access CFCs and other template pages. The question, then, is: do we really need AJAX?
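For comparison, the XMLHttpRequest technique the AJAX camp is excited about looks roughly like this. A hedged, era-appropriate sketch: the ActiveX fallback covers IE 6, and the function names and URL are placeholders of my own, not from any particular library.

```javascript
// Create a request object, falling back to IE 6's ActiveX control.
function createRequest() {
  if (typeof XMLHttpRequest !== "undefined") return new XMLHttpRequest();
  if (typeof ActiveXObject !== "undefined") return new ActiveXObject("Microsoft.XMLHTTP");
  return null; // environment without XHR support
}

// Fetch a URL and hand the response text to a callback: no page refresh.
function fetchFragment(url, callback) {
  var req = createRequest();
  if (!req) return;
  req.onreadystatechange = function () {
    if (req.readyState === 4 && req.status === 200) callback(req.responseText);
  };
  req.open("GET", url, true);
  req.send(null);
}
```

A dozen lines of script buys you the same no-refresh round trip Flash has offered since MX, which is exactly why the two keep getting compared.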

I think so. One of the benefits of using AJAX is that it is possible to create standards-compliant web pages that depend more on the resources of the client and less on the server. Back in the nineties it was much better to rely on servers, because they often had more computing power, but now desktops are very powerful and most can handle the rigors of sorting and validating data. These are probably some of the more banal uses of AJAX, but they are things that should be handled by the client and not the server.

There will be some overlap between AJAX and Flash. Many in the AJAX camp will claim that AJAX is much lighter than Flash as far as bandwidth is concerned, and I can see that poorly designed Flash will take more bandwidth than well-designed Flash. It is possible to draw components with ActionScript; this puts the drawing entirely on the client, with the Flash movie being mostly just compressed script. If AJAX needs graphics, it has to send them via CSS during the initial download, after which the images remain available as long as they are in the browser's cache. It is even possible, as it is in Flash, to have the initial page appear while components are still downloading.

I think that for some projects AJAX will be the technology of choice, but for others Flash MX will be optimal. Personally, I believe that for most of the jobs you could do with AJAX, Flash will be the faster solution because of its well-designed IDE. Flash is now a platform, and the Flash development environment is the tool. Macromedia is going to embrace Eclipse to try to get Java developers to see the benefits of creating web applications with Flash. I think that in the long run Flash is a good bet, and that AJAX is something of a fad that will become a less and less good choice as bandwidth becomes more available. I like a lot of what is happening with AJAX, and hopefully the developers of Flash will keep working toward accessibility. But in the end, well-designed Flash applications are hard to beat. They don't need screen refreshes, and the Macromedia components are well designed and will often take XML as their data source. Flash applications allow more interface flexibility than traditional CSS, although this is changing, and overall they lead to a better user experience.

So why do I bash Flash constantly? My negativity where Flash is concerned comes from having endured many, many very poor Flash websites and applications that use Flash just because it moves. The developers often spend little or no time working with the ActionScript, and they don't plan for low-bandwidth users. Many Flash developers believe that dial-up and ISDN / mobile users don't matter, and that is simply bankrupt thinking. Developers should plan and develop for the least common denominator. A light design can still be a good design, and is often, in my opinion, the best design. AJAX pushes developers toward better practices through its complexity, but I don't believe that complexity is ever a good solution to a problem. Perhaps with the introduction of AJAX tools and an IDE this could be improved upon. We are already seeing the beginnings of AJAX in web applications, and they are quite impressive, but much of the impressiveness comes from the fact that they are doing it without Flash, not from the applications themselves.

The fact is that over 90% of web users have the Flash plugin, and it is a relatively small and fast download. If you want to design really solid applications, take everything you have learned about minimal design and apply it to Flash development. Perhaps then Flash can turn its negative image around and become a real tool for business solutions.

About AJAX
Flash Remoting LiveDocs


Why Separate Business Logic From Display Code – Is That a Trick Question?

Filed under: ColdFusion, Programming, Uncategorized | No Comments

I was perusing the web the other day when I came across a site questioning the need for OO (object-oriented) code in a language like ColdFusion. The author suggested that PP (procedural programming) is often faster, as it involves much less overhead, and asked whether it is truly necessary to strictly separate business logic from display code. I could see points in this person's argument up to that point. Not separate logic from display? Was he mad? Still, to a developer who has not worked on complex applications and has stuck strictly to commercial sites, I can see where the computing overhead and design complexity required to create usable software would seem absurd. I can even see where EPAI (Every Page Is an Island) can be of benefit in commercial sites with only several dynamic pages.

Having maintained large applications developed with a framework and elements of OO, as well as large applications built with no framework and EPAI, I can definitely say that the applications developed with the framework and elements of OO are much easier to take care of. The primary reason is a higher level of encapsulation per object: each individual object does only one task, and it can perform that task independently of any other object. This makes it very easy to troubleshoot each piece in isolation, and as you work through the pieces you are almost assured of finding the issue. With EPAI, troubleshooting becomes difficult because each page has display logic mixed in and can be performing several tasks, especially if it submits to itself in forms. Even with appropriate variable scoping, it is still hard to determine what is setting what, and where.
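As a minimal sketch of the separation being argued for, here in JavaScript rather than ColdFusion: the order shape, tax rate, and function names are illustrative assumptions, not from the original post. The calculation knows nothing about markup, and the markup knows nothing about the calculation, so each piece can be tested and troubleshot on its own.

```javascript
// Business logic: a pure calculation with no display concerns.
// Each item is assumed to look like { price: 10, qty: 2 }.
function orderTotal(items, taxRate) {
  var subtotal = 0;
  for (var i = 0; i < items.length; i++) {
    subtotal += items[i].price * items[i].qty;
  }
  return subtotal * (1 + taxRate);
}

// Display logic: turns a number into markup and nothing more.
// Swapping the presentation never touches the calculation,
// and changing the tax rules never touches the markup.
function renderTotal(total) {
  return "<p class=\"total\">Total: $" + total.toFixed(2) + "</p>";
}
```

In an EPAI page these two concerns would be interleaved with the form-handling code, which is exactly what makes such pages hard to debug.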

I would suggest that the person who claimed there was no benefit to separating business logic from presentation logic read Design Patterns by the Gang of Four: Gamma, Helm, Johnson, and Vlissides. After reading a brief excerpt from the preface, I knew the book could help me solve some of my design problems. The issues in it are real-world issues, and as such the solutions make sense. After reading this book, the extra system overhead and complexity should seem worth it in many cases. This does not mean it applies in all cases; invariably there will be exceptions, for example where performance is the highest priority for a given operation. In that case you may wish to bypass the framework you have developed or are using, if the overhead it incurs is significant. That is just one example of many where design patterns, and maybe even OO, may not be the best solution to a problem. Remember, that is what programmers do: solve problems. Design patterns just give us more tools to do so.


Why Flash is Still Oh, So Wrong for So Many

Posted: December 31st, 1969 | Author: | Filed under: JavaScript, Programming, Uncategorized | Tags: , , | No Comments »

I have recently come under some minor pressure from various factions about why, while knowing Flash fairly well, I am always reluctant to design and build a site featuring the technology. My history with Flash is pretty much the same as most other developers'. The first versions of this very site three or four years ago were made entirely in Flash, as were many of my customers' sites. Flash seemed like the way to go. It rendered the same in every browser, fonts weren't an issue, and it allowed an incredible amount of freedom to create.

So why then were my sites so problematic? The first issue was bandwidth. I had music and lots of motion on these sites. They were extremely interactive and eye-catching. The problems came up when users reached my site over dial-up. When they hit the site and saw the loading bar, the first thing they did was click back and go on to another site. My WebTrends reports illuminated this for me. My next step was to go more minimal, which is my favorite thing to do, but then I wondered why I was using Flash at all: the motion was mostly gone, and so was the majority of the interactivity. I was using Flash simply for the z-index, and I was finding that I could do that with CSS. So, not to be deterred, I did another redesign that kept the motion and interactivity but minimized the huge bitmap graphics that were giving me the long download times. Instead, I used vector graphics. These were much smaller, but now I had a new problem. Unless my clients had at least a Pentium 4 running at greater than 2 GHz, my site ran slowly, so slowly that it was almost unusable.

The next issue was that in all the time I had my site, I could never find it using search engines. I discovered that search engines couldn't index my site because they couldn't see through the Flash. To the spider, my site looked like a huge GIF in an HTML file with some meta tags. In other words, it looked like nothing. I tried alt attributes, noscript tags, etc., but nothing helped. Finally, I decided to design an alternate site for dial-up users using good ol' XHTML and CSS. I found that as soon as I uploaded the file, the search engines had me, and no one ever visited my Flash site again.

Suffice it to say that I took my Flash site down. Later, I would redesign my site again so that it would adhere to web standards and render even faster for all users. That site is this one, and it is the first that I am happy with. I am enjoying some minor success in getting listed on search engines and blog aggregators, and life is good.

I don't hate Flash any more than I hate Allen wrenches or crowbars. It is a tool, and typically you try to use the right tool for the job. It seems to me that many web developers, however, are trying to use a sledgehammer to staple two pages together. It just doesn't work. In some cases Flash is OK. In corporate settings, Flash is an excellent tool for presentations, product demonstrations, and promotional materials delivered through the company intranet or from the presenter's local hard drive, as long as it doesn't have to be delivered over the web.

There are a few cases where it is perfectly reasonable for designer / developers to build Flash-only web sites. Art sites, such as photography showcases, can benefit from Flash and its fantastic bitmap compression; Flash photography sites can often download faster than their HTML / CSS counterparts due to smaller image sizes. Some product demonstrations can benefit from Flash's interactivity, and many cellular phone providers have used it to great effect in this regard. Simple branding banners with limited motion and interactivity, contained within standard HTML / CSS pages, can be excellent, as long as the text of the page is available for the user to read while the Flash is loading.

Still, designers and developers need to ask themselves: what exactly am I trying to do, and who is my target customer? I have had a very hard time making a solid business case for Flash on most of my ecommerce and business sites. Flash, like ColdFusion and chess, takes only a minute to learn and a lifetime to master. There is a lot to Flash, and a good designer knows how and when to use it to make a site look more professional, or to enhance content that might otherwise appear bland. Beginners, however, tend to develop only in Flash because it addresses many of the apparent problems with XHTML / CSS: browser incompatibility, having to learn JavaScript, and so on. Someone with limited knowledge of ActionScript and no knowledge of HTML can open Flash MX 2004 and create a website. Many designers use Flash exclusively for this reason.

It seems that XHTML / CSS / JavaScript is having a renaissance. With the proliferation of blog sites and better browser support for web standards, many Flash sites are starting to look tired. Compared with the relatively quick response of the HTML sites, many users are deciding to click away from the loading screens in favor of a site with similar content or products that is designed in standards-compliant XHTML. Not because they love web standards, but because to the user the XHTML site works better and they don't have to wait. I have actually heard designers say that they don't care if dial-up users can't access the site; it has to be beautiful. This thinking is bankrupt: probably 80% of the country is still using dial-up. Broadband is still frequently ridiculously expensive, and until this changes Flash will be limited mostly to design and car sites, while the bulk of the web is built using XHTML.

I'd actually like to see that change. I'd like to see 3 Mbps symmetric connections standard in every home across the country, and Flash sites loading instantly, but the reality is that it won't happen within the next 5 to 10 years. At least not until the garbage cable companies decide to charge reasonable rates, build better fiber backbones, and provide adequate DNS resources.

In the meantime, I'm quite happy with CSS / XHTML. It does everything I used to do with Flash, but faster and more accessibly. Hopefully more designers will build standards-compliant sites and realize they can be every bit as beautiful as Flash sites. Check out csszengarden.com to see other great CSS designs.