Today is a good day to code

Windows Vista SuperFetch


Most PC users are plagued by chronic slowdowns that get worse as their PC ages. Some of this is due to spyware, but a great deal of it is caused by software loading into the system tray. The tray is an area in which a program can run without presenting an interface or appearing in the taskbar. This was a great idea when it was introduced with Windows 95, but in an era when many computer users have more than 1 GB of RAM it seems unnecessary, or at least due for a redesign in Vista.

Microsoft is attempting to remedy the slowness, and to acknowledge the larger amounts of RAM in currently shipping systems, by building a technology called SuperFetch: a fancy caching system designed to notice which software you use most frequently and load it into memory at start-up. This can improve perceived performance for some users, but it would probably end up being a technology that gets in the way of most power users. It would, in a way, take users back to the heady days of Windows 3.11, when we could choose how much system memory was dedicated to applications and how much was reserved for hard drive cache. This system would behave in much the same way, but instead of the user being in control, the system would adjust automatically.

In attempting to understand why the system tray is strange, and should probably be either redesigned or removed, it is a good idea to look at how Apple and Microsoft handle GUI-less applications. In Windows, developers can create full applications that run in the background without disrupting the user or appearing on the display. These programs are known as services. Probably the most widely used service is IIS, Microsoft's web server product. It is similar to Apache, except that it has script processing built into the server, whereas Apache requires modules for that functionality. As a result Apache is much more flexible, which could change if Microsoft truly builds PHP support into IIS. But I digress. Some applications, such as Apache for Windows, run as a service and also include a system tray icon for management. In Mac OS X, applications without a GUI typically either run without any notification to the user or place a small icon in the upper right of the screen. Such programs are fairly rare, since most applications on the Macintosh load themselves into memory at first use and remain there. Since it is rarely necessary to turn the Macintosh off or reboot it, your programs always launch insanely quickly. It would be best for Microsoft to implement a similar system. It would minimize the overhead of keeping frequently used programs in memory, and would be vastly less complex.

Another potential solution could be to automatically dump programs from the system tray if they aren't being actively manipulated by the user or performing some operation on the system. This would do two beneficial things. First, it would free memory, handles, and other resources for software that is actually doing something. Second, it would discourage developers from using the system tray as a place to park their icons. To supplement this model, Microsoft should encourage developers to write services for Windows instead of tray software. The first time the system closed an idle tray application, it would notify the user and ask whether they wanted to be notified in the future; that way the user would remain in control. On systems with less than 512 MB of RAM this simple approach could pay back enormous benefits, reducing the idle footprint of the OS from around 400 MB on some systems back to a manageable 256 MB. Microsoft is over-engineering this one; they should review the K.I.S.S. school of software development.


New Internet Explorer 7 to Allow More Customization


I love the ability I have to add more functionality to Firefox. Right now I have the web developer tools installed so that I can inspect a page's stylesheets, JavaScript, block-level elements, and so on. I have the IP tool installed so that I can see the IP address of the site I am currently visiting. I have the Gmail notifier and the PageRank tool incorporated into my browser, most of which modify the status bar at the bottom of the browser and are completely innocuous. Internet Explorer has always supported plug-ins, but they were limited in their ability to change the user's browsing experience, relegating them to toolbars and the like. That is about to change.

Similar to the new Google dashboard, Internet Explorer will allow small web applications to be installed in the browser. It will allow a user to modify the web pages they are viewing, or to create a new download manager using the .NET languages; the implications seem to be pretty huge. There is just one problem: security.

One of my biggest fears with a heavily extensible Internet Explorer is that people will be able to use it to compromise the security of the operating system. We have heard time and time again that in Longhorn, ahem, Vista, users will be able to run Internet Explorer 7 in a sandbox of sorts, or a least-privileged user account, preventing would-be hackers from compromising the system. That is great for Vista, but what about on Windows XP Service Pack 2? Don't get me wrong, I think Microsoft has done as much as can be expected of anyone in patching a completely insecure OS, and they did it in record time too. Still, there have been plenty of bulletins about further compromises and exploits in Windows XP SP2, some regarding Internet Explorer. If you give individuals the ability to distribute code that a user can install, it is possible, by definition, to compromise that user's system. I'm sure Microsoft would be quick to point out that it isn't their fault if someone installed software that let hackers have their way with all their files, but at the same time it is very easy to misrepresent a piece of software to a computer novice using Windows. Just look at how far Gator / Claria has gotten sneaking software onto systems. I think that while the ability to customize one's web browser is cool, Microsoft should consider passing on this potential nightmare. It is reminiscent of Microsoft's touting of ActiveX and how it was going to obliterate the line between desktop software and internet applications and change the way we all use our computers. Well, it changed the way we all use our computers: we all need anti-virus / spyware / malware filters that sniff out those ActiveX controls and disable them. Most of us in the know, if we have to use Windows, turn ActiveX off altogether.

I think Microsoft should not include this feature at all, even for toolbars, unless plug-ins are reviewed and signed by Microsoft. That is the only way to be sure users aren't getting malware: if a plug-in isn't signed by Microsoft, the OS should refuse to install it. It should be that simple. Of course that makes developing for IE more difficult, but Microsoft could release an open-source developer's version of IE in which the plug-in verification could be disabled to allow all plug-ins to be installed. Everyone in the software business knows that features move boxes, but Microsoft should keep its eyes on the prize of security. They really need to get their reputation back, and integrating more sketchy features is not the best way to do it.
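To make the signing requirement concrete, here is a minimal sketch of the general technique in Java. This is not IE's actual mechanism, and the file names and key are hypothetical; the point is simply that an installer can refuse any plug-in whose detached signature doesn't verify against the publisher's public key.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.KeyFactory;
import java.security.PublicKey;
import java.security.Signature;
import java.security.spec.X509EncodedKeySpec;

public class PluginGate {
    // Verify a detached RSA signature over the plug-in binary.
    static boolean verify(byte[] plugin, byte[] sig, PublicKey publisherKey) throws Exception {
        Signature s = Signature.getInstance("SHA256withRSA");
        s.initVerify(publisherKey);
        s.update(plugin);
        return s.verify(sig);
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical files: the plug-in, its signature, and the publisher's DER-encoded public key.
        byte[] plugin = Files.readAllBytes(Path.of("toolbar.dll"));
        byte[] sig    = Files.readAllBytes(Path.of("toolbar.dll.sig"));
        byte[] keyDer = Files.readAllBytes(Path.of("publisher.der"));
        PublicKey key = KeyFactory.getInstance("RSA")
                .generatePublic(new X509EncodedKeySpec(keyDer));
        System.out.println(verify(plugin, sig, key) ? "install" : "refuse");
    }
}
```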

IE Extensibility – From the IE blog


The Future of Scripting


Initially I wanted to stay away from scripting languages as a developer, because they weren't really programming languages at all. For some time I was reluctant even to call myself a programmer, until I built my first Java desktop application. In CNET's open source blog today, they ask the question: has scripting peaked?

Scripting hasn't peaked yet. The reason is clear: building a web site with C++ or Java is like driving an armored tank to your mailbox. It is that ridiculous. The funny thing is that even Microsoft realizes this, giving their ASP.NET developers two languages to choose from when developing web applications. There are many reasons for enterprises to choose C# over Visual Basic when building a web application, especially if they already have desktop and client-server applications built with the technology, since many of the methods used in a desktop application could be reused wholesale in the web application. The frameworks built into J2EE as well as C# allow for robust development, making it less likely that a developer will lose control of their code. Still, using these technologies and frameworks where a scripting language and a light framework would do adds unnecessary overhead to a project and can push deadlines out unreasonably.
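To make the armored-tank comparison concrete, here is roughly what "hello world" looks like as a Java servlet. This is a sketch only: it assumes the servlet API on the classpath plus a container such as Tomcat and a deployment descriptor to deploy into, whereas the PHP equivalent is a single echo statement in a single file.

```java
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// A class, a compile step, a WAR, and a container, all to print one line.
public class HelloServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.setContentType("text/html");
        resp.getWriter().println("<p>Hello from the armored tank.</p>");
    }
}
```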

Here's what I see. PHP is a fantastic scripting language that has no real back end, and is therefore suitable for light-to-moderate customer-facing websites and some intranet applications. Use of PHP in this regard will only continue to grow. I think some of the reported 25% decline in worldwide use is a reaction to PHP's early security vulnerabilities, and PHP is losing ground to ASP.NET and VB scripting as Microsoft's Server 2003 is more widely adopted. Personally I think LAMP is superior for many tasks, but ASP.NET is almost ubiquitous now, and hosting and maintenance are cheap. I'll continue to use PHP for light jobs, while realizing that this is just a preference, and that performance-wise ASP.NET is better. Speaking of Java: Sun needs to buy ColdFusion from Macromedia / Adobe. It should be THE Java application server. There is no cleaner and easier scripting language, and it has nearly unlimited flexibility and is design-pattern friendly. Why this move hasn't occurred yet is beyond me. It would make sense for Macromedia to sell it, but I suspect the issue is that Sun has many proud engineers who love to over-develop products, and the thought of supporting something as business-friendly as ColdFusion probably makes them sick. On Macromedia's side, the business case is probably that they see the big picture: there are big bucks in ColdFusion, especially now that enterprises see it as a way around JSP's notoriously long development cycles.

I see scripting as having a bright future, and I'll side with Zend's guys in saying that regardless of how the Evans study got its numbers, PHP use is increasing, not decreasing. I'm not sure it is true, but if the next version of IIS is going to have PHP support built in, I'll seriously consider going with a Microsoft server in the near future and running it alongside ColdFusion. I like PHP, but I just like ColdFusion better.

news.com – Scripting's demise


MySpace.com Switches From ColdFusion to Blue Dragon


When MySpace decided to stop using ColdFusion recently, many ColdFusion developers felt somewhat betrayed by the change. Many even suggested that MySpace's setup wasn't the best-engineered solution, hinting that by using Fusebox (no one has said which version) with ColdFusion 4.5 / 5, MySpace wasn't working with the best that Macromedia had to offer.

New Atlanta claims in their press release that, using the exact same code, MySpace was able to reduce CPU usage by 50% under heavy load. They also claim that this result could not be duplicated with ColdFusion MX. I believe both of these claims. Most ColdFusion developers are loath to admit it, but Microsoft has a pretty good thing going with C# right now. The ASP.NET framework is decent and performs extremely well in every test I have seen, so I am not surprised that they saw gains like this using ADO.NET and ASP.NET. The biggest problem with any scripting or programming language comes down to the drivers it uses for external resources, and web development is no different. When your database connection pool reaches its limit, it simply can't create any more connections, and requests get queued. There is no way around this except to get better database drivers. I have seen JDBC fail miserably time and time again in this area, especially with SQL Server 2000. One of the biggest causes of that 100% CPU utilization seems to be in the JDBC driver when the database doesn't respond in a timely fashion. I don't know what causes it, but it happens when I am working with a complicated dataset. It isn't even that memory usage is too high and the server is thrashing; the CPU just goes to max and starts refusing connections. The only solution is to kill and restart the JRun instance, after which it behaves well again until it crashes.
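As an aside, one way to defend against that kind of driver hang is to put explicit bounds on how long JDBC may block. This is only a fail-fast sketch, not a fix for the driver itself, and the URL, credentials, and table are hypothetical:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class BoundedQuery {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details for a SQL Server datasource.
        String url = "jdbc:sqlserver://dbhost:1433;databaseName=site";
        DriverManager.setLoginTimeout(5); // fail fast if the database is unreachable

        try (Connection conn = DriverManager.getConnection(url, "user", "secret");
             Statement stmt = conn.createStatement()) {
            // Bound how long the driver may block on a slow response, so a stalled
            // query is cancelled instead of pinning the CPU and holding its pooled
            // connection until someone restarts the server instance.
            stmt.setQueryTimeout(10);
            try (ResultSet rs = stmt.executeQuery("SELECT TOP 10 id FROM profiles")) {
                while (rs.next()) {
                    System.out.println(rs.getLong("id"));
                }
            }
        }
    }
}
```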

In all fairness to Java / J2EE, I have only experienced these issues while working with ColdFusion Enterprise on Windows-based systems; it doesn't seem to happen on Linux, at least in my experience. Since MySpace was so heavily invested in both ColdFusion and Windows hardware, I guess they had no choice but to use New Atlanta's Blue Dragon. Still, what most programmers have to realize is that at the end of the day, the tool that delivers the best results is the one that will be used. No matter how we feel about Microsoft, ASP.NET is fast and stable in its newest iteration. One of its advantages is that it automatically cleans up its memory leaks, something Macromedia desperately needs to build into JRun. ColdFusion is sometimes slow and is often buggy when dealing with some of its advanced features. Experienced CF developers know how to deal with most of this, so it doesn't come into play very often, and I'd bet that with higher quality Microsoft SQL Server JDBC drivers, the application of good design patterns, and reusable CFCs, MySpace could have gotten better results out of the CFMX server. But as the CEO of New Atlanta said in his blog, rewriting all the code to take advantage of components, invocation, and var-typed variables is beside the point. They made the decision that best served their business; whether their code was reusable is immaterial after the fact.

Still, I find ColdFusion's performance to be reasonable. I wouldn't call JSP / ColdFusion a speed demon; under light load, PHP blows it away with 10 users, but scale that up to 100,000 concurrent users and Java starts to shine. Since Microsoft shamelessly copied Java with C# and improved on it, it is no surprise that ASP.NET performs as well as it does, with native OS support in Windows Server. No one uses ColdFusion because it is the fastest from an execution standpoint; they use it because it is the fastest language to develop in.

Another interesting possibility would be for Microsoft to acquire New Atlanta in order to integrate ColdFusion support directly into IIS. This would give Macromedia / Adobe some competition and force them to fix some of the issues they have let languish in CF. It wouldn't be too surprising a move, since Microsoft is seriously evaluating building PHP support directly into IIS. I'll bet it wasn't even that tough for New Atlanta to port their J2EE version of ColdFusion over to ASP.NET, given its support for C# and Microsoft's Java-to-C# source conversion tools. It required some optimization, I'm sure, but I'll bet it is smokin'. Maybe I'll download it and try it.

New Atlanta
New Atlanta CEO's blog
House of Fusion MySpace Conversion Discussion


Reaching into the Cradle for Programmers


One of the biggest issues facing America, other than Katrina of course, is that we will lose our competitive edge in programming, especially in the face of offshoring and the like. One way to combat this eventuality, as I have commented earlier in my blog, is to add value to our programmers. A good place to start is to look at what makes a good programmer, or rather, not a good programmer but an effective one.

Effective programmers may not be the best at all the technical tricks of programming, but they can take a set of good specs, some user feedback, or a problem, and churn out a piece of software that solves it. The ability to do this comes not from college or advanced courses in algorithms using C++; it is established at a much earlier age, probably somewhere between ages 3 and 5. Most of the people I consider good programmers started writing code at around age 10 and have a strong artistic bent. Usually around the same time they start writing code, they begin to take things apart to see how they work, to quote a friend. These traits should be sought out in elementary school and encouraged through high school. Many would say this takes the self-determination out of a child's path to adulthood; I would argue that it helps them find their path. Many of these kids would want to be programmers if they knew what it was to be a programmer. Most of them are strong problem solvers, even if they aren't that good at math.

For the most part, the extreme emphasis schools put on math as a prerequisite for good programming is ludicrous. Obviously if you are developing new encoding codecs or security algorithms you have to be good at math, but that is only one kind of software development; many other aspects of software problem solving don't involve that kind of math. Besides, people can continue to learn math outside of school, even though most don't.

A good way to start your child down the path is to keep them assembling blocks of logic to solve problems. There are many games, especially video games, that reinforce this type of thinking. To maintain our lead, we have to nurture our kids' creativity and their interest in software development. There is still plenty of work to be done; just look at most web sites, web applications, and operating systems. They are hardly ideal. This is why Google and Microsoft are coming under so much criticism over their recruiting. They focus mostly on education, but education hardly makes one a good programmer, and so they often turn their noses up at creative individuals who could make an immediate impact. Developers take many strange and divergent paths to get where they are; it takes a good recruiter to understand those paths and hire good programmers when they find them.


Another Possible Twist on Intel Mac


Again, I am engaging in idle speculation on the heels of the underwhelming Apple media event, and Steve Jobs pulling out of the keynote. Many people have begun to wonder why Apple is not sticking with the PowerPC architecture. It isn't clear whether IBM can make a 3 GHz G5 part, or whether they can get a chip's power requirements low enough to deliver a G5 PowerBook. In fact, it is pretty clear that a dual-core 2.5 GHz part would be at least as powerful as a similar part from Intel. I must preface this by saying that I don't truly believe this scenario myself, but I am, as I often do, pondering the possibilities.

Let's say that in a bizarre parallel universe, Apple, after releasing the multi-button, multi-function Mighty Mouse that is fully compatible with computers running Windows, decides that it can make more money building iPods and computer hardware than it can releasing software. They have realized that what makes their products compelling is their design, not, in actuality, their operating system.

The result of this revelation: Apple decides to produce all of its iLife applications, as well as GarageBand, for the PC. They will continue to sell Macs of all types and iPods, but they phase out OS X in favor of pre-installing Microsoft's Vista. Since most of their profit comes from hardware, to the shareholders this seems like a good move. It would also explain why the Mighty Mouse is designed to work so well with PCs, and why Apple has been so explicit about not preventing users from running Windows on their Macintoshes.

Now, why this would be a very bad move. On my very long drive back from Las Vegas, I was listening to the TWiT podcast. They brought up the fact that CP/M was a lot like Apple back in the day: they had the leading operating system for PCs, and they locked it into their hardware. Eventually IBM decided to get into the game with an operating system that would run on any Intel-based hardware, regardless of the vendor. Soon Microsoft wrote an operating system that was superior to the one IBM made but remained company-agnostic: it didn't care whose hardware it was running on, as long as the machine met the system requirements. After a while, all anyone said about CP/M was “CP/M who?” Apple's current strategy of lock-in is similar to CP/M's. That strategy obviously didn't work for CP/M, and it isn't working for Apple; without the iPod, Apple's computer division isn't doing all that well. If they licensed their OS, they could do at least as well as Microsoft, but they would have to drop their hardware line. What they could be thinking is that they could be like Dell and sell hardware with the OS preloaded, focusing on their hardware margin. But here's where this would destroy Apple: knock-offs. Whatever design they came up with, they would have something like a one-week lead before it was reverse-engineered and sold on the market for hundreds of dollars less. Right now, even though there are cases that look very similar to the G5, no one, not even Microsoft, has been able to reverse-engineer the operating system to any truly successful degree. Just a few thoughts…


IE7 Using CURI to Handle URI Objects


When some people think of the issues plaguing much of Microsoft's software, they often assume lazy coding. Sometimes that is the reason; other times it is the deadlines the team had to meet, or the fact that no one thought a potential bug could be a real issue. One of the issues web developers have had to work around since IE 5 came out is the 2 KB limit on URL strings. Another is that hackers could send a malformed URL string to IE to fool it into thinking their site was a trusted site, and then wreak havoc on your computer by sending IE awful ActiveX commands to trash your system.

IE 7 so far doesn't look like it has a bunch of sexy features, but under the hood Microsoft is really working hard on this release. From the partial standards compliance to running IE under a reduced-permissions sandbox, if you will, they are really working to get people to trust the internet again. As if that weren't enough, Microsoft is building tools into IE to detect whether a site is on a list of “bad” sites that Microsoft will keep. But one of the coolest enhancements to me is the CURI object. Basically, the URI becomes a struct, and the programmer can handle it as such. Since it is not a plain string, it is possible to validate the CGI variables apart from the rest of the URI; if someone were to try to slip a malformed URI down the pipe, the validation of that CGI string would fail, and so would the attack. In IE 5 and 6, the URL was handled as a string and passed around the code, and strings give the developer limited ability to validate one part independently of the others. There are many sub-string functions and libraries out there, many built into the development languages, but they cost the developer in performance. Was Microsoft lazy? Who can say, but it seems as though they are working hard to make IE 7 everything that 6 should have been.
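CURI itself is internal to IE, but the principle carries anywhere: parse the URI into a structured object first, then validate each component on its own instead of pattern-matching one big string. A rough Java sketch using java.net.URI, where the scheme whitelist and the key=value pattern are my own illustrative choices:

```java
import java.net.URI;
import java.net.URISyntaxException;

public class UriValidation {
    static boolean isSafe(String raw) {
        try {
            URI uri = new URI(raw); // malformed URIs fail here, before any dispatch
            String scheme = uri.getScheme();
            if (!"http".equals(scheme) && !"https".equals(scheme)) {
                return false;
            }
            String query = uri.getQuery();
            if (query != null) {
                for (String pair : query.split("&")) {
                    // Validate the CGI string apart from the rest of the URI:
                    // anything that isn't a plain key=value pair fails.
                    if (!pair.matches("[\\w.%-]+=[\\w.%-]*")) {
                        return false;
                    }
                }
            }
            return true;
        } catch (URISyntaxException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isSafe("http://example.com/page?id=42"));     // true
        System.out.println(isSafe("http://evil.example/?id=<script>"));  // false
    }
}
```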

Microsoft's IE Blog


The MSN Bot


A few days ago I noticed that the MSN bot had been hitting my RSS feed more than any sane bot should. The MSN search team at Microsoft has said that they are experimenting with the web droid. Some publishers are complaining because, in some cases, this is pushing their bandwidth usage over the amounts guaranteed in their hosting agreements.

While having to pay more for hosting can be a real pain, MSN's propensity to recognize RSS feeds and keep checking them for updates is a good feature, seeing as some bloggers post all the time and blog readers want up-to-date information. Services that use the rpc-ping system, like Technorati, typically do a very good job of crawling for the latest posts only when there is an update. Perhaps Microsoft could implement this sort of function in MSN search, although with only 5.5% of the market it might be difficult to get people to actually use it.
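For anyone who hasn't seen the rpc-ping system: when a blog publishes a post, it POSTs a tiny XML-RPC weblogUpdates.ping message to the aggregator, which then re-crawls that one feed rather than polling every feed on a schedule. A minimal Java sketch, with a placeholder endpoint and blog URL:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class WeblogPing {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; Technorati and others expose a URL like this.
        URL endpoint = new URL("http://rpc.example.com/ping");
        String body =
            "<?xml version=\"1.0\"?>" +
            "<methodCall><methodName>weblogUpdates.ping</methodName><params>" +
            "<param><value>Today is a good day to code</value></param>" +
            "<param><value>http://blog.example.com/</value></param>" +
            "</params></methodCall>";

        HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes("UTF-8"));
        }
        // 200 means the aggregator accepted the ping and will fetch the feed once.
        System.out.println("Ping response: " + conn.getResponseCode());
    }
}
```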

Jeremy Zawodny – Dear MSN Bot


Possible Apple and Google iTunes Deal


I am really ambivalent about the possibility of a deal between Google and Apple to help the search company figure out how to deploy a music solution similar to Yahoo's launch. Google hasn't been making software for Macintoshes; I am still waiting for Google to release Google Earth for the Mac. It shouldn't be that hard, since they already have a Direct3D implementation. I could understand if it were displayed in the browser, perhaps using DirectX or ActiveX controls, but this is a standalone program. Does Google really care about the Apple users out there? On the flip side, there is a really strong business case for the deal.

If Google were to feature songs from the iTunes music store, Apple could expand iPod penetration even further than the amazing levels it has already reached. Believe it or not, the numbers say that the once rabid iPod acquisition rate has begun to plateau, and the profitability of the devices has been diluted by the proliferation of the iPod Shuffle. Still, the problem is that most of the people I know who live in the middle and south-east of the country don't really understand the iPod, podcasting, Napster, or any of it; many of them still frequent CD stores. The iPod is mainstream in America's big cities, but it is still fringe on Main Street America. Google has managed to penetrate much of that market to a much higher level than the iPod, and Google is a trusted name, much the way Westinghouse was in the fifties and sixties. For Apple to tie itself to Google's image can only be a good thing.

Google, however, should take care. Any such deal is going to stoke Microsoft's already boiling ire, and Google isn't ready for all-out war with Microsoft at this time, no matter how rich it is. Google is still very dependent on Microsoft's technology, since Microsoft has the OS market. When the Google OS comes out, sporting a thin-client Linux system with a slick interface and applications delivered over the web, then Google will be ready. While they are probably working on something like this behind the scenes, they are wisely not parading it in front of Microsoft. Still, as paranoid as Microsoft is, Google should not tie itself too closely to its rival Apple. Although it would be better for customers, me in particular as a Mac user, it may not be wise to wake a sleeping giant by shouting in its ear.


Whirlwinds of Code and Forming Design Patterns


One of the biggest issues with understanding object-oriented programming is getting over its associated terminology. Most developers, whether they realize it or not, have formed design patterns and use them all the time. If an established developer were to look at their code, they would often see that their application breaks down into data access and storage components, display components, and the logic that allows those pieces to communicate. When asked to bring a system from one application to another, they can usually do it with little to no modification. This is the idea behind design patterns.

In an interview yesterday I realized that I had better take a more aggressive look at design patterns. Understanding the terminology may be tough, but it is an excellent way to communicate an application's business needs, especially using UML, as well as to get down to the lowest level of describing the objects that will comprise your frameworks. I have always tried to get a firm handle on design patterns, but they have largely eluded me. I have understood simple systems, like breaking code into well-defined model-view-controller layers and using messaging to communicate between layers, but I have never really grasped the more advanced concepts. In the interview I noticed that by using Fusebox I was designing with some object-oriented concepts, even if I didn't know what to call them, while ignoring some of the more specific ones.

Programming is often like the game Dark Cloud for the PlayStation 2. For those who haven't played it, Dark Cloud is a role-playing game in which the player travels around trying to re-assemble their world, which was scattered by an evil genie. The player is given a weapon that is pretty weak to start with, and by traveling around the world they can collect objects that make the weapon stronger. When they have enough objects, they can merge them permanently into the weapon, increasing its effectiveness, and then begin the process anew, adding objects to the newly enhanced weapon until they can merge again. With better weaponry, the player can recover pieces of the world more quickly and assemble more complex worlds in less time. It is like this with programming. Often I feel as though code is whirling around me, and once I have that “aha!” moment it merges and becomes something solid that enables me to take the next step. Building large applications has caused me to develop different frameworks and APIs for my own use. For example, most of my applications require search, so I have developed a pretty thorough search framework, made up of components, that can be moved between applications with little modification. I wouldn't have known to call it a framework, but that's what it is.

Today, or last night rather, I had one of those “aha!” moments, the moments we all write software for. I was finally able to put names to some of what I was doing. Now that I am beginning to understand, I can see why it is hard for experienced object-oriented developers to explain OO design to procedural developers: you just begin to think differently. I can see about ten areas in which I can improve my search API / framework to make it more portable. The hardest part is finding the dragon; slaying it is easy. In other words, associating design patterns with what you are doing is hard, but once you can put names to faces, so to speak, the rest is simple. For a while I could never understand why people were so excited about Microsoft enabling the use of C# in SQL Server 2005, but now I see: you can create an entire data access framework on the database server, abstracting the underlying database and its queries away from the application. In a web application it would be possible to completely separate the model from the controller and view layers. This has huge benefits for code maintenance, because any number of applications could use the data access framework through web services.
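Here is a sketch of that separation in Java rather than C#; the interface and the in-memory implementation are hypothetical stand-ins. Callers depend only on the contract, so the implementation behind it could be JDBC, a CLR procedure living on the database server, or a remote web service, and the view and controller layers never know the difference.

```java
import java.util.ArrayList;
import java.util.List;

// The contract the rest of the application sees: no SQL, no driver details.
interface ContactDao {
    List<String> findNamesByCity(String city);
}

// One hypothetical implementation; a JDBC- or web-service-backed version
// would satisfy the same contract without callers changing at all.
class InMemoryContactDao implements ContactDao {
    public List<String> findNamesByCity(String city) {
        List<String> names = new ArrayList<>();
        if ("Los Angeles".equals(city)) {
            names.add("Irv");
        }
        return names;
    }
}

public class DaoDemo {
    public static void main(String[] args) {
        ContactDao dao = new InMemoryContactDao(); // swap implementations freely
        System.out.println(dao.findNamesByCity("Los Angeles"));
    }
}
```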

What really pulled it together for me was understanding why Java was so tough. It was tough because my mostly procedural mind was trying to write a program by thinking about what each class should do, instead of asking what a class knows about itself, what its purpose is, and how its methods help it achieve that purpose. In short, I was trying to write a simple program instead of building a toolset to help me achieve my goal. With Java you have to diagram, you have to chart, or you will just get lost. Even Objective-C makes more sense, with its overriding of the init method for objects. These things didn't make sense before, but now I am getting it. I still have a long way to go, but I think I'll start working with Mach-II, even if there is a performance hit. It is a little more OO than Fusebox, but Fusebox is a great foundation for it.

All that being said, there are still instances where you can go too far with data encapsulation. Say you had a table of contact information. You wouldn't want to return each row, create a struct out of it, and then write an iterator method to walk through each struct and each of its elements, at least not in ColdFusion. Iterating over a query is something ColdFusion's built-in constructs already do fairly well, so building a framework to disassemble a query object and re-assemble it as a bunch of structs is probably an unnecessary layer of complexity for most applications. Like anything else, discretion is required. Now I'm ready to tackle the UML book, and hopefully to figure out how to use that nifty ID3 tag reader framework for Objective-C that I downloaded a while back and couldn't quite figure out how to use. I've got Macintosh applications that need to be developed.

Here are a few of the sites that helped me get to the “aha!” moment.
Macromedia excerpt from 'Design Patterns'
ColdFusion object factories, the Composition CFC
Introduction to the Mach-II framework