Today is a good day to code

SQL Server 2000 vs MySQL

One of the funny things about comparing these two systems is that they are both medium-sized DBMS products. Neither is really suited to huge enterprise-level solutions, although both companies will say that they are ready for them. I have not implemented any system on Oracle, Borland, or DB2, so I can't truly speak to their performance in an official capacity. Unofficially, however, I have noticed that they are significantly slower under light load, but hold up much better under heavy load, where Microsoft SQL Server 2000 and MySQL fall off quickly. This is evident simply by looking at hosting companies that provide either SQL Server 2000 or MySQL: in a local installation these systems are blazing, returning results for queries often in under 100 ms, but in production, on shared hosting, most of my queries take anywhere from 250 ms to, regularly, 2000 ms. Both of those put me outside the desired window. When using Oracle locally, performance hovers around 100 ms. When using an Oracle-based system in production, performance still seems to hover around 100 ms. I have heard, and witnessed to some degree, that Sybase feels the fastest, with each subsequent iteration of a query running faster. Sybase is probably the smartest DB as far as adapting to use, while Oracle seems to be the strongest.

Frequently, however, for most web applications, SQL Server 2000 or MySQL seem to be just fine. Between the two of them, after working with SQL Server for some time, I have come to really love the Enterprise Manager and its associated tools, the Query Analyzer and the Profiler; however, when working with full-text indexes, MySQL wins hands down. As far as performance is concerned, from what I have seen MySQL is faster under a moderate load. SQL Server seems better with heavier loads, and appears to have better caching; as far as that goes, it has MySQL flat out beat. MySQL could do a better job of using the cache for similar queries. For example, if I were to use SQL like this:

SELECT id, username, password, lastAccess
FROM testDB

It would return that dataset and hold it in memory, but it wouldn't be smart enough to realize that the following query is just accessing a subset of the one above:

SELECT id, lastAccess
FROM testDB
WHERE id < 100

From what I have seen, SQL Server does a better job of realizing that those two SQL snippets are working against the same recordset. MySQL would treat the second query as a completely new query and go to the disk to get the data. This may improve some with MySQL's enterprise offering, MaxDB, or in version 5, but from what I have seen in the betas, it doesn't seem much better.
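
For what it's worth, you can watch this behavior on a MySQL box yourself. The following is a minimal sketch, assuming the query cache is enabled and using the standard query-cache variables from MySQL 4.x / 5.x; the table and column names are just the placeholders from the example above. The cache only serves results for statements repeated byte for byte, which is exactly the limitation I am describing:

-- Confirm the query cache is turned on (query_cache_size should be greater than 0)
SHOW VARIABLES LIKE 'query_cache%';

-- Note the baseline counters
SHOW STATUS LIKE 'Qcache%';

-- Run the broad query, then the narrower one; MySQL caches them as two separate entries
SELECT id, username, password, lastAccess FROM testDB;
SELECT id, lastAccess FROM testDB WHERE id < 100;

-- Only repeating an identical statement will increment Qcache_hits
SELECT id, lastAccess FROM testDB WHERE id < 100;
SHOW STATUS LIKE 'Qcache%';

If Qcache_hits stays flat while Qcache_inserts climbs, the cache is doing nothing for you on queries like these.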

But with the ease of full-text searching in MySQL, I will almost always prefer it to SQL Server 2000 until Microsoft builds a full-text searching solution directly into the database.

Look at this:

SELECT id, field1, MATCH (field1, field2) AGAINST
('text to be searched') AS score
FROM testDB WHERE MATCH (field1, field2) AGAINST
('text to be searched')

That is so simple and easy to understand, and creating the index is just as easy. Creating a full-text index using SQL Server is fairly complicated, primarily, as I indicated above, because you have to deal with an external indexing service. The full-text searching rules built into MySQL, such as the 50% rule (which you can see in effect by visiting Owens Performance Blog Search), make querying take some effort on the user's part, but much of that can be handled by decent business logic.
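
For comparison, here is roughly what creating that index looks like on the MySQL side. This is a minimal sketch using the placeholder table and column names from the query above; it assumes a MyISAM table, which at the time was the only storage engine that supported full-text indexes:

-- The full-text index can be declared when the table is created...
CREATE TABLE testDB (
    id INT NOT NULL PRIMARY KEY,
    field1 TEXT,
    field2 TEXT,
    FULLTEXT (field1, field2)
) ENGINE=MyISAM;

-- ...or added to an existing table afterward
ALTER TABLE testDB ADD FULLTEXT ftx_fields (field1, field2);

The only real gotcha is that the columns named in MATCH () have to line up exactly with a FULLTEXT index on the table.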

Ultimately it comes down to performance for price. Compared that way, MySQL wins: it is free. Still, MS SQL Server 2000 is a good value, given its strong performance, robustness, and quality of tools. I will never hate using it on jobs. Still, given my preference, I would choose MySQL.


The Windowing Graphical User Interface

Naturally, if you ask most people about the biggest advance in the history of desktop computing, most of them would say it is the windowing graphical user interface. While the GUI is definitely a significant leap from the textual interfaces used in all the Unix variants, I am somewhat dismayed by the lack of progression in interface technology.

It seems that the pioneers of these technologies can only think in windows. It would be interesting if Apple and Microsoft, instead of focusing on improving their GUIs, were working in earnest to come up with another method of communicating with computers. Naturally, artificial intelligence and excellent speech recognition come to mind. With the incredible number-crunching performance of current desktop computers, there is no reason for there not to be decent speech-to-text conversion software. Dragon is really good, but as far as controlling the OS, browsing, and writing software, it is far from ideal. For example, the computer should be smart enough that when I am developing in Dreamweaver I could say "cffile tag." It should then insert the cffile tags with the cursor placed just inside the opening tag so I can add attributes. If the next command I spoke was not an attribute of that tag, the cursor should jump between the tags and begin entering the dictated text. These things should be built into the operating system as APIs. But perhaps I shouldn't be looking to Apple and Microsoft to do these things. Maybe they are too focused on directly competing and keeping up with each other to really innovate. Maybe the open source community can come up with something to knock them off.

I definitely don't see the GUI taking a dive anytime soon, but for devices like cell phones, there isn't a much better method of controlling them than talking to them. I should be able to have my phone read off my incoming SMS messages, and I should be able to dictate a response back to it. Intel has some pretty powerful mobile processors debuting, and they could easily handle this sort of rudimentary voice recognition software. I should be able to browse the web from my phone and have it read off the contents of the pages it found while I am driving or whatever. It only makes sense.

As far as search goes, this makes URLs and DNS somewhat unnecessary. Obviously you should be able to say “triple-w dot whatever dot com” and have your computer browse there, but who would actually do that? Wouldn't you just go into Google and say “blah dot com,” or more likely just “blah”? This has come up several times in blogs and on discussion boards. More and more, Google is becoming the primary way people find websites. This is good in that accessing information is more straightforward for most people. It is bad in that not all pages can appear on Google. Google seems to realize this, and that is why the blogsearch site has been launched, to give more people a chance to be found. Without DNS, and without having to know the web address of the site you want, a combination of voice and Google's search technology could make files on a computer available through the voice browser, as well as any assets on the world wide web. Wouldn't it be great to just say, as an afterthought while walking out to work, “I want you to find the best price on this movie, download it, and forward it to my television”? When you got home from work, you could just say to your television, “start a movie.” That is where we need to be going. Centralizing all computing around multiple desktop machines in a home is a dead end. Devices and home servers are the future, and the graphical user interface is not the interface for that future.


Windows Vista SuperFetch

Most PC users are plagued by chronic slowdowns that get worse the older their PC gets. Some of this is due to spyware, but a large part of it is caused by software loading into the system tray. The tray is an area in which a program can run without having to have an interface or appear in the taskbar. This was a great idea with the introduction of Windows 95, but in an era when many computer users are running more than 1 GB of RAM it seems unnecessary, or at least due for a redesign in Vista.

Microsoft is attempting to remedy the slowness, and to acknowledge the larger amounts of RAM in currently shipping systems, by building a technology called SuperFetch, which is a fancy caching scheme designed to notice which software you use most frequently and load it into memory at start-up. This can improve perceived performance for some users, but it would probably end up being a technology that gets in the way of most power users. In a way it takes users back to the heady days of Windows 3.11, when we could choose how much system memory was dedicated to applications and how much was reserved for hard drive cache. This system would behave in much the same way, but instead of the user being in control, the system would adjust automatically.

In attempting to understand why the system tray is strange, and probably should be either redesigned or removed, it is a good idea to look at how Apple and Microsoft handle GUI-less applications. In Windows, developers have the ability to create full applications that run in the background without disrupting the user or appearing on their display. These programs are known as services. Probably the most widely used service is IIS. IIS, if you don't know already, is Microsoft's web server product. It is similar to Apache, except that it has the ability to process scripts built into the server, whereas with Apache it is necessary to include modules for this functionality. As a result Apache is much more flexible, which could change if Microsoft truly builds PHP support into IIS. But I digress. Some applications, such as Apache for Windows, run as a service and also include a system tray icon for management. In Mac OS X, applications without a GUI typically either run without any notification to the user or place a small icon in the upper right of the screen. Such programs are fairly rare, since most applications on the Macintosh load themselves into memory at their first use and remain there. Since it is rarely necessary to turn the Macintosh off or to reboot, your programs always launch insanely quickly. It would be best for Microsoft to implement a similar type of system. This would minimize the overhead necessary for keeping frequently used programs in memory, and would be vastly less complex.

Another potential solution could be to automatically dump programs from the system tray if they aren't being actively manipulated by the user or performing some operation on the system. This would do two things beneficial to the system. First, it would free memory, handles, and other resources for use by software that is actually doing something. Second, it would discourage developers from using the system tray as a place to park their icons. To supplement this model, Microsoft should encourage developers to write services for Windows as opposed to tray software. The way it could work is that the first time this happened, the system would notify the user that it was closing the application and ask whether they wanted to be notified in the future. That way the user would remain in control. On systems with less than 512 MB of RAM, this simple approach would sometimes pay enormous benefits, reducing the idle footprint of the OS from around 400 MB on some systems back to a manageable 256 MB. Microsoft is over-engineering this one; they should review the K.I.S.S. software development methodology.


New Internet Explorer 7 to Allow More Customization

I love the ability I have to add more functionality to Firefox. Right now I have the web developer tools installed so that I can check out a page's stylesheets, JavaScript, block-level elements, etc. I have the IP tool installed so that I can see the IP address of the site I am currently visiting. I have the Gmail notifier and the PageRank tool all incorporated in my browser, most of which modify the status bar at the bottom of the browser and are completely innocuous. Internet Explorer has always supported plug-ins, but they were limited in their ability to change the user's browsing experience, relegating them to toolbars and the like. That is about to change.

Similar to the new Google dashboard, Internet Explorer will allow small web applications to be installed in the browser. It will allow a user to modify the webpages they are viewing, or to create a new download manager using the .NET languages; really, the implications seem to be pretty huge. There is just one problem: security.

One of my biggest fears with a heavily extensible Internet Explorer is that people will be able to use it to compromise the security of the operating system. We have heard time and time again that in Longhorn, ahem, Vista, users will be able to run Internet Explorer 7 in a sandbox of sorts, or a least-privileged user account, preventing would-be hackers from compromising the system. That is great for Vista, but what about on Windows XP Service Pack 2? Don't get me wrong, I think Microsoft has done as much as can be expected of anyone when patching a completely insecure OS, and they did it in record time too. Still, there have been plenty of bulletins regarding more compromises and exploits in Windows XP SP2, some regarding Internet Explorer. If you give individuals the ability to distribute code that a user can install, it is possible, by definition, to compromise that user's system. I'm sure Microsoft would be quick to point out that it isn't their fault if someone installed software that allowed hackers to have their way with all their files, but at the same time it is very easy to misrepresent a piece of software to a computer novice who is using Windows. Just look at how far Gator / Claria has gotten sneaking software onto systems. I think that while having the ability to customize one's web browser is cool, Microsoft should consider passing on this potential nightmare. It is reminiscent of Microsoft's touting of ActiveX and how it was going to obliterate the line between desktop software and internet applications and change the way we all use our computers. Well, it changed the way we all use our computers: we all need anti-virus / spyware / malware filters that sniff out those ActiveX controls and disable them. Most of us, those in the know, if we have to use Windows, turn ActiveX controls off altogether.

I think that Microsoft should really not include this feature, and I mean even for toolbars, unless they are reviewed and signed by Microsoft. That is the only way to be sure users aren't getting malware. If the plug-in isn't signed by Microsoft, then the OS should refuse to install it. It should be that simple. Of course it makes developing for IE that much more difficult, but Microsoft could release a developer's version of IE that was open source, with the plug-in verification disabled so that all plug-ins could be installed. Everyone in the software business knows that features move boxes, but Microsoft should keep their eyes on the prize of security. They really need to get their reputation back, and integrating more sketchy features is not the best way to do it.

IE Extensibility – From the IE blog


The Future of Scripting

Initially, I wanted to stay away from scripting languages as a developer because they weren't really programming languages at all. For some time I was reluctant even to call myself a programmer, until I built my first Java desktop application. In CNET's open source blog today, they ask the question: has scripting peaked?

Scripting hasn't peaked yet. The reason is clear: building a web site with C++ or Java is like driving an armored tank to your mailbox. It is that ridiculous. The funny thing is that even Microsoft realizes this, giving their ASP.NET developers two languages to choose from when developing web applications. There are many reasons for enterprises to choose C# over Visual Basic when building a web application, especially if they already have desktop and client-server applications built using the technology; it would be possible to reuse many of the methods from the desktop application in the web application. The frameworks built into J2EE as well as C# allow for robust development, making it less likely that a developer will lose control of their code. Still, using these technologies and frameworks where a scripting language and a light framework would do adds unnecessary overhead to a project and can push deadlines out unreasonably.

Here's what I see. PHP is a fantastic scripting language that has no real back end, and is therefore suitable for light to moderate customer-facing websites and some intranet applications. Use of PHP in this regard will only continue to grow. I think some of the 25% decline in worldwide use is a reaction to PHP's early security vulnerabilities. PHP is losing ground quickly to ASP.NET and VB scripting as Microsoft's Server 2003 is more widely adopted. Personally, I think that LAMP is superior for many tasks, but ASP.NET is almost ubiquitous now, and hosting and maintenance are cheap. I'll continue to use PHP for light jobs, but at the same time I realize that this is just a preference, and performance-wise ASP.NET is better. Talking about Java: Sun needs to buy ColdFusion from Macromedia / Adobe. It should be THE Java application server. There is no cleaner and easier scripting language, and it has nearly unlimited flexibility and is design-pattern friendly. Why this move hasn't occurred yet is beyond me. It would have made sense for Macromedia to sell it, but I think the issue is that Sun has many proud engineers who love to over-develop products; the thought of supporting something as business-friendly as ColdFusion probably makes them sick. The other side of it is that Macromedia probably sees the big picture: there are big bucks in ColdFusion, especially now that enterprises are seeing it as a way to get around JSP's notoriously long development cycles.

I see scripting as having a bright future, and I'll tend to side with Zend's guys in saying that, regardless of how the Evans study got its numbers, PHP use is increasing, not decreasing. I'm not sure if it is true, but if the next version of IIS is going to have PHP support built in, I'll be seriously considering going with a Microsoft server in the near future and running it alongside ColdFusion. I like PHP, but I just like ColdFusion better.

news.com – Scripting's demise


MySpace.com Switches From ColdFusion to BlueDragon

When MySpace decided to stop using ColdFusion recently, many ColdFusion developers felt somewhat betrayed by the change. Many even suggested that MySpace's wasn't the best-engineered solution; by that they were hinting that by using Fusebox (no one has said which version) and ColdFusion 4.5 / 5, MySpace wasn't working with the best that Macromedia had to offer.

New Atlanta claims in their press release that, using the exact same code, MySpace was able to reduce their CPU usage by 50% under heavy load. They also claim that this result could not be duplicated with ColdFusion MX. I believe both of these claims. Most ColdFusion developers are loath to admit it, but Microsoft has a pretty good thing going with C# right now. The ASP.NET framework is decent and performs extremely well in every test I have seen, so I am not surprised that they saw gains like this using ADO.NET and ASP.NET. The biggest problem with any scripting or programming language comes down to the drivers when using an external resource, and web development is no different. When your database connection pool reaches its limit, it just can't create any more connections, and requests get queued. There is no way around this except to get better database drivers. I have seen JDBC fail miserably time and time again in this area, especially with SQL Server 2000. One of the biggest causes of that 100% CPU utilization seems to be in the JDBC driver when the database doesn't respond in a timely fashion. I don't know what causes it, but it happens when I am working with a complicated dataset. It isn't even that the memory usage is too high and the server is thrashing; the CPU just goes to max and starts refusing connections. The only solution is to kill and restart the JRun instance. After that it behaves well again until it crashes.

In all fairness to Java / J2EE, I have only experienced these issues while working with ColdFusion Enterprise on Windows-based systems; it doesn't seem to happen on Linux, at least in my experience. Since MySpace was so heavily invested in both ColdFusion and Windows hardware, I guess they had no choice but to use New Atlanta's BlueDragon. Still, what most programmers have to realize is that at the end of the day, the tool that delivers the best results is the one that will be used. No matter what we feel about Microsoft, ASP.NET is fast and stable in its newest iteration. One of the advantages of ASP.NET is that it automatically cleans up its memory leaks; this is something that Macromedia desperately needs to build into JRun. ColdFusion is sometimes slow and is often buggy when dealing with some of its advanced features. Experienced CF developers know how to deal with most of this, so it doesn't come into play very often, and I'd bet that with higher-quality Microsoft SQL Server JDBC drivers, and the application of good design patterns and reusable CFCs, they could have gotten better results out of the CFMX server. But as the CEO of New Atlanta said in his blog, rewriting all the code to take advantage of components, invocation, and var-typed variables is beside the point. They made the decision that would best serve their business. Whether their code was reusable or not is immaterial after the fact.

Still, I find ColdFusion's performance to be reasonable. I wouldn't call JSP / ColdFusion a speed demon any day; under light load, PHP blows it away with 10 users on it, but once you scale up to 100,000 concurrent users, Java starts to shine. Since Microsoft shamelessly copied Java with C# and improved on it, it is no surprise that ASP.NET performs as well as it does, since it has native OS support in Windows Server. No one uses ColdFusion because it is the fastest from the execution standpoint; they use it because it is the fastest language to develop in.

Another interesting possibility would be if Microsoft were to acquire New Atlanta in order to integrate ColdFusion support directly into IIS. This would give Macromedia / Adobe some competition and force them to fix some of the issues they have let languish in CF. It wouldn't be too surprising a move, since they are seriously evaluating building PHP support directly into IIS. I'll bet that it wasn't even that tough for New Atlanta to port their J2EE version of ColdFusion over to ASP.NET, given its support for C# and Microsoft's Java-source-to-C#-source conversion tools. It required some optimization, I'm sure, but I'll bet it is smokin'. Maybe I'll download it and try it.

New Atlanta
New Atlanta CEO's blog
House of Fusion MySpace Conversion Discussion


Reaching into the Cradle for Programmers

One of the biggest issues facing America, other than Katrina of course, is that we will lose our competitive edge in programming, especially in the face of offshoring and the like. One of the ways to combat this eventuality, as I have commented earlier in my blog, is to add value to our programmers. One thing to look at is what makes up a good programmer, or rather, not a good programmer but an effective one.

Effective programmers may not be the best at all the technical tricks in programming, but they can take a set of good specs, some user feedback, or a problem and churn out a piece of software that solves that problem. The ability to do this does not come from college or advanced courses in algorithms using C++. This ability is established at a much earlier age, probably somewhere between ages 3 and 5. Most of the people I consider to be good programmers started writing code at around age 10 and have a strong artistic bent. Usually around the same time as they start writing code, they begin to take things apart to see how they work, to quote a friend. These traits should be sought out in elementary school and encouraged through high school. Many would say that this takes the self-determination out of a child's path to adulthood. I would argue that it helps them find their path; many of these kids would want to be programmers if they knew what it was to be a programmer. Most of them are strong problem solvers, even if they aren't that good at math.

For the most part, the extreme emphasis schools put on learning math in order to be a good programmer is ludicrous. Obviously if you are developing new encoding codecs or security algorithms you have to be good at math, but that is only one kind of software development. There are many other aspects of software problem solving that don't involve that kind of math. Also, people can continue to learn math outside of school, even though most don't.

A good way to start your child down the path is to keep them assembling blocks of logic to solve problems. There are many games, especially video games, that reinforce this type of thinking. To maintain our lead, we have to enhance our kids' creativity and their interest in software development. There is still plenty of work to be done; just look at most web sites, web applications, and operating systems. They are hardly ideal. This is why Google and Microsoft are coming under so much criticism over their recruiting. They focus mostly on education, but that hardly makes one a good programmer, and as a result they often turn their noses up at creative individuals who could make an immediate impact. Developers take many strange and divergent paths to get where they are; it takes a good recruiter to understand those paths and hire good programmers when they find them.


Another Possible Twist on Intel Mac

Again, I am engaging in idle speculation on the heels of the underwhelming Apple media event, as well as Steve Jobs pulling out of the keynote. Many people have begun to wonder why, indeed, Apple is not sticking with the PowerPC architecture. It isn't clear whether IBM can make a 3 GHz G5 part, or whether they can get a chip's power requirements low enough to deliver a G5 PowerBook. In fact, it is pretty clear that a dual-core 2.5 GHz part would be at least as powerful as a similar part from Intel. Now, I must preface this by saying that I don't truly believe this one myself, but I am, as I often do, pondering the possibilities.

Let's say that in a bizarre parallel universe, Apple, after releasing the multi-button, multi-function Mighty Mouse that is fully compatible with computers running Windows, decides that they can make more money building iPods and computer hardware than they can releasing software. They have realized that what makes their products compelling is their design, not actually their operating system.

The result of this revelation: Apple decides to produce all of their iLife applications, as well as GarageBand, for the PC. They will continue to sell Macs of all types and iPods, but they decide to phase out OS X in favor of pre-installing Microsoft's Vista. Since most of their profit comes from hardware, to the shareholders this seems like a good move. It would also explain why the Mighty Mouse is designed to work so well with PCs, and why Apple has been so explicit about not doing anything to prevent users from running Windows on their Macintoshes.

Now, why this would be a very bad move. On my very long drive back from Las Vegas, I was listening to the TWiT podcast. They brought up the fact that CP/M was a lot like Apple way back in the day: they had the leading operating system for PCs and they locked it into their hardware. Eventually IBM decided to get into the game with an operating system that would run on any Intel-based hardware, regardless of the vendor. Soon, Microsoft wrote an operating system that was superior to the one IBM made but was company agnostic; it didn't care whose hardware it was running on as long as it met the system requirements. After a while, all anyone said about CP/M was "CP/M who?" Apple's current strategy of lock-in is similar to CP/M's. This strategy obviously didn't work for CP/M, and it isn't working for Apple. Without the iPod, Apple's computer division isn't doing all that well. If they licensed their OS, they could do at least as well as Microsoft, but they would have to drop their hardware line. What they could be thinking is that they could be like Dell and sell hardware with the OS preloaded, then focus on their hardware margin. But here's where this would destroy Apple: the problem is knock-offs. Whatever they came up with, they would have something like a one-week lead on the design before it was reverse engineered and sold on the market here for hundreds of dollars less. Right now, even though there are cases that look very similar to the G5, no one, not even Microsoft, has been able to reverse engineer the operating system to any truly successful level. Just a few thoughts….


IE7 Using CURI to Handle URI Objects

When some people think of the issues plaguing much of Microsoft's software, they often assume it is the result of lazy coding. Sometimes that is the reason; other times it could be the deadlines the team had to meet, or it could be that no one actually thought the potential bug could be a real issue. One of the issues web developers have had to work around since IE 5 came out is the 2 KB limit on URL strings. Another issue was that hackers had the ability to send a malformed URL string to IE to fool it into thinking that their site was a trusted site. Then they could wreak havoc on your computer by sending IE awful ActiveX commands to trash your system.

IE 7 so far doesn't look like it has a bunch of sexy features, but under the hood Microsoft is really working hard on this release. From the partial standards compliance to running IE under a reduced-permissions sandbox, if you will, they are really working hard to get people to trust the internet again. If that wasn't enough, Microsoft is building tools into IE to detect whether a site is on a list of "bad" sites that Microsoft will keep. But one of the coolest enhancements to me is the CURI object. Basically it is a structured object rather than a raw string, and the programmer can handle it as such. Since it is not a string, it is possible to validate the CGI variables apart from the rest of the URI. If someone were to try to slip a malformed URI down the pipe, the validation of that CGI string would fail, as would the attack. In IE 5 and later, the CGI string was handled as a plain string and passed around the code. String variables give the developer limited ability to validate one part independently of the others; there are many substring functions and libraries out there, many built into the development languages, but they cost the developer in performance. Was Microsoft lazy? Who can say, but it seems as though they are working hard to make IE 7 everything that 6 should have been.

Microsoft's IE Blog


The MSN Bot

A few days ago, I noticed that the MSN bot had been hitting my RSS feed more than any sane bot should. The MSN search team at Microsoft has said that they are experimenting with the web droid. Some publishers are complaining about this because in some cases it is pushing their bandwidth over the amount guaranteed in their hosting agreements.

While having to pay more for hosting can be a real pain, MSN's propensity to recognize RSS feeds and keep checking them for updates is a good feature, seeing as some bloggers post all the time and blog readers want up-to-date information. Services that use the RPC ping system, like Technorati, typically do a very good job of crawling for the latest posts only when there is an update. Perhaps Microsoft could implement this sort of function in MSN Search, although with only 5.5% of the search market it might be difficult to get people to actually use it.

Jeremy Zawodny – Dear MSN Bot