Monday, September 29, 2008
In previous years, I'd have jumped onto the new Notes/Domino within six months of release.
This time however, things have changed.
First of all, there's the fact that the whole Notes 8 client is a rewrite using Eclipse. It turns the 18+ year old product into a version 1.0 again - at least for a little while.
My first attempts with the version 8 client showed that the system was so slow that I could make coffee faster than it could start - and I mean proper plunged coffee.
I quickly realised that you could gain some massive speed improvements by turning off the anti-virus program - great... but is that really a good idea?
After a while of having both 7.0.2 and 8.0 on my PC, I found myself using the older client pretty much exclusively.
8.0.2 and Hard Drive Concerns
Ok, so there was an 8.0.1 but I blinked and it passed on by. At 8.0.2, I decided to give things another go. First of all, I tried installing the client on my PC.
I had almost no space left on drive C: but that was ok - I had another drive I wanted to install on. The Notes install would prompt me for install locations but would then screw up halfway through because it used drive C: for temporary files - despite the fact that both my TEMP and TMP variables pointed to a different location, and despite the fact that I'd told it to extract files to a drive other than C:.
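For what it's worth, the usual workaround is to point TEMP and TMP somewhere else just for the installer's process. Here's a minimal Python sketch of that idea - the paths are invented examples, and note that some MSI-based installers ignore these variables and use the system drive regardless, which appears to be exactly what happened here:

```python
import os
import subprocess

def build_env(temp_dir, base=None):
    """Return a copy of the environment with TEMP and TMP redirected."""
    env = dict(os.environ if base is None else base)
    env["TEMP"] = temp_dir
    env["TMP"] = temp_dir
    return env

def run_installer(installer_path, temp_dir):
    """Launch an installer with the redirected environment.

    Caveat: some MSI-based installers ignore TEMP/TMP and write to
    the system drive anyway - the exact behaviour described above.
    """
    return subprocess.Popen([installer_path], env=build_env(temp_dir))

# Example (hypothetical paths):
# run_installer(r"D:\installs\setup.exe", r"D:\Temp")
```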
Ok, so I wasn't going to be able to install the client. I don't really blame IBM/Lotus for that - after all, it's Microsoft's install program that's really to blame.
Domino 8.0.2 and the Subforms Issue
Ok, so it was fairly obvious to me that I wasn't going to be able to use the Notes 8 client. Perhaps if I started with the Domino 8.0.2 server.
I did my first test but wasn't really expecting any problems. After all, I'd had no problems at all in the R6 to R7 upgrade.
Everything worked except my most business critical application. It had the same subform being used for both a header and footer on the web.
Sure, it wasn't such a big deal. After all, you only have to copy the affected subforms and rename them, then edit every form in the database which uses these subforms and change their footers.
Of course, modifying every form in a given database - particularly a critical one - means that there's also a proposal to be made, a test plan to be drawn up and followed and results to be checked.
Suddenly my little software upgrade project changed into an application modification project. It now needs to compete with our other schedules and it's probably going to blow out by a couple of years.
Thanks, IBM - that one I can lay squarely at your feet.
So, not giving up, I decided to go back to trying to get the 8.0.2 client to work. After a messy operation on my hard drive enabled me to free up a lot of space on C:, I started installing the client, only to be confronted with...
"The feature you are trying to use is on a network resource that is unavailable."
Yep. It was looking for my old 8.0 install files. They were long ago deleted. I couldn't proceed.
Again, this is a Microsoft Installer problem.
Luckily IBM had this documented for version 6 of Notes;
The Microsoft Windows Installer Cleanup Utility can be downloaded from;
I've run it, killed off the old Notes and finally installed 8.0.2.
I'm running it, for the time being, without my Anti-Virus. I've still got to find a way to reduce its effects. Hopefully it will be smooth sailing from here - though I'll still have to go through a series of fixes for the Domino server upgrade.
IBM, could you make the next upgrade a bit smoother - and while you're at it... perhaps using the Windows installer isn't such a great idea after all.
In any case, I decided to have a go at their survey - answering questions based on our usage of Lotus products. Note that we don't use Sametime at all yet, so I wasn't expecting a great result.
I rated at the top end of "Basic".
If there's anyone out there who is using the entire Lotus product range, I'd be really interested to see how it rates. The assessment takes about 1 minute - if you're a fast reader.
Microsoft Unified Communications Assessment
The assessment also comes with a download package containing case studies, whitepapers and other "goodies".
IBM, this is one area where you're obviously a lot stronger than Microsoft. How about a competing package and assessment?
Thursday, September 25, 2008
Yesterday afternoon, I spent a bit of time backing up my work PC so that I could try a few tricks to make the system partition larger. I keep all my data and apps on a different partition but nevertheless, they still like to dump files in the Windows directory on the system partition. In addition, there's the fact that Windows Update keeps wanting to increase its stranglehold on my C: drive.
I didn't want to do a reformat and reinstall because I would have to reinstall all of my apps too. I also didn't want to do an entire system image because that saves things in a proprietary format. I wanted to be able to get to my files on DVD if and when I wanted to.
My Partition Troubles
When I first installed Windows XP, I figured that 10GB would be sufficient space - especially since I upgraded from Windows NT, which had problems recognising system partitions over 10GB. How wrong was I? For the last few weeks, I've been engaged in a morning hard drive cleanup, removing the help files, old updates etc. from my C: drive, but within a week Windows Update has done its work and I start to run out of space again. What's that... turn off Windows Update? Well, I suppose I could - but that would be cheating.
In any case, I had a plan for this morning. I was going to ghost my C: partition to a server, then delete the partitions and create a new, bigger C:. It would fix all my problems at once.
So... I guess that everyone working in IT has those days where you wonder if you should have gotten out of bed. For me, today was one of them.
I kicked off the Ghost (version 2003) but figured that it was running a bit slowly. I decided to have a look at the server I was sending the image to. Immediately after logging on, I knew I was in trouble. The server had rebooted unexpectedly.
At first I thought it was Windows Update. After all, even when it's turned off, Microsoft seem to have a way to override it and download the occasional "really critical" patch. They also seem to think nothing of rebooting your production server. I guess they're a bit confused about ownership of the hardware.
Anyway, I noticed there was a "critical error" so I sent the data to Microsoft. Believe it or not, sometimes it returns useful information. In this case, Microsoft said (approximately), that the Hard Drive had a glitch but that it was "probably nothing to worry about".
Not trusting the Error Message, I decided to check things out in Event Viewer.
I noticed that there were a couple of warnings about impending drive failure shortly before the actual failure. I was faced with a dying drive.
A Relaxing bit of DRP
Now this server was an "ARCHIVE" server, meaning that the data on it didn't change. It just held old data and installation files, clip art, that sort of thing. It was excluded from our backup schedule but I had a copy of the data on a portable hard drive (and several copies on stacks of DVDs). Essentially, I could have rebuilt the server from scratch and then restored the data, but I was feeling particularly lazy... and I wanted to test a different type of server DRP. As it happened, it was also the same solution I was about to apply to my own PC (very coincidental).
Since the server was still functioning, and I already had a full copy of everything that was on drive D:, I only had to worry about drive C:. Now, on the server, there's not much there - just Windows Server 2003 and various service packs etc. My own PC, with its hundreds of applications and DLL files everywhere, is a different story altogether. I decided to try to get a ghost image of the server. If nothing else, it would save me locating all of those pesky service packs.
The server was still operating, so I checked the brand of the NIC and found an appropriate boot CD for it. I restarted the server, booted from the CD, logged into the network via the command line interface and mapped a drive to another server with oodles of space.
Creating the Ghost Image
Then I ran;
Ghost -NTIL -SPLIT=650
- The NTIL gets around an old NT problem and I use it out of force of habit. It's probably not needed anymore, but it does no harm either.
- The SPLIT=650 splits the files at 650 MB. This enables them to fit onto CD if I choose.
I'll probably use DVD but since some versions of Windows have problems with 4GB files (especially if there's a FAT32 partition involved), the 650 limit is a good choice.
In Ghost, I selected Local, Partition, To Image, gave the file a decent name and selected high compression. I wasn't really pressed for time.
The Ghost image took about 50 minutes to create. I'm not sure why it took so long - probably the compression and the fact that the source hard drive was a bit wonky.
When the image was finished, I turned the PC off and replaced the hard drive. It was IDE, because this wasn't a real "server" - it was a PC masquerading as one. Being a mere archive server, it didn't need to be terribly powerful.
Creating a New Partition to Ghost onto
The next step was to start the server and create a new partition so that Ghost would have somewhere to write. I knew that my old DOS boot disks wouldn't support partitions larger than 4GB, so I needed something else.
I decided to use the Windows XP Pro install CD. Sure, it might have been more appropriate to use the Windows Server CD, but I wasn't installing anything, and an NTFS drive is still an NTFS drive regardless of what software you used to create it. Also, the WinXP CD was closer to the top of the pile.
At the Windows XP Setup prompt, I pressed Enter, then F8 to agree to the licence.
Next, I was prompted to create a partition to install on (press C). I chose to make it 20480 MB (about 20GB) - I like my partition sizes in multiples of 1024. Upon pressing Enter, I was prompted to format it. Since I didn't need to, I pressed ESC to go back a screen, then pressed F3 twice to exit Windows XP setup.
I then removed the WinXP CD and reinserted my network boot disk - hurriedly, before XP rebooted the system.
Connecting to the network, remapping the drive and starting Ghost - this time with only -NTIL - I was able to ghost the partition back using these steps:
- Select Local Partition from Image
- Choose Image
- Choose Partition in Image file (there will be only one)
- Choose Drive
- Choose Partition
- Proceed with the restore (ignore the "data will be overwritten" warning)
Ghosting back took only 12 minutes. At the end of the process, I removed the boot disks and restarted.
The server went straight to the Windows 2003 Server logon. (Impressive - I expected this, but it's still nice to see).
From there, it was a quick trip to Disk Manager to create a new drive D: (and to format that drive, which took ages).
Restoring Data Files and Shares
Then I copied all of the files back from the portable hard drive.
Now, since I used a portable hard drive formatted as NTFS, the security on the files should have come back automatically. It didn't... Luckily, this was a flat file structure - they're archives, so the entire server is read-only.
And... since share information is stored in the registry... it should have followed that the share would have automatically worked... It didn't.
All up... an interesting bit of DRP showcasing an alternative method. Because of the issues bringing security back and the delay waiting for disks to format, it might not be applicable to servers requiring a speedy recovery, but it was sufficient to get our archive server back in half a day.
Repeating the Steps on My PC
So, next up, I'm going to be doing the procedure on my PC... only this time, instead of replacing the hard drive, I'll simply delete the existing partitions using FDISK - or I could do it during WinXP setup.
Thursday, September 18, 2008
Yesterday, we did some tests on Domino 8.0.2. I wasn't expecting any real problems given the ease of the last few Domino upgrades - up to 7.0.2. I guess I was wrong.
First of all, the initial attempt at installation failed with one of those non-specific error messages. I rebooted the server and tried again - no problems. I've seen this problem a few times though - it's something to do with the JVM being held open on Windows Server 2003 even though Domino itself has been closed.
Upon starting Domino, I chose not to upgrade the designs of the databases. Ideally you should not upgrade the designs until your last production server is on at least the same "major" version - in this case, 8.x.
I then did a bit of testing. As usual, I found no problems. Domino is one of the few systems I know which can be upgraded and tested (albeit roughly) in 15 minutes.
Then I found the problem. Our most critical database refused to render web pages. This was shown in Microsoft Internet Explorer as a general "dummy spit" message which told me nothing.
Luckily, I use Mozilla Firefox most of the time. I was only using IE for testing because;
a. Most of our clients use it
b. It had no passwords cached
Mozilla gave me a very different story - a proper error message.
HTTP Web Server: Application Exception - Duplicate Subform found. A given subform cannot be used on the same form more than one time.
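If you want to check your other databases before upgrading, the duplicate-usage test itself is trivial once you have each form's subform references (which you could pull out of a DXL export, for instance). A rough Python sketch of the check - the data structure here is my own illustration, not anything Domino gives you directly:

```python
from collections import Counter

def duplicate_subforms(form_subform_map):
    """Given {form name: [subforms it inserts]}, return the forms that
    use the same subform more than once - the pattern that Domino 8.0.x
    refuses to render on the web."""
    offenders = {}
    for form, subforms in form_subform_map.items():
        dupes = sorted(name for name, n in Counter(subforms).items() if n > 1)
        if dupes:
            offenders[form] = dupes
    return offenders

# A form using "PageTrim" as both header and footer gets flagged:
# duplicate_subforms({"Doc": ["PageTrim", "Body", "PageTrim"]})
# → {"Doc": ["PageTrim"]}
```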
I'd read about this problem - and I thought we'd checked that specific database for it. Apparently not.
Anyway - I wasn't going to compound the problem by attempting a last-minute fix. Time to rollback.
I decided to try a couple of tricks first.
First, I tried re-installing 7.0.2 over 8. In the first instance the install failed completely, but a retry after a reboot worked wonders.
I didn't really expect a simple re-install to fix the problem, though I would have been pretty impressed if it had.
The next step was to move (copy and delete) the entire Domino program directory - except for the data directory, of course.
I then used ArcServe (r12) to restore this folder from tape - but not directly to the original location, just in case I stuffed up and overwrote data or something. I restored to a different drive.
After the restore, I copied the files to the right location and rebooted.
Everything came up normally - problem solved.
I guess we won't be moving to Domino 8 until we resolve the subform issue on that database.
Wednesday, September 10, 2008
I was just wondering whether people have email problems because of poor implementations, poor policy, low expenditure, or because they're on other systems with less resilience than Domino.
The Problems Discussed
The problems mentioned in the survey were as follows;
- Outbound Confidential Material
- Archiving and Retrieval
- System Management Time
- Mail File Sizes
After struggling for a few years with the Symantec Anti-Spam solutions, I finally redirected our mail through a cleansing service. This service runs our mail through several different Anti-Spam solutions. Anything considered spam is sent to email@example.com while all other mail goes to its rightful recipient. All users at our company have rights to read the mailbox of firstname.lastname@example.org, and they know that if there's something they didn't get, they can search for it there.
Of course, the filters are so accurate that this never happens. (touch wood).
Outbound Confidential Material
We have an extranet system which is powered by Domino and which contains about 200 document databases. Most of our documents are intended for one group or another. When we send documents to these groups, we broadcast from the database. The recipients get a link which they can follow and log in to get their documents. If they forward the link to someone else, the document is safe.
If one of our administration people sends the document to the wrong group (difficult because the broadcast functions on our database fill in the group automatically), then the ACL of the database will prevent the wrong person from accessing the document. The problem then becomes one of embarrassment, not security.
Finally, if we have to send anything highly-secure one-to-one, then the attachment is PGP encrypted before we send it. We have policies in place to enforce this and everybody at our company - right up to CEO level - is expected to comply.
Archiving and Retrieval
We've got a Lotus Notes document management solution called AbilitySuite in place. It's been working well for years. It forces people to categorise emails according to companies, projects and other criteria. Some of our people are "lazy" and don't categorise everything but since we also use mail rules to simplify categorisation of emails we usually pick up the important ones.
In any case, if a mail doesn't get categorised, it doesn't matter too much - after all, since every email is journaled, we can still retrieve all emails to (or from) a given person between two dates if we get a subpoena. The other great thing about the AbilitySuite system is that it chunks our mails into monthly archives. If we need to get space back, then we just move the archives onto another server, back them up to optical media (several times), store copies on and off-site and then stop backing them up from then on.
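The subpoena scenario is really just a filter over the journal: anything to or from one person, between two dates. In Python terms, and with a made-up message structure purely for illustration:

```python
from datetime import date

def journal_search(messages, person, start, end):
    """Return every journaled message to or from `person` between
    `start` and `end` (inclusive). Each message is a dict holding
    'sender', 'recipients' (a list) and 'date' (a datetime.date)."""
    return [
        m for m in messages
        if start <= m["date"] <= end
        and (m["sender"] == person or person in m["recipients"])
    ]
```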
System Management Time
I have to say that even though our Domino systems run a huge number of applications as well as email and other things, it doesn't consume much system management time. I have no idea what the point of this topic is.
Perhaps it's that, in the days before our archiving solution and before our spam management was done by an external party, there was a lot of work to do. Well, erm... that's the past. It's not the case anymore - and anyone on a mail system which does require a lot of maintenance needs to look at their infrastructure and put in a better solution. Enough said.
Mail File Sizes
The aforementioned archiving capabilities of AbilitySuite have put this one to rest. Sure, I have a few users over quota, but do I care? No. If I wanted, I could delete all of their mail and they'd still not lose anything - because it's all stored in the mail archives.
I've only let people go over quota out of the generosity of my heart and because our other systems aren't affected by it. If it started to push our backup windows out, I'd get those users back down to quota.
I guess my point is simple - why, in this day and age, are we still talking about such things? If your email system doesn't make these points a no-brainer, then perhaps it's time to move to one which does.
Tuesday, September 09, 2008
If you're into social networking or find yourself frequently adding comments to articles around the net, then you probably have a significant web presence. Certainly more than you could handle doing spot checks via Google search.
What's great about Google alerts is that it doesn't show you all your web presences, just the new ones - or those recently updated. This makes it easy to see if someone is using your identity or to re-locate a site you put a comment on to check for follow-ups.
You might not be able to stop someone from falsely putting comments on sites using your name but at least you'll know about it and you'll be in a position to request its removal or post a follow-up correction.
If you have a Gmail account, then you already have access to Google Alerts.
How to Set up the Alerts
- Login to your Gmail or iGoogle Page
- At the top of the screen, click on More, then Even More and then on Google Alerts
(or you could just use this link - http://www.google.com.au/alerts?hl=en)
- In the Search Terms box, type your name.
Note: if you often post under pseudonyms or name variations, you might want to create some additional Google Alerts to search for them. Eg: rwilco
- You can leave most of the other defaults as-is and make sure that you have your email address in the last box.
- Click Create Alert.
That's it... now you just wait for those emails to come rolling in.
Monday, September 08, 2008
Microsoft, once an industry unto itself, no longer holds that coveted position. They didn't so much lose direction as fail to take the correct turns along the route to today's platforms.
While the world is heading towards open source, platform independence and service-oriented architecture, Microsoft is more tightly bound to proprietary systems on the Windows platform than ever before.
A little whinge about IBM's Marketing
So where is IBM in all this? Well, to be perfectly frank, they're not in the sweetest of spots even though they deserve to be. Why? Because although in my opinion, IBM's technology is more than a match for Google's, they haven't yet caught the attention of "Joe Public". Their marketing team has the "killer suite" on their hands but doesn't seem to know how to sell it.
What's weird is that it's been demonstrated over and over again that the way forward is to catch the attention of home users and get them to push the technology to business users. Apple and Microsoft both did this in the past by selling technology cheaply to schools. Microsoft made enormous gains with Outlook and Internet Explorer by making them free to home users. More recently, Apple has focussed initiatives like the iPod and the iPhone firmly on non-business users. These users frequently come to work and ask "when we're going to start buying Macs" because they're "so easy to use".
Also recently, the Open Source community has made a lot of headway with Mozilla and Open Office by making these things available to the public, while Google's initiatives in turning most of their apps over to "Joe Public" for testing have seen them take a position of dominance.
IBM on the other hand, has always taken a top-down approach. They believe that the business comes first and seem to have forgotten that all CEOs and CIOs have children, relatives and friends who will be very influential in their opinions of technology. There's no way that these people will be recommending Notes/Domino when they haven't ever seen the product. I'm not suggesting that the Notes client needs to be free but I am suggesting that a cut down version of it, (or a web version of it) with some basic online services available, would be a good start. If nothing else, IBM deserves a bit of recognition for Symphony.
Back on Topic
So, if Google's strategy represents the future, why then do I believe that IBM's technology is right on target? Well, if we consider the web browser (and in particular, Google Chrome) to be the glue in the Google "cloud computing" system... what then is the Notes client in the IBM strategy?
Have a look at this picture comparing the two;
At the top, we have Google Chrome, running a Home Page, Email (Gmail), Calendar (Google Calendar), RSS Reader (Google Reader), Document Editors (Google Docs), Web Pages and a Blogging Tool (Blogger).
At the bottom we have the Notes 8.0.2 client running.... a home page, Email, Calendar, RSS Reader (sidebar), Document Editors (Symphony), Web Pages (built-in browser), and a Blogging Tool (OpenNTF BlogSphere).
Funnily enough, the strategy is very much the same - and IBM got there first!
Even the home page on Chrome is amazingly similar to the IBM Lotus Notes Welcome Page.
I've left off a lot of other comparisons, like Google Talk vs Sametime and Google Desktop vs IBM Omnifind. I'm sure Google probably does a couple of extra things that Notes/Domino doesn't but I'm equally certain that Notes/Domino does a lot more that Google isn't able to do at all. In particular, databases, a development platform, replication, clustering and access control management.
I've often said that the main gap in the Google strategy is to provide a secure platform which can be hosted at a business site rather than online. I'm sure that such an appliance is in the works and will eventually emerge. In the meantime however, IBM already has a proven solution available and if an OS is required, they have Foundations as well.
Now, if only they could capture the attention of the common people.
Thursday, September 04, 2008
"Notes R5, 6, 7 and 8 Standard running concurrently on one Windows machine, and side by side the Notes Client, Domino Designer and Administrator of each version - no image manipulation (besides scaling), no VMs, no tricks... "
It's so impressive that it deserves a referral.
Well done guys!
Since the release of Google's new "web browser", Chrome, yesterday, the web has been buzzing with speculation about how Google will hurt Mozilla. The funny thing is that if you read the comic about Google Chrome, you'll see that it is being positioned more as an operating system than as a browser.
The idea is that the Web browser will become the operating system of choice for cloud computing.
The Google Chrome "browser" has certain advantages over the current generation of browsers, particularly in the areas of robustness and multithreading. The browser changes are similar, in a way, to the fundamental changes from Windows 3.1 to Windows XP.
Under Windows 3.1, all applications shared the same address space and one faulty application would result in the dreaded "General Protection Fault" message and would often pull the entire system down. I see that kind of behaviour all the time in Internet Explorer and slightly less often (but still frequently) in Mozilla.
The Google Chrome system treats every tab as an entirely separate application. This allows it to reclaim memory when a tab is closed (avoiding memory leaks) and allows individual tabs to be closed without affecting the rest of the system (the browser) whenever there is a problem.
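That isolation is easy to demonstrate with ordinary OS processes. A toy Python sketch - one "tab" per process, where a faulty tab exits with a non-zero code while the others finish cleanly (the page names are invented):

```python
import multiprocessing as mp

def _render(page):
    """Stand-in for a tab's renderer; the 'crash' page faults."""
    if page == "crash":
        raise RuntimeError("renderer fault")

def run_tabs(pages):
    """Run each page in its own process, Chrome-style, and report each
    tab's exit code. One faulty tab can't take the rest down, and
    ending a process hands all of its memory back to the OS."""
    procs = {p: mp.Process(target=_render, args=(p,)) for p in pages}
    for proc in procs.values():
        proc.start()
    for proc in procs.values():
        proc.join()
    return {p: proc.exitcode for p, proc in procs.items()}

# run_tabs(["news", "mail", "crash"]) → the "crash" tab reports a
# non-zero exit code while "news" and "mail" report 0.
```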
Google has also been talking about getting plug-ins to run in their own address spaces too. I don't believe that's been done in the beta that is currently available, but we've been told that it's coming.
Google Chrome is also a "self-protecting" browser. It frequently downloads updates to its list of malware and will prevent such applications from running.
The Mobile Connection
Finally, you can't overlook the connection between Chrome and Android (Google's Mobile Phone system). They're based on many of the same components with the aim of creating a truly cross platform cloud computing system.
Why is Mozilla Safe?
Google and Mozilla have long been on good terms. Both browsers are open source and I believe that Google is hoping that Mozilla will follow its lead and stabilise their browser too. This will prevent accusations of monopoly while achieving Google's real aim - stabilisation of browsers. Google's interests aren't in the browser market at all. They're firmly set on cloud computing - and for that, they need fast and stable web browsers.
Why Should Microsoft be Worried?
Microsoft is really the obvious target. Imagine operating systems such as Linux which run local operating system services, such as printers, fonts and screens, but do not require any applications. All they require is a web browser such as Google Chrome or Mozilla Firefox. You will notice that I have left out Internet Explorer because it is a Windows-only browser. At this stage, I haven't made up my mind on Safari or Opera, but I suspect at least the latter will be a possibility.
Now imagine that the browser runs on various platforms, on mobiles, on consoles such as the Xbox and Playstation and on appliances. The underlying system is unimportant - so long as it can communicate with the cloud.
So, we've started our Web browser and connected to our cloud computing system (possibly something like iGoogle). In this case, I will be using Google as an example however there are quite a few other cloud computing platforms available.
For my documents; Word Processing, Spreadsheets, Presentations, I can use Google Docs, (in place of MS Office), for Mail, Calendar and Instant Messaging, I can use Gmail, Google Calendar and GoogleTalk (instead of Outlook/Exchange).
If I'm on a commercial-level service, which Google does not yet fully support, I don't have to worry about backup, disaster recovery or security (access control and anti-malware).
Finally, there's a range of other online tools available to me, from Google and from other suppliers. These include blogging tools (like Blogger and Wordpress), graphics (like Picasa and Flickr) and file storage and sharing tools like GoogleBase. There are even online tools for converting to PDF via the web.
All that's missing are some good online project management solutions to replace Microsoft Project.
Why Should IBM be Worried?
IBM has less to worry about than Microsoft but Google's cloud computing initiative is a direct threat to Lotus Foundations. The concept is pretty much the same with the main differences being that IBM have a better array of collaboration tools and that IBM are better placed to take advantage of "mistrust" in Google. Most companies today will think twice before hosting all of their corporate data online.
Even so, it's only a matter of time - so IBM probably needs to put a bit more effort into Foundations and Cloud Computing Alternatives (a cloud version of Lotus Notes?) in order to stay relevant.
Wednesday, September 03, 2008
The download was fast (it's a fairly small file) and installation is simple... Too simple in fact. It doesn't give you much choice about where you put things. There's also not much available in the way of configuration at the moment. What was nice is that it correctly recognised Mozilla Firefox 3 as my main browser and imported my bookmarks etc.
One thing that may annoy is that it modifies the registry to run Google Update on Startup.
When it starts, it immediately gives you the option to change your search engine - or keep Google. That's a nice touch. Google is obviously trying to appear as if they're not a monopoly.
It's important to remember that this is a beta product and in no way represents the final product.
General Look and Feel
At this point it's very rough around the edges, though I'll admit that the tabs are nice - even if they do break Windows standards.
The browser itself is very fast, but I think this has a lot to do with the fact that there are no add-ins.
It incorrectly rendered my Google Bookmarks, which both Mozilla and Internet Explorer handle very well, and there's no Google Toolbar available for it as yet. Since these are the same reasons that I don't use Safari, it's got no hope of becoming my default browser until they're fixed.
My iGoogle Homepage is very busy, so I decided to compare browsers on it.
Notes on Test Results
It's important to remember a few things: Firefox has been my main browser for the past couple of years, though I still use IE a bit too. Both of these browsers may have had speed bonuses due to caching. Also, we're comparing against a beta product which hasn't been optimised yet, and the browsers are all running different levels of add-ins, with Firefox carrying the most, followed by IE. Safari and Chrome probably have none.
So, the playing field is by no means ... level.
Still, the results are interesting.
My iGoogle Page
- Safari v3.1.1 Memory used: 72,560
- Firefox v3.0.1 Memory used: 77,460
- Internet Explorer v7.0.5730 Memory Used: 67,764
- Google Chrome Beta Memory Used: 19,908 + 39,488 = 59,396
Chrome has the lowest starting memory usage, but this could be due to its lack of plug-ins. What's interesting here is that Chrome tends to create multiple instances of itself to manage memory (or perhaps it's multi-threading?)
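Because of that multi-process design, comparing Chrome with the others means adding up every chrome.exe working set, not just one. The tallying is simple enough; a Python sketch over tasklist-style rows, using the figures from my iGoogle test (the PIDs are invented):

```python
def total_memory_kb(process_rows, image_name):
    """Sum the working sets of every process with a given image name.
    Rows mimic `tasklist` output: (image name, PID, memory in KB)."""
    return sum(mem for name, pid, mem in process_rows if name == image_name)

# The two Chrome processes from the iGoogle test above:
rows = [
    ("chrome.exe", 3412, 19908),
    ("chrome.exe", 3488, 39488),
    ("firefox.exe", 1204, 77460),
]
print(total_memory_kb(rows, "chrome.exe"))   # → 59396
```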
Next I started Gmail and switched to the tab.
- Safari v3.1.1 Memory used: 103,948
- Firefox v3.0.1 Memory used: 100,092
- Internet Explorer v7.0.5730 Memory Used: 108,112
- Google Chrome Beta Memory Used: 28,228 + 20,104 + 10,004 + 39,804 = 98,140
Chrome still has the lowest memory usage, but what was interesting was that I noticed some timing differences. Both Mozilla and Chrome loaded Gmail fairly quickly, while IE took a little while and Safari took ages. I couldn't put it down to simple caching because there's no way that Chrome would have previously visited Gmail, and I know that IE certainly has.
Various Speed Tests
Finally, I decided to do a couple of speed tests.
Complex Text Rendering
- Safari v3.1.1 10 seconds Memory used: 107,828
- Firefox v3.0.1 11 seconds Memory used: 90,748
- Internet Explorer v7.0.5730 16 seconds Memory Used: 99,428
- Google Chrome Beta 16 seconds Memory Used: 29,024 + 45,696 + 5960 = 80,410
Lots of Pictures
- Safari v3.1.1 14 seconds Memory used: 111,888
- Firefox v3.0.1 17 seconds Memory used: 90,116
- Internet Explorer v7.0.5730 16 seconds Memory Used: 107,884
- Google Chrome Beta 12 seconds Memory Used: 21,132 + 18,684 + 46,644 = 86,460
Facebook
Note that while Internet Explorer and Firefox went all the way to my profile (because my password was cached), Chrome and Safari went to a page which was considerably less complex. This shows in their results - though the Safari memory issue is interesting.
- Safari v3.1.1 14 seconds Memory used: 111,624
- Firefox v3.0.1 16 seconds Memory used: 107,552
- Internet Explorer v7.0.5730 17 seconds Memory Used: 108,516
- Google Chrome Beta 9 seconds Memory Used: 28,380 + 46,400 = 74,780
Google Chrome was a clear winner in memory, but I'm sure that has a lot to do with add-ins.
To compare effectively, I ran Firefox in Safe Mode and got a figure for my starting page (iGoogle). Firefox went down from 77,460 to 50,660 which brings it below the Google Chrome memory of 59,396.
I can guess that something similar would happen with Internet Explorer - though to a lesser extent. What's interesting is that, aside from the front page, Internet Explorer always consumed more memory than Firefox. I think therefore that we can say Firefox has better memory management. In fact, considering that Google Chrome didn't go all the way to my profile in Facebook (and hence used less memory), Firefox seems to beat all the browsers in terms of memory use.
Safari was the clear winner in terms of speed, with the exception of a rather unexpected result in Facebook. Since I only took IE and Firefox all the way to my profile (ie: my login was cached), the Safari and Chrome results (14 and 9 seconds respectively) can be compared. This suggests that Chrome has the ability to beat Safari in speed on certain applications.
Anyway, that concludes my results for now. It doesn't prove anything really but it's interesting food for thought.
Oh... one last thing. Google Chrome is not currently compatible with iNotes (and rendered one of our Domino extranet pages incorrectly). Since IE, Safari and Firefox manage to render it properly, I guess I can't really suggest that anyone use Chrome seriously until the next beta.