Archive for the ‘XP’ Category


Driver to Distraction

November 19, 2012

Another whinge, I’m afraid, this time about Canon and their failure to provide driver upgrades for older peripherals. I don’t want to be forced to buy replacements for ageing but perfectly serviceable peripherals, much as I can understand manufacturers wishing me to. At least there is a happy ending to this tale.

The device in question is Canon’s LiDE 50 scanner, which I guess I acquired around five or six years ago. At that time I was running Windows XP. I think the driver support was still there when I switched to Vista, but I recall having considerable difficulty getting the device to run on Windows 7. That would have been around March 2010, but I recently hit the same issue again when I wanted to use the scanner with a newish Windows 7 Lenovo T430 laptop.

I remembered that on that occasion in 2010 I had been unable to install the driver from the supplied CD, had tried the manufacturer’s website and had discovered there was no Windows 7 driver available. I then did what anyone would (and what you have probably just done) and searched the ’net for a possible solution. What I found was that there was a Windows 7 driver for a slightly later model, the LiDE 60, which would still recognise and support my scanner. I’m not certain where I came across that particular nugget, but I still had the Canon driver on my desktop PC. The latter originally came from here.

I thought it would just be a matter of running the SetupSG.exe file as administrator, but it simply would not run. I could see it starting its WinZip self-extraction, but the extracted driver install program would not launch and (oh so helpfully) the temporary files were then deleted. Googling for a solution brought me here. It turns out that you have to install WinRAR and use it to do the extraction, then plug in the scanner, find it listed as an unsupported device in Device Manager, and use the driver update utility, pointing it at the newly extracted driver files. The driver update hung on the first attempt, but mysteriously “took” on the second, after a reboot.
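
For anyone who would rather script that dance than click through it, here is a rough sketch in Python of the same sequence. To be clear, this is not what I actually did (I went through the Device Manager GUI): it assumes WinRAR in its default location, the LiDE 60 package saved as SetupSG.exe in a folder of your choosing, and that the extracted files include an .inf that Windows 7’s pnputil can install. Every path below is an assumption.

```python
# Rough sketch of the manual procedure, scripted. Run from an elevated prompt.
# Assumptions: WinRAR at its default install path, the LiDE 60 package saved
# as SetupSG.exe, and an .inf somewhere in the extracted driver files.
import subprocess
from pathlib import Path

package = Path(r"C:\Users\Me\Desktop\SetupSG.exe")   # hypothetical location
dest = Path(r"C:\Temp\lide60_driver")
dest.mkdir(parents=True, exist_ok=True)

# 1. Use WinRAR to pull the driver files out of the self-extractor
#    (running SetupSG.exe directly just deletes its own temp files on failure).
subprocess.check_call([r"C:\Program Files\WinRAR\WinRAR.exe", "x", "-y",
                       str(package), str(dest) + "\\"])

# 2. Point Windows at the extracted .inf files. On Windows 7, pnputil -i -a
#    installs a driver package; alternatively use Device Manager's
#    "Update Driver Software" option and browse to the extracted folder.
for inf in dest.rglob("*.inf"):
    subprocess.call(["pnputil", "-i", "-a", str(inf)])
```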

I thought it would be plain sailing from there. The scanner was being recognised by the OS and was making scannerish noises at system boot-up. But when I actually tried to scan an image into Photoshop it failed with the error message “The program can’t start because rmslantc.dll is missing from your computer”. Searching for a fix for that took me to Aaron Kelley’s blog. Thankfully, the remaining steps were not hard and were well explained by Aaron. The scanner now works fine.
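
Aaron’s post has the authoritative steps, so I won’t repeat them here. Purely to illustrate the general idea – the missing DLL needs to be copied from the extracted driver package to somewhere the scanner’s TWAIN data source can find it – a sketch might look like the following. Both paths are assumptions on my part, not the real locations; follow Aaron’s instructions rather than this.

```python
# Illustrative only: locate rmslantc.dll in the extracted driver files and
# copy it next to the scanner's TWAIN data source. Both paths are assumed,
# not verified; Aaron Kelley's post has the authoritative fix.
import shutil
from pathlib import Path

driver_files = Path(r"C:\Temp\lide60_driver")       # hypothetical extract folder
twain_dir = Path(r"C:\Windows\twain_32\CNQL60")     # hypothetical TWAIN folder

matches = list(driver_files.rglob("rmslantc.dll"))
if matches and twain_dir.is_dir():
    shutil.copy2(matches[0], twain_dir)
    print("Copied", matches[0], "->", twain_dir)
else:
    print("DLL or TWAIN folder not found - see Aaron Kelley's post for the real fix.")
```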

All the same, it was quite a job to get there, needing a number of steps, a lot of Googling and even more perseverance. It really should not be that hard.


How to Press your TV into Service as a Video Podcast Player

October 21, 2012

Acquiring a TV that can connect to my home wifi, more specifically a Samsung Smart TV, has proven to be transformative.  Surprisingly so. I finally have a proper solution to a problem that has been bugging me for a long time, namely how to watch video podcasts on an HD TV. That is, as opposed to on a phone, tablet or computer, and as conveniently as if I were watching normal broadcast programmes.  It’s not that I spend a lot of time watching video podcasts; currently I only watch three shows a week. Still, when I do take the time to watch them I want to do so in comfort and with a minimum of hassle.

Before arriving at the complete solution, there were a couple of false starts.

False Start 1 – Laptop and HDMI cable

Our Samsung Smart TV, bought for the master bedroom to replace a dying cathode-ray TV, was not our first HD TV. We had already acquired a 42″ Toshiba TV for the living room, albeit not a Smart TV. My first attempt at “lean back” podcast viewing involved hooking up my laptop’s Mini DisplayPort, via an adapter and HDMI cable, to the Toshiba TV, having used iTunes to download my video podcasts to the laptop over the home wifi. This setup did work, in the sense that I could sit back in my armchair and watch my podcasts on the TV, but it was hardly a slick solution. The downsides were:

  • It was not trivial to get the laptop (running Windows XP) to recognise the TV and send a video signal to it.
  • The TV would cut out when I closed the lid of the laptop! If I left the lid open I could see the video in two places and found that disconcerting. After a fair bit of Googling and messing with the Windows settings I did manage to cure the problem.
  • I was forever having to use the TV’s own remote control to switch the picture size to “native” (as opposed to, say, “wide”), otherwise parts of the picture would get cut off.
  • I had no remote control for video playback! I was effectively using my TV as a PC monitor, so found myself having to use the mouse for play/pause/rewind, etc. The HDMI cable was too short to allow me to use the mouse from the comfort of my armchair, so I had to get up to pause the video if the phone rang.
  • I couldn’t really leave the laptop on and connected to the TV the whole time, so whenever I wanted to do some video podcast watching there was the faff of booting the laptop up, connecting the cable, often having to wait for my shows to download, and then having to disconnect it all afterwards.

False Start 2 – Android phone and MHL cable

When the Samsung Galaxy SIII was announced, one of the features that caught my eye was Allshare Cast. It allows you to mirror the phone’s display on the TV in real time, although you have to buy a specific Samsung accessory, a wifi dongle that plugs into the TV. This sounded like the ideal solution for my video podcasts, but by then I had already upgraded to a Samsung Galaxy Note, which does not support Allshare Cast. The Note does, however, support HDMI out, or at least MHL over micro-USB, which amounts to the same thing. The bottom line is that you can still mirror the phone’s display on a TV provided you get the right cable and adapter. It is a cheaper solution than Allshare Cast, but the phone has to sit close to the TV because of the cable, so again I was missing my remote. The beauty of Allshare Cast would have been that I could have kept the phone with me and used it, effectively, as a remote.

I had the idea of trying to use my old Android phone, an original Samsung Galaxy S, as a remote. I looked for apps that would allow me to control the Galaxy Note from the Galaxy S. The obvious choice would have been Droidmote, but that requires root and there is no way I was going to take a chance on rooting a Galaxy Note right near the start of a 2-year contract.

I also tried a curious app called Tablet Remote from Tournesol, which uses Bluetooth for inter-device communication and a custom keyboard on the “controlled” device to carry out the transmitted commands without the need for root. It is a bit of a fiddle to set up, but it did work very well for a day or so. Then the Bluetooth connection started generating errors and there was no recovering from that. I did have a dabble at writing my own Android apps to do something similar, but I have parked that since I now have a satisfactory solution.

The solution – Samsung Smart TV, Allshare and Juice

I bought the Samsung 22″ 1080p TV because I needed a new TV, not because I had a fix for my podcast problem in mind. And I bought a TV with Internet connectivity simply because more and more new models are offering this and there seemed no sense in investing in older tech just to save a few coppers. In truth, I was not sure what the benefits of a Smart TV really were. Very likely a lot of people buy Smart TVs because they are the “latest thing” but then just proceed to use them with broadcast TV, satellite or cable, which is what they are used to, without ever taking the time to explore the additional options brought by Internet access. Samsung do at least recognise this by featuring a very large, colourful and conspicuous button, right in the middle of the remote, to activate the “Smart Hub” screen. It just begs people to ask “What the hell’s that button for?” and maybe give it a whirl.

In my own case I have made considerable use of the Samsung’s Smart TV capabilities but it is not really the Internet access that made the difference. Wifi connectivity to other devices in my house has been the key to my podcast viewing, allied with support for the DLNA protocol. Samsung don’t refer to DLNA explicitly – they use the Allshare brand  – but it is just their own implementation of DLNA. Clearly they want you to buy lots of Samsung devices and connect them up using Allshare, which is understandable to a point, but this goes against the grain of DLNA which is all about ensuring interoperability between devices from different manufacturers for sharing of video, images and audio content over wifi.

The specifics of my podcast solution are as follows:

Source device

I have my video podcasts downloaded automatically to a selected folder on my desktop PC running Windows 7. Should anyone be interested, the shows I currently follow are from Leo Laporte’s This Week in Tech (TWiT) network, namely “All About Android”, “Before You Buy” and “Know How”. They all come out weekly, and the latter two are available in HD.

Podcatcher software

I’m using the Juice application, formerly known as iPodder. It looks a bit old-fashioned and clunky but it works very well.  I have it set up to delete the files automatically ten days after download.
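
Juice takes care of all of this, but the core of what any podcatcher does is small enough to sketch: fetch the RSS feed, pull out the enclosure URLs and download anything new into the shared folder. The feed URL and folder below are placeholders rather than my real settings, and this ignores Juice’s niceties such as scheduling and the ten-day clean-up.

```python
# Minimal sketch of what a podcatcher does: parse an RSS feed and download any
# enclosures we don't already have into the folder that WMP shares over DLNA.
# The feed URL and download folder are placeholders, not my real settings.
import urllib.request
import xml.etree.ElementTree as ET
from pathlib import Path

FEED_URL = "http://example.com/podcast.rss"          # placeholder feed
DOWNLOAD_DIR = Path(r"C:\Podcasts")                   # folder shared by WMP
DOWNLOAD_DIR.mkdir(parents=True, exist_ok=True)

with urllib.request.urlopen(FEED_URL) as resp:
    root = ET.fromstring(resp.read())

for item in root.iter("item"):
    enclosure = item.find("enclosure")
    if enclosure is None:
        continue
    url = enclosure.get("url")
    target = DOWNLOAD_DIR / url.split("/")[-1].split("?")[0]
    if not target.exists():                           # only fetch new episodes
        print("Downloading", target.name)
        urllib.request.urlretrieve(url, target)
```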

DLNA broadcast software

Surprisingly, all you need is Windows Media Player. If you activate the sharing feature, and include the relevant folder in your media library, then WMP will act as a DLNA server, making the files in that folder and its subfolders available for consumption by any DLNA client on the same wifi. Interestingly, I couldn’t make WMP recognise files sitting within the Windows “My Documents” tree, which is where my iTunes  music and videos are located. That meant I couldn’t use iTunes as my podcatcher unless I changed the default iTunes folder and moved all the content across. It was easier to use Juice and pick a download location that WMP could access.
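
Under the bonnet, Allshare and WMP are simply speaking UPnP/DLNA to each other. If you are curious how the TV finds the PC in the first place, you can watch the discovery step for yourself: a DLNA client sends an SSDP search over UDP multicast and every media server on the wifi replies. Here is a quick sketch of that standard discovery request – nothing Samsung-specific, and not something you need in order to use the setup.

```python
# Send an SSDP M-SEARCH for DLNA media servers and print whoever replies.
# This is the standard UPnP discovery step a DLNA client performs when it
# lists available sources - nothing here is Samsung-specific.
import socket

MSEARCH = (
    "M-SEARCH * HTTP/1.1\r\n"
    "HOST: 239.255.255.250:1900\r\n"
    'MAN: "ssdp:discover"\r\n'
    "MX: 2\r\n"
    "ST: urn:schemas-upnp-org:device:MediaServer:1\r\n"
    "\r\n"
)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3)
sock.sendto(MSEARCH.encode("ascii"), ("239.255.255.250", 1900))

try:
    while True:
        data, addr = sock.recvfrom(2048)
        print(addr[0], "answered:")
        print(data.decode("utf-8", errors="replace").splitlines()[0])
except socket.timeout:
    pass                                             # no more replies
```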

Accessing the video content

Even with the WMP application window closed, the DLNA service keeps running in the background. I can then press the bright, cube-shaped Smart TV button on my Samsung TV remote to wake up the Smart TV functionality. From there it is a matter of navigating to the Allshare icon, selecting it and choosing the “videos” option. My DLNA-enabled desktop PC appears in the list of sources. I select it, navigate to the folder holding my content and pick the show I want to watch. It buffers very briefly, then plays perfectly. Beautiful quality, no stuttering.

Remote control

I now have not one but two remote options. I can use the Samsung TV remote to play, pause and skip forwards or backwards in 15-second steps. Unfortunately, the 15-second interval is fixed. I can, though, navigate to any part of the show by using the “tools” button on the remote and then selecting “time search”.

An even better option is to use my Galaxy Note as the remote. If I launch the Allshare app on that I can again select the desktop PC as source, navigate to the show I want and then launch it directly from my phone.  I am presented with a dialog box asking whether I want it to play on the Note itself or send it to the Samsung TV for playback.  If I choose the latter, it plays perfectly on the TV as before but I can now use the Galaxy Note as the remote. The advantage is that I get fine control of playback navigation.  Instead of the 15 second forward/back, or the slightly clunky time search, I can navigate within the show to the second by swiping on the Note’s screen.

The upshot is that my podcasts are just there, available to be watched on my Samsung TV, very shortly after each episode is published. No faff, no hassle and I have full remote control for comfortable “lean back” viewing. Heaven.


Hasta la vista, Windows 7

October 30, 2009

For the second time in a matter of weeks I found myself unable to boot into Vista on my home desktop due to a file permissions problem. The tell-tale signs are becoming familiar. Boot-up starts as normal with the screen that has the pulsating green progress bar.

When that disappears we get a black screen and after a few seconds the mouse cursor appears in the centre. The disk continues to thrash for a few more seconds then settles, but we remain stuck looking at the mouse cursor on a black field. The black screen of death.

I believe the problem is that Windows has reached the point where it wants to write to the disk but is unable to because the file it is trying to access has been made read-only or otherwise had its permissions stripped away. You’d think that a booting OS would always have access rights but apparently not.

I wasted hours with SpinRite, thinking it must be due to a damaged sector. The only way out of this, short of reinstalling the OS, is to boot into a different OS, maybe on a different disk or from a CD, and then manually change the permissions on the files on the drive that won’t boot.
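
For what it is worth, if the rescue OS is Vista or later (XP only has the older cacls), that permission repair can be done from the command line with takeown and icacls rather than by clicking through the Security tab. The sketch below is the general idea, not a transcript of what I did, and the drive letter is whatever the stricken Vista volume happens to get in the rescue OS.

```python
# Sketch of repairing permissions on the unbootable Vista volume from a
# second OS: take ownership, then grant Administrators and SYSTEM full
# control. "D:" is an assumption - use whatever letter the Vista drive
# is given in the rescue OS. Run from an elevated prompt.
import subprocess

VISTA_DRIVE = "D:\\"   # assumption - check in Explorer/Disk Management first

# Take ownership of everything on the volume (recursive, answer "yes" to prompts).
subprocess.call(["takeown", "/f", VISTA_DRIVE, "/r", "/d", "y"])

# Give the built-in accounts full control back, with inheritance, continuing on errors.
for account in ("Administrators", "SYSTEM"):
    subprocess.call(["icacls", VISTA_DRIVE,
                     "/grant", account + ":(OI)(CI)F", "/t", "/c"])
```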

Ironically, it is the use of different OSes on different disks in the PC that seems to have given rise to the problem in the first place, particularly if one of those OSes is Windows 7, or at least the Release Candidate.

I have had two disks on my desktop for years. The larger one (250GB) is the Vista drive that came with the PC. I later added a 40GB drive salvaged from an older computer and for a long time had XP on it. I found I could switch between the two without problem. The BIOS allows you to choose which disk to boot from.

More recently, I used the 40GB disk to try out Windows 7 64-bit – first the Beta then the RC. All went well until my first “black screen of death” crisis. That sorry tale is recounted here. I blamed myself because I had meddled with the permissions on the Vista disk, but that was only to add permissions which seemed to have been “taken away” somehow without my intervention, making files inaccessible over the local network. I am starting to wonder whether Windows 7 was responsible in some way for messing with permissions on the Vista drive.

To my mind, an OS should not be making automatic file permission changes on other drives in the system. I’m not sure why, but I suspect Windows 7 does this. The first black screen crisis was resolved when I booted into Windows 7 and could see that the Vista disk had been stripped of permissions. I added them back manually from within Windows 7 and was then able to boot back into Vista.

A second black screen crisis happened a couple of days ago. I had (as on the previous occasion) booted into Windows 7 to play around with a few 64-bit apps. I tried to uninstall an older 64-bit app, but Windows 7 refused, claiming it could not locate the original MSI file. I then tried to return to Vista, only to find I was back to my black screen of death. Worse, I could not get back into Windows 7 either. That would start to boot and then spontaneously restart, in a never-ending loop.

I was forced to do a clean install of XP on the 40GB drive. I had no important data on that drive, so that wasn’t an issue. I could then see all the files on the Vista drive, so as a precaution I copied around 200GB of data to my 1TB external drive. I did notice that all the files came across with the read-only flag set, which seemed odd. As Vista continued to prove unbootable, even in safe mode, despite hours of SpinRite and other attempted solutions, I decided I would use XP as my main working system for the time being, so I started installing apps and device drivers. I also wanted my data available on the network, so I turned on file sharing. I noticed that when I shared the Vista drive it took a very long time and gave me a message about writing permissions. That got me wondering. I tried booting into Vista and of course it came right up as if nothing had happened.

As I was coming to realise, it was a variant of the permissions problem that had stopped Vista from booting, and the act of sharing the drive had restored the required permissions. It is, though, very worrying to think that Windows can so easily get itself locked into an unbootable state like this, with no easy way for the user to diagnose it and no solution that does not involve fixing the unbootable disk from a second OS on another drive.

I am hugely relieved to be up and running again, but extremely suspicious of Windows 7 and whether it has a tendency to make unwelcome interventions on other drives in the system, potentially jamming up other OSes which may be installed on them. Well, for now at least, Windows 7 has gone. Hasta la vista.


Honestly, Vista is fine nowadays … really it is!

October 23, 2008

The genesis of this blog was the plethora of troubles I had as a Vista early adopter. There was no wider agenda (I am no Apple fanboy); it was just a catalogue of genuinely unexpected problems from the perspective of a Windows user who had been hoping for great things from Vista.

I rapidly found myself in good company. There was no shortage of bloggers ready to put the boot into Microsoft’s new OS.

But that was then and this is now. As I have no axe to grind, and Vista is now perfectly fine and trouble-free on a day-to-day basis, I may as well say so, never mind how this blog got started.

I put the transformation in my life with Vista down to three things:

(a) beefing up the hardware a bit,

(b) the various Vista updates put out by Microsoft and

(c) what I call the personal learning curve.

Hardware

The hardware changes were the addition of 4GB of RAM, on top of the 1GB that shipped with the PC, and the introduction of a dedicated graphics card to replace the onboard graphics chip. The onboard chip was up to the job of running Aero Glass, although not gaming, but it ate into the limited RAM that came with the computer. The key benefit of the proper graphics card was to free up the whole of the initial 1GB, and the extra 4GB made a big difference too. Vista needs RAM to work well, and my biggest gripe is with the Dells and HPs of this world, who early on sold Vista PCs that were not really up to the job. They were branded as Vista ready, and technically they were, but they were not really powerful enough to cope satisfactorily with Vista’s RAM and graphics processing demands. Clearly, they were trying to maximise sales and profits, and coupling a new OS with the lowest possible price/spec would have seemed like the way to go.

Microsoft have suffered from the reaction to PCs which were barely up to the job of running Vista, with the OS getting the blame rather than the vendors for their avarice. Unlike Apple, Microsoft don’t directly control both software and hardware. But they could and should have seen this coming, and exercised better control over the branding of Vista machines sold by third-party vendors.

Vista updates

The various Vista security and performance updates have made a big difference too. Many people still point to Service Pack 1 as the turning point, but only because they had heard bad things about Vista, had already decided that the release of SP1 would make it all better and did not dip their toe in until then. Anyone who had been living with Vista on a day-to-day basis from the start would have seen a gradual improvement in behaviour from well before SP1, and the service pack itself would not have brought that much of a step change in the user experience.

My feeling is that there were still a lot of bugs in Vista, and poorly written components, when it was first released. The code that shipped was only up to the standard of a late Beta, but Microsoft were under pressure to ship because Vista was already running so late. This meant that a lot of code improvements that should ideally have been taken care of before final release were instead put out under the update system as fast as Microsoft could manage it, eventually getting the complete OS up to the standard everyone had hoped for on day one.

Some parts are still maybe not quite there. People talk about bloated kernels and the drawbacks of backwards compatibility, but I think that is quite wrong. I think the problems are in higher levels of the architecture, such as the Win32 layer. For example, I find it amazing that a third-party application such as TeraCopy can do a far quicker and tidier job of copying files around the PC, or between different drives, than the native Windows copy function.

Learning Curve

Finally, we have the personal learning curve. This includes simple things like sticking with User Account Control until it no longer grates. The longer you have your Vista PC, the less new software you need to keep installing, and UAC intrusions become fewer and fewer. It just stops being that much of a nuisance.

The rest of it is finding alternatives for the software that ran under XP but doesn’t run under Vista. Software vendors behave like PC vendors – they see the launch of a new OS as an opportunity to boost sales. So they don’t offer fixes to make existing versions of their programs Vista compatible; they try to sell you new, Vista-compatible upgrade versions at added cost. Again, Microsoft get the flak, but it is the third parties who are to blame. Over time users find alternatives.

Meantime, the public perception of Vista is badly tarnished … and Apple continue to trade on that in their successful Switch adverts, long after Vista ceased to be the pig it is portrayed as.

Microsoft have tried belatedly to repair the harm, with their Mojave stunt and their own new advertising campaign.  But it’s too late for Vista.  It will go down in technology history as a turkey, regardless.  Microsoft need to learn the lessons of the Vista debacle to be sure they don’t repeat them with Windows 7.  They are big and dominant enough to survive one poorly received OS, but two in a row would not be so clever.


So what’s actually wrong with Vista?

May 13, 2008

The answer is “not a lot”. If you buy a modern PC with a dual-core processor, at least 2GB of RAM and/or a decent dedicated graphics card (as opposed to an on-board graphics chip), you are likely to be wondering what all the Vista-bashing is about. There will still be some annoyances, such as User Account Control and some video codecs causing program crashes, but you will get an operating system that has a nice modern-looking interface and, well, generally does its job in an OK sort of way.

So how did Vista come to be the target of so much derision? To understand that, you need to see Vista not from the perspective of someone buying a typical Vista PC today, but from that of someone transitioning from XP a year ago.

Vista introduced new and very resource-hungry user interface technology. In truth, quite a hike in computing power was needed to run Vista properly. However, PC vendors were selling a lot of PCs that were “Vista capable” but only just, in order to keep prices down. The result was that buyers would spend good money on a machine that looked good on paper but performed like a pig.

Then factor in that the only significant new selling point was the 3D graphical interface technology, but it wasn’t used all that impressively and the novelty soon wore off.

Then factor in that early on Vista was plagued by incompatibilities, specific performance issues and bugs. No worse than XP had been in its day, but that was a long time ago and people forget.

Then factor in the long gestation, the sense of anticipation, the completely over the top hype from Microsoft set against the actual experience: a slow, buggy OS (when compared with a by now very mature XP) and just a bit of not very exciting eye candy to show for it.

Then factor in some general drift in public sentiment away from Microsoft for a variety of reasons.

Then factor in the age of the blog. Blogs like this one, started by real Vista early adopters who experienced all the things described above and now had the perfect medium to share their opinions with the world. Not that I’m claiming credit for denting Vista’s reputation single handedly. But all the blogs and negative press in general have, collectively, taken their toll.

So the Vista of today is pretty much alright, but the reputational damage is done. Microsoft will want to usher in Windows 7 (AKA Vista SP2) as soon as they can in the hope of leaving their Vista woes behind them. I’m not confident about their chances.


Learning to hate with ActiveSync

April 3, 2008

I try not to “do hate”. It’s a matter of personal philosophy. The very act of hating someone or something reduces us to the level of the objects of our hate.

But I make an exception for Microsoft’s ActiveSync. In that one case, hate is perfectly justified. In fact, no negative emotion directed towards it is too extreme.

Even during Vista’s endlessly-rotating-blue-bagel-riddled infancy I did not begin to come close to the desire for murderous revenge regularly engendered by ActiveSync, Microsoft’s lame effort at software for synchronising Windows Mobile devices with MS Outlook.

Since my family started using Windows Mobile devices in 2003 (the original O2 XDA and subsequent incarnations) I have synchronised with Outlook as infrequently as I think I can get away with. It has always been such an utter pain, from the frustration of getting a connection (USB, infra-red, bluetooth, wireless, piece of string with a plastic cup at either end … ActiveSync can fail to locate them all) to the unpredictable and alarming threat of synchronising in the wrong direction thus deleting all one’s new contacts and appointments … and latterly dismembering my laptop’s network connection capability.

Yes, ActiveSync rendered my XP Thinkpad unable to connect to a network via LAN or wireless. Violent, painful death would be a megillion times too good for it, could software but be subjected to torture and assassination.

It started when I upgraded my XDA Mini S to an XDA Stellar. I was in danger of making a second exception to my “no hate” rule for the former’s telescopic stylus which suffers from a congenital design fault and becomes very loose in its storage hole after a while. The stylus would fall out almost every time I picked the Mini S up unless I was very careful. I lost the two that came in the original box and two more from a pack of spares I had to buy from O2. I found myself going to great lengths to carry the phone upside down, to enlist some help from gravity in my stylus-retention challenge. Even so, people would keep finding random disembodied styli lying around the place and returning them to me.

Enough! It had to go, hence the XDA Stellar. A far better bit of kit anyway, and thankfully equipped with a non-collapsible securely stowable stylus.

O2 XDA Stellar

You’ve guessed the downside. I had to get my non-SIM contact details across to the new phone. I hadn’t used ActiveSync in months. I tried infra-red to connect. Slow, but experience had taught me it was less disaster-prone than the USB cable method. No dice. ActiveSync did not want to know. Reluctantly, like an utter fool, I resorted to USB. Not only did this fail to produce a connection, it caused an ActiveSync freeze-up and general computer crash which left my laptop bereft of any TCP/IP-based communication capability whatever.

It has taken me days to get any improvement. I have followed any number of Microsoft technical articles, checking settings and reinstalling parts of Windows. The biggest help has been uninstalling the Ethernet and wireless devices from the Control Panel and allowing Plug and Play to rediscover and reinstall them on a reboot. LAN and wifi are now both operational again, although the latter seems to take ages to settle down. It keeps losing the wifi connection and reconnecting every few seconds for the first 10 minutes or so after a reboot or a switch from the LAN connection.
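
For completeness, the stock remedies usually suggested for a mangled TCP/IP or Winsock stack on XP are the netsh resets. I am not claiming they were what turned the corner here – the Device Manager route was – but they are worth knowing about. A small sketch that runs both (the Winsock reset needs XP SP2 or later, and a reboot afterwards):

```python
# The two reset commands commonly suggested for a corrupted TCP/IP or Winsock
# stack on XP (the Winsock reset needs SP2+). Run elevated; reboot afterwards.
import subprocess

commands = [
    ["netsh", "int", "ip", "reset", "resetlog.txt"],   # rebuild TCP/IP configuration
    ["netsh", "winsock", "reset"],                     # rebuild the Winsock catalog
]

for cmd in commands:
    print("Running:", " ".join(cmd))
    subprocess.call(cmd)

print("Reboot for the changes to take effect.")
```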

Maybe my experience with the Windows Mobile Device Center in Vista will be better. I’m going to try that next, since there is no way I’m letting ActiveSync loose on my laptop again. Who knows? It might turn out to be the best reason yet to be grateful for Vista.


XP all the better for Vista’s long gestation

December 31, 2007

Having a pop at Vista is very much in vogue, but XP had its own problems when it first came out. It was shunned for a year by gamers because they couldn’t get it to perform. All new versions of Windows (or indeed any OS) are wont to require something of a “settling in” period.

While Vista is not the first version of Windows to be criticised by early adopters and reviewers, it has, though, been a particular disappointment, especially in view of the long wait followed by all that “The WOW starts Now!” hype. Maybe consumers are also getting less tolerant, generally having higher expectations and less patience. To make matters worse, Microsoft have lost a little of their “automatic choice” sheen. Perhaps this is down to Apple’s advertising campaign, and the emergence of strong alternatives to common MS software, e.g. Firefox vs Internet Explorer.

Possibly the biggest reason, though, is that Vista’s tardiness in materialising has given XP plenty of time to mature and ripen into a tried and tested performer.

Back in the day, people were used to a new version of Windows coming out every 2 or 3 years. Of course each new version had its teething problems, but as no-one had had the opportunity to enjoy a well-rounded, mature OS they got used to living with the odd niggle and incompatibility. That has changed, now. Users had grown comfortable and cosy with good old dependable XP, which had long since had all its wrinkles smoothed out. When buggy, niggly, underperforming Vista appeared on the scene it didn’t so much make a splash as give everyone a cold bath.

What we can learn from this is that OS vendors probably rattle out new versions too frequently in the normal course of events. They are driven more by sales than by user needs. There is nothing wrong with creating a good OS like XP and deliberately giving it a good run so that it can be tweaked to perfection, allowing users to enjoy it at its best for a few years. And when it is pensioned off, the replacement should be worthy of the hype, and should have been in alpha, beta and, if necessary, gamma for long enough that it is genuinely “ready for use” when it hits the shelves.
