Mar 31 2012
 

A few weeks ago, I tossed Mint on an old machine to give it a try. I was looking for something to replace my Ubuntu 10.04, and I’m not a fan of Unity at all. Mint, with its extensions to Gnome 3, seems to give the best of the new paradigm without changing everything I like about the old.

So last night I backed up my computer and began the reinstall. I had issues even booting the Live DVD on my main computer. After some internet searching, I resolved this by:

  • Pressing e at the bootup screen
  • Adding nomodeset to the kernel boot line, just before the trailing --
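For reference, the edited line ended up looking roughly like this; the path and the other parameters here are only illustrative and vary by release, but the important bit is that nomodeset sits before the trailing --:

linux /casper/vmlinuz boot=casper quiet splash nomodeset --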

I was then able to boot and install the OS. This didn’t surprise me, as I’ve had to do similar tricks over the years to install OSes. Normally, once everything is installed, we’re good to go.

Oh boy, not this time. After the install and reboot, I could not get X to load. Using the same trick as above, I was able to get into a GUI, but video performance was lackluster since no drivers were loaded. I attempted to install the restricted drivers; they would install, and then on reboot I’d have nothing again. I have noticed similar problems on other computers of mine with NVIDIA cards and AMD CPUs. I was determined to find a fix this time.

Instead of using the drivers from the repos, or the open source nouveau driver, I decided to go straight to the source and download the driver from NVIDIA. You can’t install this from within X, and you will also need some extra packages, so before you start, make sure that:

  • You have all of the compilation tools installed: sudo apt-get install build-essential
  • You have the header and development packages needed for your running kernel.
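On a stock Mint or Ubuntu kernel, that usually boils down to something like the following (the header package name can differ if you run a non-standard kernel):

sudo apt-get install build-essential
sudo apt-get install linux-headers-$(uname -r)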

Once those are installed, you need to stop X. You cannot simply kill the X process, because it will start itself back up, and you can’t really drop to runlevel 3 or 1: in the Debian world, runlevels 2 through 5 seem to be the same, and runlevel 1 may not run some of the system services required to install the driver. Many earlier tutorials mention running these commands to stop X for GNOME:

  • sudo /etc/init.d/gdm stop
  • sudo /etc/init.d/gdm3 stop

Or KDE:

  • sudo /etc/init.d/kdm stop

But on Mint, you need to use:

  • sudo /etc/init.d/lightdm stop

Once I ran that, I was dropped to a terminal, and I could run the NVIDIA installer as per the instructions. It disabled the nouveau driver and compiled my driver. After a reboot, all was good in the world.
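From memory, the whole sequence from the console looked roughly like this; the installer filename below is only a placeholder for whichever version you download from NVIDIA:

sudo /etc/init.d/lightdm stop
cd ~/Downloads
chmod +x NVIDIA-Linux-x86_64-295.33.run    # placeholder filename; use the file you actually downloaded
sudo ./NVIDIA-Linux-x86_64-295.33.run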

I have a few thoughts on a simpler process. It is possible that the nouveau driver was conflicting with the NVIDIA driver. If I had just disabled or blacklisted the nouveau driver, the restricted drivers install wizard might have worked fine.
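I haven’t tested that route, but the blacklist itself would look something like this (followed by letting the restricted drivers tool install the NVIDIA driver and rebooting):

echo "blacklist nouveau" | sudo tee /etc/modprobe.d/blacklist-nouveau.conf
echo "options nouveau modeset=0" | sudo tee -a /etc/modprobe.d/blacklist-nouveau.conf
sudo update-initramfs -u    # rebuild the initramfs so nouveau is not loaded at boot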

Here is a list of the pages I consulted to come to my resolution:

Dec 04 2010
 

I was trying to use the Knowledgetree automated installer on a fresh install of Ubuntu 10.04.01 64-bit server. While trying to install, I received an error about swftools, which I was able to install thanks to the directions here (in post 2):
http://forums.knowledgetree.org/viewtopic.php?f=6&t=20633

However, I then received an error that read:
Failed to fetch http://repos.zend.com/zend-server/deb/dists/server/non-free/binary-amd64/Packages.bz2 Hash Sum mismatch
This was followed by a few Zend packages that couldn’t be installed.

After much purging and googling, I found that this worked well:

  1. Go to /var/lib/apt/lists/partial/ and delete the files that failed to download. (If you are curious, I looked at the file that was supposed to be correct in the partial directory and noticed it was still encoded. I suspect an incorrectly expanded archive file was causing the problem. Some people reported that simply deleting these files fixed the problem, but it did not help for me.)
  2. Go to the root URL of the repo that had the hash problem. For me it was http://repos.zend.com/zend-server/deb/dists/server/non-free/binary-amd64/
    • Here I noticed several files listing the packages. The file the update failed on was the .bz2, but there is also a plain-text one available called Packages.
    • I clicked on Packages and copied all of the text.
    • On the server, I created a new file in the /var/lib/apt/lists/ directory with the same name as the file that failed/was still encoded in the partial directory (sudo vim repos.zend.com_zend-server_deb_dists_server_non-free_binary-amd64_Packages) and pasted all of the text in there.
  3. Then I ran apt-get update and it ran without errors.
  4. Finally, I ran the Knowledgetree install again, and it downloaded all of the required packages fine.

I imagine this would work on any repo that is having a similar problem with a Hash Sum mismatch, but of course the URL you visit and the file name you use will be different.
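Condensed into commands, the fix above comes down to roughly the following. I did the copy by hand through a browser and vim, but fetching the plain-text Packages file directly amounts to the same thing; the long file name is the one apt derives from the repo URL, so it will differ for other repos:

sudo rm /var/lib/apt/lists/partial/repos.zend.com_*
wget http://repos.zend.com/zend-server/deb/dists/server/non-free/binary-amd64/Packages
sudo cp Packages /var/lib/apt/lists/repos.zend.com_zend-server_deb_dists_server_non-free_binary-amd64_Packages
sudo apt-get update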

Jul 10 2009
 

I just noticed this today. I monitor a number of sites and servers using the great Xymon system monitoring tool (Linky). It is based on early Big Brother code. Anyway, today I noticed that the graphs were all blue. Something like this:

Bad Xymon Graph

I’m using Xymon 4.2.3 on the servers, and both servers show the graph like this using Firefox 3.5 on Linux. Google Chrome development version, SeaMonkey 1.1.5, Epiphany and Galeon all display the graph correctly.

I have no solution, but I thought I’d post my experience.

Jul 07 2009
 

Note:

Instead of editing my previous post to pieces, I thought it might be best if I repost my script with a better description of how it works. I’ve written up documentation of the script on the CornEmpire Software wiki which is available at http://wiki.cornempire.net. The documentation directly related to this script is available at http://wiki.cornempire.net/doku.php?id=d2l:bulkdeactivate

Introduction

One of the most often requested features is the ability to bulk deactivate courses in D2L after they have completed. You can currently deactivate courses, but it is a one-at-a-time effort. This has become more important since version 8.3 of D2L brought along a new My Courses widget, which allows users to see updates from each of their courses. Unfortunately, when course access ends, they can still see the updates. This has caused confusion for many users.

Another reason to deactivate courses is to minimize the amount of clutter in the users’ view. If course access has ended and they cannot access the course, there is no need for it to remain active (this could vary depending on how your roles are configured). A way around this is to script the bulk deactivation of courses. Through the use of some JavaScript, we can instruct the web browser to visit each course and deactivate it for us.

See the script in action.

Would you like to see more? I have the script and its documentation available on this page: http://wiki.cornempire.net/doku.php?id=d2l:bulkdeactivate

If you have any questions, please post them below.

Oct 18 2008
 

I was trying to fix a friend’s computer recently.  It was a Windows XP machine which would boot up (in both normal and safe mode) and give this error from services.exe:

This application has failed to start because MSVCP60.dll was not found.  Re-installing the application may fix this problem.

And then completely stop loading.  This appeared to begin after a botched SP3 install through automatic updates.  None of the normal tricks for getting into Windows would work, so I booted to an Ubuntu live CD (7.04) and took a look at the hard drive.

All of the information was still there, which was good news.  I found a copy of MSVCP60.dll on the hard drive, but it wasn’t in the C:\Windows\System32 folder, hence the error.  So I tried to copy it, but the Ubuntu 7.04 Live CD cannot write to NTFS partitions.  Bummer.

So, I copied the file from another computer to a USB key, plugged that into the machine running the Ubuntu Live CD, copied the file to a floppy, rebooted into the Windows Recovery Console (from an old Win XP CD I had) and tried to copy the file from the floppy to the hard drive.  However, I was getting “Permission Denied” errors.

So, I took a look at the C: and sure enough, Windows Recovery Console could not read any files there.  Another bummer.

So, I started downloading a new version of Ubuntu (8.10 beta) in order to get read/write NTFS in the live CD.  While I was waiting, I thought I’d try to fix the boot sector on the broken machine.  I poked around for a bit.  Tried chkdsk but got an error about ‘one or more unrecoverable file system errors’.  Then the fixboot command looked interesting, so I ran it.

DANGER, DANGER!!!

It said it fixed the boot sector, and threw a nice FAT16 table at the beginning, after autodetecting what it was.  Huh?  Now when I search the C:, I get a message, “No Files Found”.

So, when I finally got Ubuntu downloaded and burned, I rebooted and took a look at the partition.  Double click… and nothing.  Can’t read the media.  I opened it in the GParted partition editor, and sure enough, a nice 300GB FAT16 partition that cannot be read.  At this point I began to get worried.

I decided to take the machine home for the night, as I had more tools available there.

I knew that there was a backup boot sector on the drive, and what I had to do was restore it.  But with what?  I did some searching and came across TestDisk.  I had used this in the past to try to recover an ext3 partition, but with no success.  Then I started poking around for a rescue CD that included TestDisk.  Ubuntu Rescue Remix is one of the rescue disks I came across.  I also downloaded Knoppix, but URR finished sooner because it was a much smaller ISO, so my success was with that one. 😉

Booted on the rescue disk.  It gives you a nice console.  Ran:

sudo testdisk

And started to get to work.  The drive was detected properly, which is always a good start.  I then scanned for partitions: there were three listed in the table (there was actually only one on the drive), and they overlapped, which is not possible.  The application knew something was wrong, did a slightly deeper scan, and detected one NTFS partition that took up the whole drive.  I wrote this information to the disk.
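For anyone retracing this, TestDisk is entirely menu-driven; my session went roughly like this (assuming the disk shows up as /dev/sda, as it did for me when mounting it later):

sudo testdisk /dev/sda
# menu path, approximately: create a log -> Proceed -> [Intel] partition table
# -> Analyse -> Quick Search, then the deeper search -> select the detected NTFS partition -> Write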

I think I actually had fixed it at this point, but I kept on poking around.  I went back to the main menu, and went to the advanced tools.  I took a look at the boot sector recovery.  The detection showed both the master and backup being the same.  Not sure if when I rewrote it, it rewrote the backup too (that would make sense), but after writing once, the backup seemed unavailable as an option.

I then scanned the MFT to rebuild it if needed, but it was identical to what the application was going to reinstall, so there was nothing to do there.  I rewrote the boot sector again, and then rebooted.  (You have to take the CD out at this point; the boot menu gives you the option to boot to the hard drive, but that wouldn’t work for me.)

I rebooted and got this error:

This application has failed to start because MSVCP60.dll was not found.  Re-installing the application may fix this problem.

Yeah!!!  Windows lives.  So, I rebooted, went into the new Ubuntu Live CD (8.10 beta) I had burned earlier and tried to access the drive.  No go.  Windows didn’t let go of it properly when it locked up.  Which isn’t a surprise really.  Ubuntu told me what command to use to force it to mount:

sudo mount /dev/sda1 /media/testdisk -t ntfs-3g -o force

I’m pretty sure that was it (from memory now).  /media/testdisk is a directory I created in order to mount the drive.
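Spelled out, and assuming the Windows partition really was /dev/sda1, that came down to something like:

sudo mkdir /media/testdisk    # the mount point I created for the NTFS partition
sudo mount /dev/sda1 /media/testdisk -t ntfs-3g -o force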

Mount successful.  File copy successful.  Rebooted, and Windows loaded!

It looked horrible.  The graphics driver was missing, probably killed in the botched SP3 upgrade.  There were still missing file errors, but I was able to get to the desktop.  I copied the files that were missing from somewhere else on the drive back to C:\Windows\System32 to make those errors go away.  I tried to download new video drivers, but Flash was also corrupted beyond repair; it crashed all of the browsers when they loaded, and all the websites that had the drivers (and the default homepages) used Flash.  So I uninstalled Flash, downloaded new Flash and video drivers, installed both, and all is good.

The moral of the story: NEVER USE FIXBOOT!