Failed upgrade from SUSE 11 SP1 to SUSE 11 SP2

Hi,

I am running SUSE 11 SP1 as a Xen server with various VMs. I shut down the
VMs, booted off the SP2 CD, and completed the update. After the reboot
the system shows only SP1 and there is no option to boot into Xen. The command
uname -a shows the SP1 kernel, and there are no network interfaces.
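(For reference, a quick way to compare the running kernel with what is installed on disk; a minimal sketch, assuming the standard SLES 11 kernel package names:)

```
uname -r                            # the kernel actually booted
rpm -q kernel-xen kernel-default    # kernel packages installed under /
ls /boot/vmlinuz-*                  # kernel images present on the /boot partition
ls /lib/modules/                    # if uname -r has no matching directory here,
                                    # no modules (including NIC drivers) can load
```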

More of the puzzle:

The disks are partitioned as follows:
/dev/sda1 - /boot
/dev/sda2 - swap
/dev/sda3 - /

So I ran the upgrade again and noticed that the disk partition to be
updated is /dev/sda3. When selecting “Show all partitions”, I see that
/dev/sda1 (/boot) shows up as unknown.
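If the installer reports /boot as unknown, its filesystem signature may not be readable. A minimal sketch of how to inspect it from a rescue shell, assuming /boot is ext3 (the usual default on SLES 11):

```
blkid /dev/sda1           # prints TYPE="ext3" if a filesystem signature is found
file -s /dev/sda1         # raw look at the superblock
fsck.ext3 -n /dev/sda1    # read-only consistency check; -n makes no changes
```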

If I boot the system as an SP1 server, all my disks are 100% intact and accessible.

My question is as follows: if I do a “New Installation” of SUSE 11 SP2, will I
lose my existing VMs and the current disk partitions?

As always, any help would be great.

Mmmh, seeing as no one could help :slight_smile: I did this as a fix.

Copies of my VM images were stored on a separate disk partition. So after
trying to re-install and rescue the system, I decided to do a new install of
SUSE 11 SP2, but without changing or formatting the partition where the backup
images are stored. I then copied the VM images to /var/lib/xen/images.
Then I created a new VM and selected “I have a disk or disk image with an
installed operating system”. This loads the VM in seconds. I configured the
“new” VM server with the same name, bridge ports, etc. All up and working, and
one more server upgraded :wink:
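For anyone who wants to script the same recovery, here is a rough command-line equivalent (a sketch only: /dev/sdb1, /backup, myvm.raw and br0 are illustrative names, and virt-install's --import flag stands in for the “I have a disk or disk image” wizard option):

```
# Mount the backup partition and restore the images
mkdir -p /backup
mount /dev/sdb1 /backup
cp /backup/images/myvm.raw /var/lib/xen/images/

# Define a new VM around the existing image; --import skips the OS
# installation step and boots straight from the disk image.
virt-install --name myvm --ram 2048 \
  --disk path=/var/lib/xen/images/myvm.raw \
  --import \
  --network bridge=br0 \
  --paravirt
```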

Hope this helps someone …


Thanks for reporting back your solution.

The problem was probably that your separate /boot partition didn’t get upgraded together with your /. So when you tried to boot into the new system, you had the untouched SP1 /boot data (kernel and initrd) and an already-upgraded SP2 root filesystem. Since there were no kernel modules on disk matching the SP1 kernel you had booted (only the SP2 kernel and its modules were installed), no network drivers could load, which explains everything you saw.
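The mismatch is easy to see once you know where to look; for example (a sketch, with illustrative version strings):

```
uname -r             # e.g. 2.6.32.59-0.7-xen  <- SP1 kernel booted from /boot
ls /lib/modules/     # e.g. 3.0.13-0.27-xen    <- only the SP2 module tree exists
modprobe e1000       # fails: no module tree matches the running kernel
```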

I’m glad you solved it :slight_smile:

This is alarming. I’m just planning to do a (long overdue) SLES 10 SP3 → SLES 10 SP4 upgrade on a server that also has a separate /boot partition. Can I expect to have the same problem? Is there something special I need to do during the upgrade? I didn’t see any warnings about this in the SLES 10 SP4 release notes.

I did the upgrade last night and it went fine. The /boot partition was upgraded together with /.
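For anyone in the same position, a cheap pre-flight check before starting the upgrade is to confirm that /boot is mounted and its filesystem is recognized, so the installer updates it along with / (a sketch; adjust the device name to your layout):

```
mount | grep /boot     # /boot should appear as a mounted filesystem
blkid /dev/sda1        # TYPE= should show a known filesystem (e.g. ext3)
df -h /boot            # sanity-check free space for the new kernel and initrd
```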