Kernel panic not syncing VFS after updating Ubuntu server 17.10.1

I am really struggling to figure out how to fix the boot for my Ubuntu server after running some updates recently.
This is the error on boot:



[screenshot: boot error]



I can't seem to see any GRUB loader screen for repair, so I reattached the installation ISO to the VM and selected repair. I've tried a few options, but the result is always the same.



I can execute a shell on the root of what I think is my installation.
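
For completeness, here is a minimal sketch of mounting and chrooting into the installation manually from the rescue shell; the device names are assumptions based on the lsblk output below, so adjust them to match:

vgchange -ay                                      # activate LVM volume groups
mount /dev/mapper/AtlassianServers--vg-root /mnt  # mount the root logical volume
mount /dev/sda1 /mnt/boot/efi                     # EFI partition (device is a guess)
for d in dev dev/pts proc sys run; do mount --bind "/$d" "/mnt/$d"; done
chroot /mnt /bin/bash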



Running fdisk -l /dev/sda gives:



[screenshot: output of fdisk -l /dev/sda]



uname -r shows that I am on 4.13.0-21-generic, but I have newer kernel installations in the /boot directory. I don't know if that is because an upgrade failed, or because the currently running version is from the CD ISO.
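
A quick way to compare the running kernel against what is actually installed (run inside the chroot so it inspects the real system rather than the live CD):

uname -r                              # kernel currently running
ls /boot/vmlinuz-*                    # kernel images present on disk
dpkg -l 'linux-image-*' | grep '^ii'  # kernel packages dpkg considers installed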



I have tried apt autoremove to clean up, but there were a lot of errors. I thought that might be because there was no space left on the boot partition, but I believe there is plenty.
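
A full boot partition is a common cause of these errors, so it is worth confirming with:

df -h /boot /boot/efi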



I have also tried update-initramfs on several of the kernel versions, but I get errors such as:



[screenshot: update-initramfs errors]
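
For a specific installed version the invocation is roughly the following (the version string is an example; substitute one that actually exists in /boot):

sudo update-initramfs -c -k 4.13.0-38-generic   # -c creates a missing initrd; use -u to refresh an existing one
sudo update-grub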



I have tried the advice for a similar-looking issue here, including the mount options in Kernel Panic - not syncing: VFS: Unable to mount root fs on unknown-block(0,0), but I still get the same error.
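
As I understand it, an unknown-block(0,0) panic usually means the kernel came up without a usable initramfs, so one sanity check inside the chroot is that every kernel image has a matching initrd:

ls -lh /boot/vmlinuz-* /boot/initrd.img-*   # every vmlinuz should have a matching initrd.img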



The disks/partitions are not encrypted. This is all running in a bhyve VM on FreeNAS.



The result of lsblk is:

[screenshot: lsblk output]



When booting from the original CD I am asked what I want to connect root to; I select the AtlassianServers--vg-root option. It then displays a message saying there is also a boot/efi partition, which will also be mounted.
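
Since root lives on LVM, it also seems worth verifying that the generated boot entries and fstab actually reference that volume (the paths below are the standard locations, shown as a sketch):

grep 'root=' /boot/grub/grub.cfg | head -n 5   # kernel command lines should point at the LVM root
cat /etc/fstab                                 # / and /boot/efi entries should match lsblk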



I'm a bit stuck on what to try next. Is there a way to repair it that's fairly straightforward, or should I just install again over the top and keep the underlying files mostly intact?







asked Apr 18 at 21:32, edited Apr 19 at 9:50 – Zief (313)

  • I have even now tried to re-install Ubuntu from the original ISO (keeping the data). I can now see the GRUB screen, but I still get the kernel panic message, no matter whether I try 4.13.0-37 or -38, or any of the recovery modes.
    – Zief
    Apr 20 at 10:03
















1 Answer






Have you tried updating the initramfs for the running kernel?



sudo update-initramfs -u -k $(uname -r)
sudo update-grub
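
Note that if you are booted from the rescue ISO rather than the installed system, these commands need to run inside a chroot of the real root filesystem, and $(uname -r) will report the live kernel. In that case pass an explicit version that exists in the installation's /boot, for example (the version string here is illustrative):

sudo update-initramfs -u -k 4.13.0-38-generic
sudo update-grub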





answered Apr 19 at 10:07 – Crypto (1)
  • Hi, yes, with errors. I possibly updated this as you were replying; there is a screenshot of the errors received.
    – Zief
    Apr 19 at 10:55










  • What I mean is it failed with a similar error to the one shown. I previously tried it for the running 4.13.0-21-generic, and it errored too. I have since used autoremove (possibly stupidly?) and it has left only the 4.13.0-37 and 4.13.0-38 versions in the boot directory, but it hasn't cleanly removed version 21. Now when running update-initramfs on version 21 there are errors with depmod, as there are missing directories. I assume it is only running version 21 because that is on the CD anyway? Perhaps I am wrong and that was the version being used, and the others failed; I'm not really sure at this stage...
    – Zief
    Apr 19 at 11:00









