Downloading Entirety of Lubuntu/Ubuntu Man-pages?
I know about this page, which is almost exactly what I want. Unfortunately, it is not current.
What I would like is to have the entirety of the Ubuntu man-pages in a nice, easy-to-read PDF format. I'll accept other formats, but I'd prefer an indexed PDF file for simplicity and portability.
I am also aware of HTTrack, which can pull down the pages in HTML format. There are a few reasons I wish to avoid this - the primary reason being that it's not really a nice thing to do to their bandwidth and servers.
I've searched the Ubuntu site, used an external search engine, and searched this site. I did find one answer that led me back to HTTrack, which is a potential solution but not the ideal one and, as mentioned, isn't very kind to their servers or bandwidth.
Even more special would be being able to get this specifically for Lubuntu, because there are a few differences in software and I'm an avid Lubuntu user, but, if need be, I can make do with just the Ubuntu man-pages.
The reason I want this is because, well, I'd like to read it - in its entirety. More like a book than like a file that is called up when needed. I want to be able to read it while I only have access to my phone, tablet, or other computing device, and in an easier-to-read format than man-pages typically use.
EDIT:
Specifically for Ubuntu (or Lubuntu) version 15.10, as noted in the tags and title. Also, yes - all the man-pages (even redundant and short ones). I'm aware that this is a lot of information, which is one of the reasons I'm trying to avoid using HTTrack.
lubuntu pdf 15.10 documentation manpage
asked Nov 18 '15 at 14:26 by KGIII (edited Apr 13 '17 at 12:24)
You could use wget with the --page-requisites option to download the pages... but you should already have all the manual pages for the packages you have installed, via the man command - you should be able to pipe the output of the man command to various file formats, which you can then read on various devices.
– Wilf, Nov 18 '15 at 15:27
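(A minimal sketch of the single-page conversion Wilf describes - man's -t flag renders a page as PostScript, and ps2pdf from ghostscript converts that to PDF; 'ls' is just an example page:)
man -t ls | ps2pdf - ls.pdf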
@Wilf Yeah, thanks. I want the man-pages for everything that's available for Ubuntu in the official repositories, installed or not. Therein lies the rub. I know, it's strange. It's still my objective and I've not yet found quite what I'm looking for online and readily available. I'm probably going to have to make it, from the looks of things. I'll be sure to upload it and share it when I'm done - that way it needn't be done multiple times and I can then keep it up-to-date in the future.
– KGIII, Nov 18 '15 at 15:30
3 Answers
Even more special would be being able to get this specifically for Lubuntu because there are a few differences in software and I'm an avid Lubuntu user but, if need be, I can make do with just the Ubuntu man-pages.
There are no differences in manpages between Lubuntu and Ubuntu. One of the points of becoming a recognized flavour is using the same repositories as Ubuntu, so the software is identical, it's only the starting points that differ.
Also, http://manpages.ubuntu.com suffers from a bug where identically named manpages from different packages aren't distinguished - only the manpage from the last package read shows up.
Instead of hammering the manpages site, hammer the repositories.
Get a list of packages that ship manpages for, say, the binary-amd64 architecture (the list should be identical for the other architectures):
mkdir temp
cd temp
# Contents-<arch>.gz maps every file in the archive to the package(s) shipping it
curl http://archive.ubuntu.com/ubuntu/dists/wily/Contents-amd64.gz |
    gunzip |
    grep 'share/man' |            # keep only manpage paths
    sed 's/.* //; s/,/\n/g' |     # keep the package column; split comma-separated lists
    awk -F/ '{print $NF}' |       # drop the "section/" prefix, leaving bare package names
    sort -u > packages.txt
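Before unleashing the loop below on the full archive, it may be worth a dry run on a handful of packages (the count here is arbitrary):
head -n 10 packages.txt > sample.txt    # then feed sample.txt to the loop instead of packages.txt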
while IFS= read -r package
do
    apt-get download "$package"
    # extract just the manpage tree from the .deb without installing anything
    dpkg-deb --fsys-tarfile "$package"*.deb | tar x ./usr/share/man
    mkdir "$package"-manpages
    find ./usr/share/man/man* -type f -exec mv -t "$package"-manpages {} +
    rm "$package"*.deb
    for page in "$package"-manpages/*
    do
        # man -t renders the page as PostScript; ps2pdf turns that into a PDF
        man -t "$page" | ps2pdf - > "$page".pdf
    done
done < packages.txt
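If you'd rather end up with one PDF per package instead of one per page, a possible follow-up - a sketch assuming the pdfunite tool from the poppler-utils package, which is not part of the answer above:
sudo apt-get install poppler-utils
for dir in *-manpages; do
    # pdfunite takes the input PDFs followed by the output filename
    pdfunite "$dir"/*.pdf "${dir%-manpages}.pdf"
done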
Of course, this is going to consume an insane amount of bandwidth - the repository servers are used to it; the question is: is your network up to the task?
– muru, answered Nov 18 '15 at 15:16 (edited 15:23)
My network can handle it. I've got a separate line and compute cycles that I can throw at it. :D I've got @kos in chat working on another one so it looks like this is going to get done. I'll upload and share the results, regardless of which way I go, so as to make sure it need only be done once. Also, won't there be differences in the man-pages as Lubuntu has different default software than Ubuntu, or is all software listed on the man-page site? Or am I missing something?
– KGIII, Nov 18 '15 at 15:22
For this approach, you will need html2ps, ps2pdf, and a working LaTeX installation. You should be able to install all requirements with
sudo apt-get install html2ps ghostscript texlive-latex-base
Once you've installed the required packages, run this to get the man pages as PDF files:
curl http://manpages.ubuntu.com/manpages/wily/en/man1/ |
    grep -oP 'href="\K.*?\.1\.html' |    # extract the section-1 page names from the index
    while read -r man; do
        wget http://manpages.ubuntu.com/manpages/wily/en/man1/"$man" &&
        html2ps "$man" | ps2pdf - "${man/.html/.pdf}"
    done
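Since the whole point is to avoid hammering manpages.ubuntu.com, you may want to throttle the fetches; a sketch using wget's own flags (the values are arbitrary), as a drop-in replacement for the wget line above:
wget --wait=1 --limit-rate=100k http://manpages.ubuntu.com/manpages/wily/en/man1/"$man"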
You should now have a (huge) collection of PDF files in the directory you ran the command in - which is why you should make sure to run it in a new, empty directory.
Now, to combine them into a single, indexed PDF file, you'll need LaTeX, and you'll need to rename the files because LaTeX doesn't like . in file names:
# turn e.g. ls.1.html.pdf into ls-1-html.pdf
rename 's/\./-/g; s/-pdf$/.pdf/' *pdf
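To preview the renames without changing anything, the Perl rename shipped by Ubuntu has a -n (no-act) flag:
rename -n 's/\./-/g; s/-pdf$/.pdf/' *pdf    # prints what would be renamed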
cat <<'EoF' > man1.tex
\documentclass{article}
\usepackage[colorlinks=true,linkcolor=blue]{hyperref}
\usepackage{pdfpages}
\begin{document}
\tableofcontents
\newpage
EoF
for f in *.pdf; do
    file="${f/.pdf/}"
    printf '\\section{%s}\n\\includepdf[pages=-]{%s}\n\n' "$file" "$f" >> man1.tex
done
echo '\end{document}' >> man1.tex
pdflatex man1.tex && pdflatex man1.tex
The result is an indexed PDF file of all the man pages (I only used 10 for testing). pdflatex runs twice because the table of contents collected on the first pass is only typeset on the second.
– terdon, answered Nov 18 '15 at 16:45
This one keeps stopping at Use of assignment to $[ is deprecated at /usr/bin/html2ps line 3409. It has stopped there twice now. Any ideas?
– KGIII, Nov 19 '15 at 17:49
@KGIII sounds like a bug in html2ps. I didn't encounter it in the first 10 I used for testing. You could try html2pdf instead, but I didn't find it in the Ubuntu repos. You might have to compile it.
– terdon, Nov 19 '15 at 18:09
Hmm... It's not letting me type the @ username for some reason. Anyhow - I'll compile that and test again later. :D Thanks. I was checking out your code and looking to see if there was a way to support resuming. :/ I don't know of a way to do so.
– KGIII, Nov 19 '15 at 18:11
@KGIII you can't ping me because the owner of a post always gets notified and there's nobody else to ping. As for resuming, just run the first code without the while loop, only the curl, grep and wget. You will then have all the html files locally and can play with them at your leisure.
– terdon, Nov 19 '15 at 18:47
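A sketch of the split-out, resumable variant terdon describes - wget's -nc skips files already on disk, -i reads the URL list, -B supplies the base URL, and the one-second -w wait is an arbitrary courtesy:
# fetch the index once and save the page list
curl http://manpages.ubuntu.com/manpages/wily/en/man1/ |
    grep -oP 'href="\K.*?\.1\.html' > pages.txt
# download; re-running resumes where it left off thanks to -nc
wget -nc -w 1 -B http://manpages.ubuntu.com/manpages/wily/en/man1/ -i pages.txt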
Thanks for the info - I've got them locally now. Now to convert 'em. This could be a minute.
– KGIII, Nov 19 '15 at 18:49
Updated based on Chat discussion
You want to get all the manpages in PDF (or HTML) form. No system that I am aware of, nor any distro for that matter, provides a complete set of all the manpages for your viewing. That'd be millions of pages, and my guess is it would need lots and lots of storage space, which would be useless to actually store. This would also hammer the manpages server, where the manpages already exist in a web-viewable format. If you really want to achieve this, you'd convert the manpages yourself and download all the packages in the repositories that contain manpages. (That's also equally evil.)
You should not have to read all the manpages, nor do I know why you'd want that many pages. (Imagine a seven-hundred-thousand-volume encyclopedia; that's effectively what you're asking for.)
This section was written before chat cleared up misconceptions about what was being asked
I don't think there's a 'PDF' for everything, but you can make PDFs for the manpages you wish to use.
There is a site which contains manpages for different Ubuntu releases, for the packages in the Ubuntu repositories. Assuming the manpage in question is from a repository package, you can open the manpage there, print it to a file (via either Print the Page or otherwise), and save it as a PDF, which you can then use elsewhere.
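If you want to script that print-to-PDF step for a given page, one option is the wkhtmltopdf tool (available in the Ubuntu repos; not something this answer originally suggested - a sketch, with grep's manpage as an arbitrary example):
wkhtmltopdf http://manpages.ubuntu.com/manpages/wily/en/man1/grep.1.html grep.1.pdf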
The downside: it's one manpage at a time - you'll have to spend time to get the ones you want.
(Note: to PDF all the manpages would be a hellish project, so I doubt you're going to find PDFs for every manpage you want... that'd usually be done by upstream, not Ubuntu.)
– Thomas Ward, answered Nov 18 '15 at 14:31 (edited 14:51)
They don't have to be in PDF format. Once I get the data (without hammering their servers, ideally) I can go about manipulating it. I'd prefer to avoid that much effort but I'll go through it, if needed. My main criterion is avoiding hammering their servers.
– KGIII, Nov 18 '15 at 14:38
@KGIII Unavoidable, unless you want to download and install all packages that have manpages. (Basically, Impossible Situation(TM).)
– Thomas Ward, Nov 18 '15 at 14:52
Following the chat, we're at a bit over 30,000 pages. Hrm... Maybe someone's already done this? It looks like I can slow the requests down with HTTrack and not hammer the server too badly. I may just have to live with an older version from the site linked in my question - I really don't want to beat on their servers without a better reason than "bathroom reading."
– KGIII, Nov 18 '15 at 14:56
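For reference, a throttled HTTrack run along the lines discussed above might look like this - a sketch with arbitrarily chosen limits (-A caps the transfer rate in bytes/second, -c the simultaneous connections, -%c the new connections per second, -O the output directory):
httrack "http://manpages.ubuntu.com/manpages/wily/" -A25000 -c1 -%c1 -O ./manpages-mirror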