
Request #548913

From: turboswami (Kaleb Smith)
Account details:
LiveJournal username: turboswami
Style (S2): core: public; i18n: none; i18nc: none; layout: public; theme: public; user: custom
Userpics: base + loyalty
SUP enabled: (blank)
Email validated: yes
Cluster: Shishkabob (#8); data version: 8
Design: new; friends page: friends
Language: en_LJ
Underage: no
JavaScript enabled: (unknown)
Request sent from Beta: (blank)
Photo hosting migration: done
Support category: General/Unknown
Time posted: Mon, 30 Jan 2006 12:16:50 GMT
Status: closed (1 point to freso)
Summary: Robot.txt Blocks Entry Saving
Original Request:
I have 5 years of entries and wish to archive them all quickly, saving them in a form that looks much like they appear online now. (The text/source version available through the site covers one month at a time, would take about 6 hours to archive all five years, and is hard to read.)

I purchased a "site saver" program, which is designed to save a given webpage, and all links contained therein up to a certain depth. When I attempt to save my Calendar page, I receive a "robot.txt file prohibits access" error.

I wish to remove or edit this robots.txt file so that I can quickly archive all of my writing to my hard drive in .html form. Do I need a paid account to do so, or is there some customization or different program available to me that would let me save everything I have written?
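For context, the kind of robots.txt rule that produces a "prohibits access" error in a crawler looks like the following. This is an illustrative fragment only, not LiveJournal's actual file:

```text
# Served at https://example.com/robots.txt (hypothetical)
User-agent: *
Disallow: /
```

A "site saver" program that honors the Robots Exclusion Protocol will refuse to fetch any path on a host whose robots.txt disallows it, regardless of whether the person running the program owns the content.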
Diagnostics: Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.7.7) Gecko/20050414 Firefox/1.0.3
freso (Frederik "Freso" S. Olesen)
Answer (#2151662)
Posted Mon, 30 Jan 2006 12:52:55 GMT
The above-referenced FAQ explains the steps you took to prevent your journal from being indexed by search engines; those steps are what cause robots.txt to block your program. To allow your archiving tool access again, reverse the steps laid out in the FAQ.
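The check the user's archiver performs can be sketched with Python's standard-library robots.txt parser. The robots.txt content and the user-agent string here are assumptions for illustration, not LiveJournal's actual file or the site saver's real identifier:

```python
from urllib import robotparser

# Hypothetical robots.txt content, similar to what a journal that has
# opted out of search-engine indexing might serve.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A well-behaved archiver runs this check before fetching each page;
# a False result is what surfaces as a "robots.txt prohibits access" error.
allowed = rp.can_fetch("SiteSaver/1.0", "https://example.com/calendar")
print(allowed)  # False
```

Once the opt-out is reversed and the Disallow rule no longer applies, the same check returns True and the archiver proceeds.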