All automated bots, spiders, crawlers, data-miners, etc. that access information on LiveJournal.com are subject to the following policy. If you have been redirected to this page, it is because we believe you to be in control of an automated bot. If that is not the case, please contact us at email@example.com.
We provide a variety of user data in standard XML formats, namely:
A user's recent entries syndicated using the Really Simple Syndication (RSS) XML format. It's available with public entries only at:
If you want security-restricted posts included and you have access to view them, you may request the auth-required version of the feed using HTTP Digest auth, using:
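As a sketch of the auth-required fetch, the standard-library `urllib` can answer an HTTP Digest challenge. The feed URL below is a placeholder, since the real URL is the one listed above for your account:

```python
import urllib.request

# Placeholder URL; substitute the auth-required feed URL from this page.
FEED_URL = "https://www.livejournal.com/users/exampleuser/data/rss"

def make_digest_opener(url, username, password):
    """Build a urllib opener that responds to HTTP Digest auth challenges."""
    password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    password_mgr.add_password(None, url, username, password)
    handler = urllib.request.HTTPDigestAuthHandler(password_mgr)
    return urllib.request.build_opener(handler)

# Usage (performs a real network request, so it is commented out here):
# opener = make_digest_opener(FEED_URL, "exampleuser", "secret")
# xml = opener.open(FEED_URL).read()
```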
A user's recent entries syndicated using the Atom XML format. Available at the URL with or without auth:
Update Stream
For a live stream of all LiveJournal posts, check out:
A user's information page using the Friend of a Friend XML format. Available at the URL
A line-separated list of usernames which are friends or friends-of a user. Available at the URL
A user's interests in line-separated format. Available at the URL
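Parsing the line-separated formats above is a one-liner; this small helper (a suggestion, not part of any official client) also drops blank lines and stray whitespace:

```python
def parse_line_list(body):
    """Split a line-separated response body into a clean list of entries."""
    return [line.strip() for line in body.splitlines() if line.strip()]

# Example: a friends-list response with a trailing blank line.
names = parse_line_list("alice\nbob\n\n")
```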
You are encouraged to use these resources instead of "screen-scraping" user pages.
Rates & Limits
You are encouraged to cache the results of your bot's requests, which saves us bandwidth and CPU time. Bots making repeated requests for the same resource (URL) in a short amount of time will be blocked. Please do not multithread your bot to access multiple resources at the same time, and do not connect more than five times per second.
Well-Formed User Agents
All bots are required to have a well-formed user agent which includes a contact email address for the bot maintainer, and preferably a URL to the organization running the bot. Bots without this information have a higher chance of being blocked. An example of a well-formed user agent is:
Bot - http://example.com/ljtoy.html; firstname.lastname@example.org
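Setting such a user agent with the standard library looks like this; the name, URL, and email below are placeholders to be replaced with your bot's real maintainer details:

```python
import urllib.request

# Placeholder identity: use your bot's real info URL and contact email.
USER_AGENT = "Bot - http://example.com/ljtoy.html; firstname.lastname@example.org"

def make_request(url):
    """Return a Request that identifies the bot as this policy requires."""
    return urllib.request.Request(url, headers={"User-Agent": USER_AGENT})

# Usage (no network activity until the request is opened):
# req = make_request("https://www.livejournal.com/")
# body = urllib.request.urlopen(req).read()
```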
If we've blocked your bot and you'd like to contact us about it, please email us at email@example.com.