2718.us blog » livejournal
http://2718.us/blog
Miscellaneous Technological Geekery

Reposting from the AML TempSite
http://2718.us/blog/2009/09/06/reposting-from-the-aml-tempsite/
Mon, 07 Sep 2009

This is not likely to be of interest to many people, but anyone who used uJournal (uJ) or AboutMyLife (AML), which absorbed uJ after its demise, should know that a temporary site has been up at http://aboutmylife.net/tempsite/ at which one can get a very bare dump of one’s entire journal. Those interested may also want to take all those entries and post them into their current journal. Here is a process for doing that.

THIS INFORMATION IS PROVIDED AS-IS WITH NO EXPRESS OR IMPLIED WARRANTY. USE AT YOUR OWN RISK. It worked for me, but who knows what that may mean for you.

Requires: Python 2.something (maybe 2.4 or later?). The Python bundled with Mac OS X 10.4 works fine, as will most current Linux/Unix systems, I think.

  1. Go to the AML tempsite, log in, and save the file that shows up (which is all your entries, but totally lacking formatting, etc.) as “entries.html”
  2. Download pyLJxmlrpc.py from Google Code (I just put it there; I wrote it), save it in the same directory as entries.html
  3. Copy/paste the following into a file (I called it “processEntries.py”, but the name doesn’t really matter), and change USERNAME and PASSWORD to the username and password of the account to which you want to post. (You can also change “www.livejournal.com” to another journal site; it should work on any LJ-based site that supports the XML-RPC protocol.) Line wrapping and whitespace are important.
    
    #!/usr/bin/python
    # Python 2 script: split the AML tempsite dump into entries and repost them via XML-RPC.

    import re

    # Read the raw dump and split it into one chunk per entry, using the
    # table markup the tempsite puts between entries.
    f = open('entries.html')
    s = f.read()
    a = s.split('</td></tr><tr></tr><tr><td width="25%">')

    # Each chunk starts with a timestamp cell, then the subject cell, then the body.
    r = re.compile(r'([0-9]{4})-([0-9]{2})-([0-9]{2}) ([0-9]{2}):([0-9]{2}):[0-9]{2}</td><td width="75%">(.*)</td></tr><tr><td> </td><td>(.*)', re.DOTALL)

    processedEntries = {}
    for e in a:
        m = r.search(e)
        # Key each entry by its date/time so the entries can be posted in chronological order.
        t = "%s-%s-%s %s:%s" % (m.group(1), m.group(2), m.group(3), m.group(4), m.group(5))
        processedEntries[t] = {'year':m.group(1), 'mon':m.group(2), 'day':m.group(3), 'hour':m.group(4), 'min':m.group(5), 'subject':m.group(6), 'body':m.group(7)}

    sk = processedEntries.keys()
    sk.sort()

    import pyLJxmlrpc

    lj = pyLJxmlrpc.pyLJxmlrpc()

    # Post each entry, oldest first, as a private, backdated entry tagged 'aml-raw'.
    for k in sk:
        lj.call_withParams_atURL_forUser_withPassword_('postevent',
            {'event':processedEntries[k]['body'],
             'lineendings':'unix',
             'subject':processedEntries[k]['subject'],
             'security':'private',
             'year':processedEntries[k]['year'],
             'mon':processedEntries[k]['mon'],
             'day':processedEntries[k]['day'],
             'hour':processedEntries[k]['hour'],
             'min':processedEntries[k]['min'],
             'props':{'opt_backdated':True,'taglist':'aml-raw'}},
            'http://www.livejournal.com/interface/xmlrpc/', 'USERNAME', 'PASSWORD')
        print "%s: %s" % (k, processedEntries[k]['subject'])
    
  4. At a command prompt (Mac: run Terminal), change to the directory in which you saved the two .py files and entries.html, and run
    python processEntries.py

    and watch it go. It only takes a few seconds to pull apart the HTML file, but reposting the entries takes time. The script prints the date/subject of each entry *after* attempting to post it, so any error message you see pertains to the date/subject printed immediately after that error.

Every entry from AML that didn’t have an empty body will be posted with its date and time preserved, set to private, and backdated. You will see error messages for any entries that were blank (since the AML tempsite strips out all HTML, that left me with some blank entries where meme/quiz results had been).
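
If you would rather skip the blank entries up front instead of letting the server reject them, you could filter them out of the sorted key list before the posting loop. Something along these lines should work (just a sketch; I haven’t run it this way, so treat it as untested):

    # Optional: drop entries whose bodies came through empty, instead of
    # letting the postevent call fail on them.
    sk = [k for k in sk if processedEntries[k]['body'].strip() != '']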

Dynamic URLs for XML-RPC Calls in AppleScript
http://2718.us/blog/2009/02/12/dynamic-urls-for-xml-rpc-calls-in-applescript/
Thu, 12 Feb 2009

I started working on asLJ after I came across this. One of the problems that I quickly ran into was that the URLs in the
tell application "<url>" to call xmlrpc ...

bits had to be hard-coded. That is, AppleScript didn’t like it when I tried to assemble the URL string on the fly. It took me a while to come up with a workaround; it presumably adds a little overhead to each call, but it doesn’t seem to make a noticeable difference. Here’s my generic handler for making XML-RPC calls to LJ-based servers:

-- make a LiveJournal-type XML-RPC call to serverString for the method methodName with the parameters in parameterArray
on callLJraw(serverString, methodName, parameterArray)
    run script "on run {paramArray}
                tell application \"http://" & serverString & "/interface/xmlrpc\" to call xmlrpc ¬
                    {method name:\"LJ.XMLRPC.\" & \"" & methodName & "\", parameters:{paramArray}}
            end run" with parameters {parameterArray}
    return result
end callLJraw
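
To use it, you pass the server’s hostname, the short method name, and the method’s parameter record; a call is shaped something like this (illustrative only, not tested exactly as written):

    set response to callLJraw("www.livejournal.com", "getchallenge", {})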

asLJ: a Mac OS X 10.5+ LiveJournal Client
http://2718.us/blog/2009/02/09/aslj/
Mon, 09 Feb 2009

asLJ is a new client for Macs running Leopard that easily handles multiple accounts on LiveJournal and other LJ-based sites and facilitates cross-posting across accounts. Release notes and download link are in [info]aslj_client. The community for users is [info]aslj_users.

(As it is very LJ-centric, most of the information about it will be over at LJ, in the two places linked above, but there is a page for it here, as well.)

Statistics on LiveJournal-based Sites v2.0
http://2718.us/blog/2008/10/22/statistics-on-livejournal-based-sites-v20/
Wed, 22 Oct 2008

The reworking of my site that shows comparative statistics on every site based on LiveJournal’s code is now live at a new URL: http://lj-stat.2718.us/. Moreover, there are now graphs of the data over time. The data is updated at noon and midnight, Central Time (U.S.).

One of the things that took the most work to get right was the thickness of the graph lines.  Because of the nature of the graphs, it was an absolute necessity that the lines be drawn with antialiasing enabled.  PHP’s interface to GD (or perhaps it’s GD itself?) ignores the line thickness setting when antialiasing is enabled.  The solution I eventually settled on is to, more or less, draw several one-pixel-wide lines next to and on top of one another to get the appearance of a thicker line.
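
Roughly, the idea looks like this (a sketch only, with illustrative names, not the exact code from the site):

<?php
// Fake a thick antialiased line by stacking several one-pixel lines.
// $im is a GD image resource; $color comes from imagecolorallocate().
function drawThickLineAA($im, $x1, $y1, $x2, $y2, $color, $thickness)
{
    imageantialias($im, true);   // one-pixel lines are antialiased...
    imagesetthickness($im, 1);   // ...while thicker lines would not be
    for ($i = 0; $i < $thickness; $i++) {
        // Offset each pass vertically; good enough for the mostly
        // horizontal lines in a time-series graph.
        $offset = $i - (int)($thickness / 2);
        imageline($im, $x1, $y1 + $offset, $x2, $y2 + $offset, $color);
    }
}
?>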

As an aside, I’m using the technique mentioned here for permanently redirecting the old URL to the new URL:

… if you actually moved something to a new location (forever) use:

<?php
 header("HTTP/1.1 301 Moved Permanently");
 header("Location: http://example.org/foo");
?>

An Overhaul of LJ-Stat
http://2718.us/blog/2008/10/12/an-overhaul-of-lj-stat/
Sun, 12 Oct 2008

I’m currently working on an overhaul of LJ-Stat.

It looks like there’s some issue with curl_multi_exec() in PHP when too many requests run at once: some requests fail strangely, which may account for the lack of data from several sites that are clearly not down and clearly provide stats.txt. My current workaround is to do the requests in smaller blocks, along the lines of the sketch below.
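
Something like this, where the URL list is chunked and each block gets its own curl_multi run (illustrative names only, not the actual LJ-Stat code):

<?php
// Fetch the stats.txt URLs a few at a time instead of handing curl_multi
// the entire list at once.
function fetchInBlocks(array $urls, $blockSize = 10)
{
    $results = array();
    foreach (array_chunk($urls, $blockSize, true) as $block) {
        $mh = curl_multi_init();
        $handles = array();
        foreach ($block as $site => $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 30);
            curl_multi_add_handle($mh, $ch);
            $handles[$site] = $ch;
        }
        // Run this block to completion before starting the next one.
        do {
            curl_multi_exec($mh, $running);
            if ($running > 0) {
                curl_multi_select($mh);   // wait for activity rather than spinning
            }
        } while ($running > 0);
        foreach ($handles as $site => $ch) {
            $results[$site] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);
    }
    return $results;
}
?>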

I’m also trying to provide more detail as to why there are no stats for the sites that don’t have them.

But the biggest development is that there will probably be graphs of the data over time. I say “probably” because, while the code is pretty much written, I’ve only been storing historical data for about a day so far (in the past, only the most recent data was kept), so it’s hard to tell whether the graphs will look okay with a lot of data and whether producing them will put a significant load on the server. The data will probably also update more regularly and more frequently, likely at noon and midnight CT.

Also, if anyone knows for sure whether Bloty, IziBlog, and/or LiveLogCity are still alive or definitively dead, I’d like to know. Oh, and CommieJournal seems to be looking at the possibility of moving to a different codebase, though I can’t for the life of me see why anyone would want to try to move thousands of accounts from the LJ codebase to something incompatible and with a different working paradigm.

Ads on the LJ-Stat page?
http://2718.us/blog/2008/04/30/ads-on-the-lj-stat-page/
Thu, 01 May 2008

I’m wondering if I’d gain anything from putting a small Google AdSense unit, or maybe an AdSense link unit, on the LJ-Stat page. And by “gain anything” I mean get a few cents to help pay my hosting bills. It could be relatively unobtrusive… It’s just that, thus far, I’ve avoided putting any ads on 2718.us. Well, that, and my best experience with AdSense so far has been monetizing the visits of people who ended up on my site by mistake, by showing them ads for what they were really looking for rather than what’s actually on my site.

LJ-clone News
http://2718.us/blog/2008/04/22/lj-clone-news/
Wed, 23 Apr 2008

Scribblit => Inksome (but scribblit.com still works… for now? until May 10 [updated based on comment below])

CommieJournal to close May 1 unless they raise the money to cover the $169/month hosting bill. (If they cover this month’s bill, will the date just become June 1? I don’t know.)

Limitations of lj-stat
http://2718.us/blog/2008/04/13/limitations-of-lj-stat/
Sun, 13 Apr 2008

To the best of my knowledge and research, my LJ-code-base Site Statistics page (lj-stat) has the most comprehensive list of sites running off of LiveJournal’s codebase (if you know of any that I’ve missed, please let me know). The main point, though, is the comparative statistics, and this is where things get strange. LJ and most of the other sites provide a pretty statistics page at /stats.bml, and in most (or all?) instances stats.bml says at the top (this is from LJ itself):

Raw data can be picked up here.

where “here” links to /stats/stats.txt. On at least one site, stats.bml has this text, but /stats/stats.txt returns a 404. On at least one other site, both stats.bml and /stats/stats.txt return a 404. Since it looks to me like the whole point of providing stats.txt was to offer a more machine-readable set of stats that doesn’t require loading a full web page and screen-scraping it, I have no intention of trying to screen-scrape the info I want.

Now, to make things even stranger, some sites are missing what I’d call “key” stats from their stats.txt files.  In particular, the one I care most about is the “active in some way in the past 30 days” measure since I think that’s the best measure of the vitality of a site (well, either that, or what portion of the total userbase it represents).  Stranger still is that some sites report numbers in stats.txt that not only don’t match stats.bml, but make no sense whatsoever (DeadJournal perpetually reports only 10 accounts updating in the past 30 days, even though stats.bml has more sensible numbers).

Unrelated to the content of stats.txt is the “Speed Index” column, which is based on the transfer rate reported by libcurl when retrieving stats.txt; each site’s speed index is given as a percentage of the fastest site’s transfer rate (see the sketch below). What I don’t quite understand is how InsaneJournal is always at least twice as speedy as any other site, often at least 4x or 6x the speed. It actually made me wonder if my server and theirs were somehow in the same datacenter or something, but there are at least a dozen hops between us (which is more than from my server to some other LJ-based sites), so maybe it does have something to do with the servers themselves and not just network conditions.
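
The calculation amounts to something like this (an illustration with made-up names, not the actual lj-stat code):

<?php
// Given completed cURL handles keyed by site, report each site's download
// speed for stats.txt as a percentage of the fastest site's speed.
function speedIndexes(array $handles)
{
    $speeds = array();
    foreach ($handles as $site => $ch) {
        // bytes per second for the completed transfer, as reported by libcurl
        $speeds[$site] = curl_getinfo($ch, CURLINFO_SPEED_DOWNLOAD);
    }
    $fastest = max($speeds);
    $indexes = array();
    foreach ($speeds as $site => $speed) {
        $indexes[$site] = ($fastest > 0) ? round(100 * $speed / $fastest) : 0;
    }
    return $indexes;
}
?>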

Please let me know if you have any suggestions for enhancements to lj-stat. Also, feel free to try to convince the sites that don’t provide stats.txt to start providing it, and to nudge the sites whose numbers are clearly wrong into fixing them.

Statistics on LJ and LJ-Clone Sites
http://2718.us/blog/2008/04/06/statistics-on-lj-and-lj-clone-sites/
Mon, 07 Apr 2008

http://2718.us/lj-stat/ is a page giving some comparative statistics on various LJ-code-based sites.

The underlying data is updated approximately daily. All numbers are based on what is supposed to be the “raw” data at /stats/stats.txt, even though on some or all sites there are significant discrepancies between the numbers reported in /stats/stats.txt and those shown at /stats.bml.

If you know of other LJ-code-based sites that you’d like to see added, please comment with the name/URL. Also comment if you have any suggestions as to design, features, or other statistics you’d like to see.
