World's largest web comic panel
Alan Johnson
alan at datdec.com
Fri Sep 21 09:52:19 EDT 2012
On Fri, Sep 21, 2012 at 1:05 AM, Joshua Judson Rosen
<rozzin at geekspace.com> wrote:
> Alan Johnson <alan at datdec.com> writes:
> >
> > On Thu, Sep 20, 2012 at 12:33 PM, Ben Scott <dragonhawk at gmail.com> wrote:
> >
> > > On Thu, Sep 20, 2012 at 11:52 AM, Joshua Judson Rosen
> > > <rozzin at geekspace.com> wrote:
> > > > ... thwarted by the unholy amount of hole-iness in the map:
> > > > you can't just start at the center, walk until you hit `the end'
> > > > of the world ...
> >
> > Personally, I don't think something
> > that tries to walk to the end is all that brutish. =)
>
> Well, it's at least barbarian ;)
>
The code or the author? ;-)
> > > I mean, I get that not all the tile locations actually
> > > have image files there, but presumably you just get the 404 error and
> > > move on.
> > >
> > > wget http://imgs.xkcd.com/clickdrag/{1..256}{n,s}{1..256}{e,w}.png
> > >
> > > Granted, this would hammer the server with lots of requests for
> > > non-existent files. And I imagine it would take some time to run
> > > through 256*256*2*2 HTTP GET requests. And maybe hit command line
> > > length limits. So polite or efficient, it's not. But if you want
> > > brute force and ignorance.... :)
> > >
> > I got around the command line issue by dumping a list of generated
> > URLs to a file and then feeding that file to wget:
> >
> > ajohnson at helium:~/tmp/xkcdclickdrag$ for v in n s; do for h in e w; do for x in `seq 100`; do for y in `seq 1 100`; do echo http://imgs.xkcd.com/clickdrag/$y$v$x$h.png; done; done; done; done > urls
> > ajohnson at helium:~/tmp/xkcdclickdrag$ wget -qi urls
>
> xargs FTW. Though, actually..., you should be able to just pipe
> stdout from that loop directly into "wget -qi", shouldn't you?
>
Yes, "wget -qi -", but I wanted to confirm the output first and have the
chance to groom it if needed.
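
For the archives, the piped version of that same loop would look
something like this (an untested sketch; same URLs as before, it just
skips the intermediate file and feeds wget's -i from stdin):

    for v in n s; do for h in e w; do
      for x in `seq 100`; do for y in `seq 1 100`; do
        echo "http://imgs.xkcd.com/clickdrag/$y$v$x$h.png"
      done; done
    done; done | wget -qi -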
>
> But, if you want to be more *magically bruticious*, try parallel:
>
> http://www.gnu.org/software/parallel/
>
> You can also use cURL instead of wget, and just use cURL's range/set
> syntax:
>
> curl -f -JO 'http://imgs.xkcd.com/clickdrag/[1-100]{n,s}[1-100]{e,w}.png'
>
>
Ah, curl. If only I had discovered that before wget...
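
For anyone following along at home: if you do want to try the parallel
route, something along these lines ought to work with the "urls" file
from earlier (a sketch, assuming GNU parallel is installed; the -j value
is just an example cap on simultaneous fetches so the server isn't
hammered too hard):

    # fetch at most 8 tiles at a time; be kind to imgs.xkcd.com
    parallel -j 8 wget -q {} < urls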