User Controls


Thanked Posts by Lanny

  1. Lanny Bird of Courage
    Hehe, this is kinda fun

    iss=> SELECT username, COUNT(*) AS autisms FROM "ISS_post" JOIN "ISS_poster" AS p ON p.id = author_id WHERE content ILIKE '%gont%' GROUP BY username ORDER BY autisms DESC LIMIT 30;
            username        | autisms
    ------------------------+---------
     SCronaldo_J_Trump      |     139
     Bill Krozby            |      91
     infinityshock          |      14
     Darth Beaver           |       9
     Totse 2001             |       9
     Sophie                 |       6
     -SpectraL              |       5
     Dargo                  |       4
     greenplastic           |       3
     RestStop               |       3
     mmQ                    |       3
     Oasis                  |       3
     Lanny                  |       3
     NARCassist             |       2
     Phoenix                |       2
     spacepantz             |       1
     Merlin                 |       1
     benny vader            |       1
     THE Bearded Homosexual |       1
     Rhymin Hymen           |       1
     Bill Krozbythecatpart2 |       1
     AltarEgo               |       1
     Ajax                   |       1
     Vizier                 |       1
     Coathangers Suck -WS   |       1
     snab_snib              |       1
     cerakote               |       1
     bling bling            |       1
     Dionysus               |       1
     reject                 |       1
    (30 rows)
    The following users say it would be alright if the author of this post didn't die in a fire!
  2. Lanny Bird of Courage
    Lol, nah, Jenkins isn't my project. They picked the name because it's like a "butler name", similar to Jeeves. We use it at work, I kinda chuckle at the emails of people being like "jenkins is fucking slow" and "jenkins isn't working again".

    Originally posted by SBTlauien Looks interesting but I personally don't think I would need it for anything I do.

    So what exactly is an automation server? I looked it up and it appears to be something that makes it easier for a programmer to use his/her program with other applications. Is that correct?

    So it's basically just a build server, but it's expanded to do a lot of stuff other than builds so they call this kind of thing an "automation server". A really common workflow is to periodically grab the latest code on a project, build it, run a test suite, deploy it to some testing environment, and notify people if any step in that process fails. That may sound pretty trivial if you haven't worked on massive projects before, but deployment-ready builds can take a long time to run, on the order of hours isn't uncommon, involving dozens of discrete steps. Test suites can be comparably slow. Deployments can likewise have a lot of steps and be pretty complex if you're doing things like signing/verification or DNS sorcery or any of a dozen other things that fall under the heading of "deployment".

    There's an argument to be made that things being complex enough to justify this kind of software existing is a pathology, but it does make life easier. I do a similar kind of thing for ISS, automatic build/test you can check out here if you're interested for some reason. Deployment is still an SSH script though.
    The following users say it would be alright if the author of this post didn't die in a fire!
  3. Lanny Bird of Courage
    Also yee, drinking in the shower really makes you feel like an alcoholic but it's also strangely satisfying at the same time. Beer is best because it's the most tolerant to getting water in it. Wine also works, reds being better than whites. I wonder if anyone's ever given any thought to soap/drink pairings.
    The following users say it would be alright if the author of this post didn't die in a fire!
  4. Lanny Bird of Courage
    Originally posted by newfag …a good whiskey…

    Originally posted by Shitfucker crown royal

    bro...
    The following users say it would be alright if the author of this post didn't die in a fire!
  5. Lanny Bird of Courage
    ZAINT ZETAX
    The following users say it would be alright if the author of this post didn't die in a fire!
  6. Lanny Bird of Courage
    "2001" means "schizophrenia" in kabbalah
    The following users say it would be alright if the author of this post didn't die in a fire!
  7. Lanny Bird of Courage
    Speaking of mail, my spinner's coming tomorrow. I'm hyped. Going to work from home so I can play with it as soon as it arrives.

    Also bank holidays/hours are pretty comic, but I'm not complaining, I get a 5 day weekend as a result lol
    The following users say it would be alright if the author of this post didn't die in a fire!
  8. Lanny Bird of Courage
    I think I figured out why I like spinners
    The following users say it would be alright if the author of this post didn't die in a fire!
  9. Lanny Bird of Courage
    Originally posted by Captain Falcon No not u 2 :(

    I've never actually seen one used before since I don't associate with children but I mean why are people bugged by these things? Do you know how many dumb pointless things kids get into and forget about over the course of a year? Why is this one particularly upsetting?
    The following users say it would be alright if the author of this post didn't die in a fire!
  10. Lanny Bird of Courage
    Originally posted by RestStop I was going to make a "we go swimming here in that weather" joke but shit it's literally 10 degrees colder in San Fran than it is here in Ohio..weird..I guess right?

    Cool summers and it doesn't really get hot until like August. The winters are a lot more mild than Ohio
    The following users say it would be alright if the author of this post didn't die in a fire!
  11. Lanny Bird of Courage
    Originally posted by Sophie I kind of got it was a joke, silly.

    Empirically, the comic element of communism jokes, at least as delivered by me, isn't immediately obvious.



    Originally posted by RestStop How is Larry King still alive? I'm watching him talk to some soulless bottom feeding "Omega XL" rep. as we speak the dude has to be Illuminati and or reptilian at this point because no way is he still naturally alive.

    Bro, I was saying the same thing like five years ago, I assumed he had died since then. What the FUCK is that nigga?
    The following users say it would be alright if the author of this post didn't die in a fire!
  12. Lanny Bird of Courage
    Originally posted by Malice Post another pic. Prove us wrong and demonstrate that moe really can exist in the 3D world. The best outfit and pose you can think of, I know you have it in you.

    She's enjoying playing hard to get
    The following users say it would be alright if the author of this post didn't die in a fire!
  13. Lanny Bird of Courage
    you know it's kinda impressive that Bill Krozby can keep on coming up with stupider things to say. You would think at some point he'd just hit the max, but nope, there's always some new height of idiocy he climbs to. He's like an olympic level dumbass.
    The following users say it would be alright if the author of this post didn't die in a fire!
  14. Lanny Bird of Courage
    Originally posted by Phoenix

    ayyy, nice bangs bae
    The following users say it would be alright if the author of this post didn't die in a fire!
  15. Lanny Bird of Courage
    Somebody asked me for some help with a scraper for mangareader.net recently and I ended up writing a fair amount on it, most of it is applicable to web scraping in general. Thought it might have some use to a wider audience so here's a lightly edited version:

    For web scraping kind of things you want to use node which can run outside of a browser context, the browser security model makes running scraping code in-browser a significant hurdle. You'll need node and npm installed.

    On a very high level, web scraping consists of two parts: enumeration and retrieval. Enumeration is finding the list of all the things you want to scrape. Retrieval is fetching the actual content you want (pages of manga in this case) once they've been identified.

    If you want to do a full site scrape enumeration looks kind of hierarchical: first you'll want to collect all the series hosted on a site, then all the chapters belonging to each series, then each page belonging to each chapter. For the sake of simplicity I suggest starting with scraping all the pages from just one chapter. Once you have code that works on one chapter you can start to generalize, turn it into a parameterized process that fetches all the pages of some chapter and move onto enumerating chapters of a series. You can work bottom-up in this fashion. In general this is a pretty good strategy for scraping.
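The bottom-up structure above can be sketched as nested enumeration steps. Everything here is hypothetical scaffolding for illustration (the function names and stub enumerators are made up); the real page-level enumeration for mangareader comes later in the post:

```javascript
// Bottom-up skeleton: each level maps an enumerator over the level below it.
// getPageList / fetchPage / getChapterList are placeholder parameters meant
// to be filled in with real scraping logic, one level at a time.
function scrapeChapter(getPageList, fetchPage, chapterUrl) {
  return getPageList(chapterUrl).map(fetchPage);
}

function scrapeSeries(getChapterList, scrapeOneChapter, seriesUrl) {
  return getChapterList(seriesUrl).map(scrapeOneChapter);
}

// Stub enumerators, just to show the shape of the recursion:
var fetched = scrapeChapter(
  function(chapterUrl) { return [chapterUrl + '/1', chapterUrl + '/2']; },
  function(pageUrl) { return 'fetched ' + pageUrl; },
  '/bitter-virgin/1'
);
console.log(fetched); // [ 'fetched /bitter-virgin/1/1', 'fetched /bitter-virgin/1/2' ]
```

Once the innermost level works against the real site, you swap the stubs out one at a time and the outer structure stays the same.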

    So to look at something a little more concrete let's take a look at scraping all the pages from a particular chapter on mangareader. Let's look at the markup for a page like this one: http://www.mangareader.net/bitter-virgin/1

    We're looking for something that will list all the pages in the chapter. The jump-to-page thing looks promising:

    <select id="pageMenu" name="pageMenu"><option value="/bitter-virgin/1" selected="selected">1</option>
    <option value="/bitter-virgin/1/2">2</option>
    <option value="/bitter-virgin/1/3">3</option>
    ...
    </select>


    Awesome, it looks like there's an element that has a bunch of sub elements with `value` attributes that point to each page in the chapter. Now we need to write some code to grab those. Here's what I came up with on the fly, no error handling or anything but it's simple. It depends on two libraries, "request" and "jsdom" for making requests and parsing responses respectively, you'll need to install these on your system using npm if you haven't before:

    #!/usr/bin/env node

    var jsdom = require('jsdom');
    var request = require('request');

    var firstPage = 'http://www.mangareader.net/bitter-virgin/1';

    // Make a request to the firstPage url which we know contains urls to each
    // page of the chapter.
    request(firstPage, function(err, response, body) {
      // `body` is just a string, here we parse the content into something we
      // can work with.
      var content = new jsdom.JSDOM(body);

      // Use a CSS selector to get the elements we're interested in. The selector is
      // the '#pageMenu option' part. It says "return the list of all option elements
      // which are descendants of the element with pageMenu as its id". We know
      // each of those elements has a value attribute we're interested in.
      var opts = content.window.document.querySelectorAll('#pageMenu option');

      // Iterate over all the option elements we just collected and print their
      // value attribute.
      opts.forEach(function(opt) {
        console.log(opt.value);
      });
    });


    This works for me, it outputs a list of page urls. So there's enumeration done, now we need to write fetching logic. Each page has an element with "img" as its id, which points to the image of the page. So we need to fetch the viewer page to get that and then fetch the image itself and save it. Here's what that looks like:

    #!/usr/bin/env node

    var jsdom = require('jsdom');
    var request = require('request');
    var fs = require('fs');

    // Visit a page url and download and save the page image
    function fetchPage(pageUrl, filename, done) {
      // Grab the page url. Note there's a difference between the asset at the page
      // url (which contains html for ads and navigation and such) and the actual
      // image which we want to save.
      request(pageUrl, function(err, response, body) {
        // Parse content
        var content = new jsdom.JSDOM(body);
        // Identify the img tag pointing to the actual manga page image
        var img = content.window.document.querySelector('#img');

        // Make another request to get the actual image data
        request({
          url: img.src,
          // Must specify null encoding so the response doesn't get interpreted
          // as text
          encoding: null
        }, function(err, response, body) {
          // Write the image data to disk.
          fs.writeFileSync(filename, body);

          // Call done so we can either fetch another page or terminate the program.
          done();
        });
      });
    }

    // Fetch each image from a list of pages and save them to disk
    function fetchPageList(pageList, timeout) {
      var idx = 0;

      // This recursive style may look strange, you don't _really_ need to worry
      // about it, but it's important because of the async nature of HTTP requests.
      // If we did a simple for loop here every request would be fired in parallel,
      // so instead we'll process one page at a time, starting the next page's fetch
      // slightly after the previous one finishes.
      function fetchNext() {
        // Wait some amount of time between making each request. We could make
        // all these requests in parallel or one right after the other but aside
        // from being unkind to the host, many websites will refuse to serve requests
        // if we make too many at once or too close to each other.
        setTimeout(function() {
          // Fetch the actual page
          fetchPage(pageList[idx], 'bitter-virgin-' + idx + '.jpg', function() {
            // Fetch complete. Move onto the next item if there is one.
            idx++;
            if (idx < pageList.length) {
              fetchNext();
            } else {
              console.log('Done!');
            }
          });
        }, timeout);
      }

      fetchNext();
    }

    var firstPage = 'http://www.mangareader.net/bitter-virgin/1';

    // Make a request to the firstPage url which we know contains urls to each
    // page of the chapter.
    request(firstPage, function(err, response, body) {
      // `body` is just a string, here we parse the content into something we
      // can work with.
      var content = new jsdom.JSDOM(body);

      // Use a CSS selector to get the elements we're interested in. The selector is
      // the '#pageMenu option' part. It says "return the list of all option elements
      // which are descendants of the element with pageMenu as its id". We know
      // each of those elements has a value attribute we're interested in.
      var opts = content.window.document.querySelectorAll('#pageMenu option');

      var pageList = [];
      opts.forEach(function(opt) {
        pageList.push('http://www.mangareader.net' + opt.value);
      });

      fetchPageList(pageList, 200);
    });


    As it stands fetchPage is parameterized; that is, it's equipped to fetch any page from any manga on mangareader. `fetchPageList` and the logic for fetching a chapter assume one particular series, however. To do a full scrape of the site they need to be generalized so that another process can enumerate the series and chapters and execute the generalized version of this process for each. That generalization is left as an exercise for the reader.
    The following users say it would be alright if the author of this post didn't die in a fire!
  16. Lanny Bird of Courage
    Originally posted by Coathangers Suck -WS so smart

    I am extremely intelligent, thank you for recognizing it
    The following users say it would be alright if the author of this post didn't die in a fire!
  17. Lanny Bird of Courage
    Huh, embedding with timestamps usually works but I hadn't seen the duration notation before. Should work now:

    The following users say it would be alright if the author of this post didn't die in a fire!
  18. Lanny Bird of Courage
    I don't mean to be a jerk, but just based on your posts in the past it seems like you're willing to do what's necessary to get ahead as well no? I think I remember you posting about stealing from employers and gaming unemployment benefits. Do those not fall under morally unacceptable for you or am I remembering wrong?
    The following users say it would be alright if the author of this post didn't die in a fire!
  19. Lanny Bird of Courage
    It's 'aight, been kinda down for the last year although not for any real reason. My life's quite comfortable but that kinda makes me feel stagnant, bored. It's happened before, I know what it takes to get out of a slump, it just takes a lot of work and empty hedonism is much easier.
    The following users say it would be alright if the author of this post didn't die in a fire!
  20. Lanny Bird of Courage
    Originally posted by greenplastic it's right in your original post you clown

    where?
    The following users say it would be alright if the author of this post didn't die in a fire!