Recipe 20.8. Finding Fresh Links (Perl Cookbook)

20.8. Finding Fresh Links

Problem

Given a list of URLs, you want to determine which have been most recently modified.

Solution

The program in Example 20.6 reads URLs from standard input, sorts them by last-modification date, and prints them back to standard output with those dates prepended.

Example 20.6: surl

#!/usr/bin/perl -w
# surl - sort URLs by their last modification date

use LWP::UserAgent;
use HTTP::Request;
use URI::URL qw(url);

my($url, %Date);
my $ua = LWP::UserAgent->new();

while ( $url = url(scalar <>) ) {                   # one URL per line from <ARGV>
    my $ans;
    next unless $url->scheme =~ /^(file|https?)$/;  # only file, http, and https
    $ans = $ua->request(HTTP::Request->new("HEAD", $url));
    if ($ans->is_success) {
        $Date{$url} = $ans->last_modified || 0;  # unknown
    } else {
        print STDERR "$url: Error [", $ans->code, "] ", $ans->message, "!\n";
    }
}

foreach $url ( sort { $Date{$b} <=> $Date{$a} } keys %Date ) {  # newest first
    printf "%-25s %s\n", $Date{$url} ? (scalar localtime $Date{$url})
                                     : "<NONE SPECIFIED>", $url;
}

Discussion

The surl script works more like a traditional filter program. It reads one URL per line from standard input. (Actually, it reads from <ARGV>, which defaults to STDIN if @ARGV is empty.) The last-modified date of each URL is fetched using a HEAD request and stored in a hash, with the URL as the key. Then a simple sort by value is run on the hash to reorder the URLs by date. On output, the internal epoch date is converted to human-readable form with localtime.
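
The one idiom in the output stage worth isolating is that sort by value: the hash's keys are sorted by comparing their values numerically, with $b before $a to put the newest dates first. Here is a minimal sketch of the idiom on its own, using made-up URLs and epoch times purely for illustration:

my %Date = (
    "http://www.example.com/a" => 893050562,    # hypothetical epoch seconds
    "http://www.example.com/b" => 877797692,
);

# Compare the values (dates), not the keys (URLs)
foreach my $url ( sort { $Date{$b} <=> $Date{$a} } keys %Date ) {
    print scalar localtime($Date{$url}), "  $url\n";
}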

Here's an example that uses the xurl program from the earlier recipe to extract the URLs, then pipes its output into surl:

% xurl http://www.perl.com/ | surl | head
Mon Apr 20 06:16:02 1998  http://electriclichen.com/linux/srom.html
Fri Apr 17 13:38:51 1998  http://www.oreilly.com/
Fri Mar 13 12:16:47 1998  http://www2.binevolve.com/
Sun Mar  8 21:01:27 1998  http://www.perl.org/
Tue Nov 18 13:41:32 1997  http://www.perl.com/universal/header.map
Wed Oct  1 12:55:13 1997  http://www.songline.com/
Sun Aug 17 21:43:51 1997  http://www.perl.com/graphics/perlhome_header.jpg
Sun Aug 17 21:43:47 1997  http://www.perl.com/graphics/perl_id_313c.gif
Sun Aug 17 21:43:46 1997  http://www.perl.com/graphics/ora_logo.gif
Sun Aug 17 21:43:44 1997  http://www.perl.com/graphics/header-nav.gif

Having a variety of small programs that each do one thing and that can be combined into more powerful constructs is the hallmark of good programming. You could even argue that xurl should work on files, and that some other program should actually fetch the URL's contents over the Web to feed into xurl, churl, or surl. That program would probably be called gurl, except that a program by that name already exists: the LWP module suite has a program called lwp-request with aliases HEAD, GET, and POST to run those operations in shell scripts.
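
For instance, assuming the LWP suite's command-line tools are installed on your system, the HEAD alias lets you inspect a single URL's headers (including the Last-Modified line that surl relies on) straight from the shell:

% HEAD http://www.perl.com/
% lwp-request -m HEAD http://www.perl.com/    # the same request, without the alias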

See Also

The documentation for the CPAN modules LWP::UserAgent, HTTP::Request, and URI::URL; Recipe 20.7


Copyright © 2001 O'Reilly & Associates. All rights reserved.