@tavurth
Created June 7, 2016 18:44
Coding test for sms-online.com
#! /usr/bin/perl
=begin comment

Construct a console-based application using the AnyEvent framework which takes
a list of URLs on STDIN and calls them all at once in asynchronous mode.
Display the responses as they are received and, after all calls have completed,
statistics for each call.

=end comment
=cut
use strict;
use warnings;
use AnyEvent;
use AnyEvent::HTTP;
use Time::HiRes qw(time);
# Make sure that we have a say function available
sub say { print @_, "\n" }
# Trim functionality (From http://perlmaven.com/trim)
sub trim { my $s = shift; $s =~ s/^\s+|\s+$//g; return $s };
# Is the passed variable a non-empty string? (bitwise & on a string yields a
# string of NUL bytes, which is true; on a number it yields 0, which is false)
sub is_string { my $s = shift; return $s & ~$s; }
# Print something with nice titles
sub title { say '-'x88; say ' '.join(' ', @_); say '-'x88; }
# Some user information
title 'Please enter the URLs you would like to fetch... Type STOP to finish collecting URLs.';

# Keep our URLs
my @urlsToFetch = ();
# Keep grabbing URLs from STDIN until the user decides to stop (or EOF)
while (1) {
    # Print the console prompt
    print " > ";

    # Grab user input, stopping on EOF as well
    my $input = <STDIN>;
    last unless defined $input;

    $input = trim($input);

    # Quit input mode if the user types STOP (exact match, not substring)
    if (uc($input) eq 'STOP') {
        last;
    }
    # Make sure it's a URL of some length
    elsif (length $input) {
        # Add the url to the list
        push(@urlsToFetch, $input);
    }
}
# Create the condition variable (recv() will only return once every
# outstanding AnyEvent request has completed)
my $counter = AnyEvent->condvar;
my %stats   = ();
# Do for every url
foreach my $url (@urlsToFetch) {
    # Skip this URL if it's not a non-empty string
    if (! is_string $url) {
        next;
    }

    # Prepend http:// if no scheme was given
    if ($url !~ m{^https?://}) {
        $url = 'http://' . $url;
    }

    # Set up the stats hash for this $url
    $stats{$url} = {
        'length'     => 0,
        'endTime'    => -1,
        'errorCodes' => -1,
        'url'        => $url,
        'startTime'  => time,
    };

    # Increment the counter to signify the beginning of a new request
    $counter->begin;

    # Grab the data from the url
    http_get $url, sub {
        # Grab the body and headers from the callback
        my ($html, $headers) = @_;
        my $error = 0;

        # Make sure we've received some valid data (a body and a 2xx status)
        if (! defined $html || ! length $html || $headers->{Status} !~ /^2/) {
            $html = '' unless defined $html;
            $error += 1;
        }

        # Print some nice output
        title(length $html, "received from $url");

        # Update the stats for the URL
        my $stat = $stats{$url};
        $$stat{'endTime'}    = time;
        $$stat{'errorCodes'} = $error;
        $$stat{'length'}     = length $html;
        $$stat{'totalTime'}  = $$stat{'endTime'} - $$stat{'startTime'};

        # Make sure to tell the counter we've finished this request
        $counter->end;
    };
}
# Wait for all of the AnyEvent requests to complete
$counter->recv();

# Walk the hash sorted by response time
for my $site (sort { $stats{$a}{'totalTime'} <=> $stats{$b}{'totalTime'} } keys %stats) {
    # Grab a reference to the stat in question
    my $stat = $stats{$site};

    # Start a string buffer
    my $string = 'Site: ' . $$stat{'url'};

    # Build up the string buffer with some data
    $string .= "\n\t" . 'Was loaded in ' . sprintf('%.3f', $$stat{'totalTime'}) . 's';
    $string .= "\n\t" . 'Gave us ' . $$stat{'length'} . ' bytes of data';

    # Add an error note if one was recorded
    if ($$stat{'errorCodes'} != 0) {
        $string .= "\n\t" . 'Also gave us an error';
    }

    # Print the report to STDOUT
    title $string;
}
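
# Example usage (a sketch, not part of the original gist): the URL list and the
# STOP sentinel can be piped in on STDIN rather than typed interactively.
# The file name "fetch_urls.pl" below is only an assumed name for this script.
#
#   printf 'example.com\nexample.org\nSTOP\n' | perl fetch_urls.pl
#
# Each response size is reported as it arrives, and the per-URL statistics
# (load time, bytes received, errors) are printed sorted by response time
# once every request has completed.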