Fun with Digg’s API

Digg’s API was released a while ago, and since then, various Flash projects have surfaced, probably due to that pesky contest. But I don’t have any Flash experience, so PHP, here we come.

We’re going to make various tag clouds, a Reddit clone, a Reddit/Better RSS feed, and a live diggs app.

Update – 05/09/07 – 3 PM – Stupid host seems to be having database issues. Great. I wish I could afford a dedicated server, or I guess a VPS, so I don’t have to put up with this crap.

Update – 05/09/07 – 6 PM – It seems it was a combination of their DB sucking and the fact that they got rid of some essential PEAR packages. WTF?

Update – 05/12/07 – 11 AM – Yup. They cut off my MySQL abilities. I can’t connect to any of my databases on xrho.com. This site is fine however. I guess inserting 100+ entries a minute might have annoyed them?

Update – 05/17/07 – 7 PM – The MySQL abilities are still cut off. Fucking netfirms. Also, it seems one of my updates cut off the bottom of the post, which happened to include the download links. You can download a ZIP or Gzip of the files. Since I’m having problems with that, feel free to download them and run them on your own site. Let me know if you do so I can add the link.

Demos of most of the scripts are available here.

Due to the length of this post, I’m splitting it. I hate making people click through, but it’s long enough to warrant it.

Part A: The Setup

First, we need the PEAR package, from http://bugs.joestump.net/code/Services_Digg/Services_Digg-0.0.2.tgz. You can install it with the pear install command, but I just copied the folder into my directory.
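If you go the PEAR route, it’s the usual one-liner (run from wherever you saved the tarball):

pear install Services_Digg-0.0.2.tgz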

We’re going to be using a MySQL database, so create a new one. Here is the table structure I used:

-- 
-- Table structure for table `diggslive`
-- 

CREATE TABLE `diggslive` (
  `digging_id` int(10) unsigned NOT NULL auto_increment,
  `story_id` int(10) unsigned NOT NULL,
  `id` int(10) unsigned NOT NULL,
  `username` varchar(250) collate latin1_general_ci NOT NULL,
  `time` int(10) NOT NULL,
  `status` varchar(45) collate latin1_general_ci NOT NULL,
  PRIMARY KEY  (`digging_id`),
  UNIQUE KEY `id` (`id`),
  KEY `username` (`username`),
  KEY `story_id` (`story_id`)
) ENGINE=MyISAM  DEFAULT CHARSET=latin1 COLLATE=latin1_general_ci AUTO_INCREMENT=137897 ;

-- --------------------------------------------------------

-- 
-- Table structure for table `diggstories`
-- 

CREATE TABLE `diggstories` (
  `digg_id` int(10) unsigned NOT NULL auto_increment,
  `title` text character set latin1 collate latin1_general_ci NOT NULL,
  `description` text character set latin1 collate latin1_general_ci NOT NULL,
  `diggs` int(5) NOT NULL,
  `comments` int(4) NOT NULL,
  `id` int(10) unsigned NOT NULL,
  `link` varchar(1000) character set latin1 collate latin1_general_ci NOT NULL,
  `submitted` int(10) NOT NULL,
  `promoted` int(10) NOT NULL,
  `href` varchar(1000) character set latin1 collate latin1_general_ci NOT NULL,
  `status` varchar(45) character set latin1 collate latin1_general_ci NOT NULL,
  `user_id` int(10) unsigned NOT NULL,
  `username` varchar(250) character set latin1 collate latin1_general_ci NOT NULL,
  `icon` text character set latin1 collate latin1_general_ci NOT NULL,
  `registered` int(10) NOT NULL,
  `profileviews` int(10) unsigned NOT NULL,
  `topic_long` varchar(100) character set latin1 collate latin1_general_ci NOT NULL,
  `topic_short` varchar(100) character set latin1 collate latin1_general_ci NOT NULL,
  `container_long` varchar(100) character set latin1 collate latin1_general_ci NOT NULL,
  `container_short` varchar(100) character set latin1 collate latin1_general_ci NOT NULL,
  `host` varchar(250) collate utf8_bin default NULL,
  PRIMARY KEY  (`digg_id`),
  UNIQUE KEY `href` (`href`),
  FULLTEXT KEY `title` (`title`),
  FULLTEXT KEY `description` (`description`),
  FULLTEXT KEY `title_2` (`title`,`description`),
  FULLTEXT KEY `link` (`link`)
) ENGINE=MyISAM  DEFAULT CHARSET=utf8 COLLATE=utf8_bin AUTO_INCREMENT=3818 ;

-- --------------------------------------------------------

-- 
-- Table structure for table `diggusers`
-- 

CREATE TABLE `diggusers` (
  `user_id` int(10) unsigned NOT NULL auto_increment,
  `username` varchar(250) collate latin1_general_ci NOT NULL,
  `icon` text collate latin1_general_ci,
  `registered` int(10) NOT NULL,
  `profileviews` int(10) unsigned NOT NULL default '0',
  PRIMARY KEY  (`user_id`),
  UNIQUE KEY `username` (`username`),
  KEY `username_2` (`username`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1 COLLATE=latin1_general_ci AUTO_INCREMENT=2550 ;

There are 3 tables: `diggstories` stores all the info the API gives us about the popular stories, `diggusers` stores submitter info, and `diggslive` stores individual diggs. `diggstories` has 4 FULLTEXT indices to enable a better search, as shown later.

Now for the config file. Edit as necessary for your setup:
config.php:


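The config just needs a database connection plus the sqlquote() escaping helper the rest of the scripts lean on. A minimal sketch (credentials and database name are placeholders):

<?php
// Minimal sketch of config.php: connect to MySQL and define the
// sqlquote() helper used by every script below.
mysql_connect('localhost', 'username', 'password') or die(mysql_error());
mysql_select_db('digg') or die(mysql_error());

function sqlquote($value) {
	return mysql_real_escape_string($value);
}
?>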
Here are some functions that we’ll use later.
functions.php:

","'","""),array("<",">","'","""),$content);
}

function timeparse($time) {
	$since = time() - $time;
	if($since < 60) {
		return ($since == 1) ? "$since second ago" : "$since seconds ago";
	}
	elseif($since < 3600) {
		$since = floor($since/60);
		return ($since == 1) ? "$since minute ago" : "$since minutes ago";
	}
	elseif($since < 86400) {
		$since = floor($since/3600);
		return ($since == 1) ? "$since hour ago" : "$since hours ago";
	}
	else {
		$since = floor($since/86400);
		return ($since == 1) ? "$since day ago" : "$since days ago";
	}
}


function toshort($text) {
	return strtolower(str_replace(" ","_",$text));
}


function host($url) {
	$exploded = explode("/",$url);
	return str_replace("www.","",$exploded[2]);
}

?>
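For example, given a full story URL, host() hands back just the bare domain:

<?php
require_once 'functions.php';
echo host("http://www.example.com/some/story"); // prints "example.com"
?>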

Part B: Getting data

Getting the stories

First off, let’s populate our database with stories. This grabs as many stories as possible and inserts them into our `diggstories` table, along with whatever user info it finds.

insert.php:

<?php
ini_set('user_agent', 'Trendds/1.0');
ini_set('max_execution_time', 3600);

// adjust the include paths to wherever you put the package
require_once 'config.php';
require_once 'functions.php';
require_once 'Services/Digg.php';

Services_Digg::$appKey = 'http://www.ja.meswilson.com/blog/Services_Digg_Proxy.php';
Services_Digg::$uri = 'http://services.digg.com';

$offset = 0;
$count = 100; // replaced with the real total after the first request
while($offset < $count) {
	$params = array('count' => 100,'offset' => $offset);
	$request = Services_Digg::factory('Stories')->popular($params);
	foreach ($request->stories as $story) {
		
		$userinfo = mysql_query("SELECT `user_id`, `profileviews` FROM `diggusers` WHERE `username` = '".sqlquote($story->user->name)."' LIMIT 1");
		if(mysql_numrows($userinfo) != 1) {
			mysql_query("INSERT INTO `diggusers`
				( `user_id` , `username`, `icon`, `registered`, `profileviews` )
				VALUES ( 
					NULL, 
					'".sqlquote($story->user->name)."', 
					'".sqlquote($story->user->icon)."', 
					'".sqlquote($story->user->registered)."',
					'".sqlquote($story->user->profileviews)."'
				) ") or die("Inserting user: ".mysql_error());
			$user_id = mysql_insert_id();
		}
		else {
			$user_id = mysql_result($userinfo,0,"user_id");
		}
		mysql_query("INSERT INTO `diggstories` ( 
		`digg_id`, `title`, `description`, `diggs`, `comments`, `id`, `link`, `submitted`, `promoted`, `href`, `status`, 
			`user_id` , `username`, `icon`, `registered`, `profileviews`, `topic_long`, `topic_short`, `container_long`, `container_short`, `host`
		)
		VALUES (
			NULL, 
			'".sqlquote($story->title)."', 
			'".sqlquote($story->description)."', 
			'".sqlquote($story->diggs)."',
			'".sqlquote($story->comments)."',
			'".sqlquote($story->id)."',
			'".sqlquote($story->link)."', 
			'".sqlquote($story->submit_date)."',
			'".sqlquote($story->promote_date)."',
			'".sqlquote($story->href)."',
			'".sqlquote($story->status)."',
			'".sqlquote($user_id)."',
			'".sqlquote($story->user->name)."',
			'".sqlquote($story->user->icon)."',
			'".sqlquote($story->user->registered)."',
			'".sqlquote($story->user->profileviews)."',
			'".sqlquote($story->topic->name)."',
			'".sqlquote($story->topic->short_name)."',
			'".sqlquote($story->container->name)."',
			'".sqlquote($story->container->short_name)."',
			'".sqlquote(host($story->link))."'
		) ") or mysql_query("UPDATE `diggstories` 
			SET `diggs` = '".sqlquote($story->diggs)."', `comments` = '".sqlquote($story->comments)."'
			WHERE `id` = '".sqlquote($story->id)."' 
			LIMIT 1
			") or die(mysql_error());
	}
	echo "
".$offset; $offset += 100; $count = $request->total; } ?>

The first lines,
ini_set('user_agent', 'Trendds/1.0');
ini_set('max_execution_time', 3600);

set the User-Agent and increase the max execution time, since grabbing over 3000 stories takes some time, and setting a user agent is required.

Services_Digg::$appKey = 'http://www.ja.meswilson.com/blog/Services_Digg_Proxy.php';
Services_Digg::$uri = 'http://services.digg.com';

set the API key and URI, as shown in the PEAR tests.

Next, we start requesting and inserting the info. A while loop pages through the API 100 stories at a time, and a foreach inserts each story.

After running that, you should have a roughly 3000-entry `diggstories` table along with a couple-thousand-entry `diggusers` table.

In order to keep these updated, we can run update.php every so often. I did every 5 minutes, but that seems to be a bit overkill.
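If you do end up wanting a real cron job, an entry like this works (assuming wget is installed and the URL matches your setup):

*/5 * * * * wget -q -O /dev/null http://localhost/digg/update.php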

update.php:

<?php
ini_set('user_agent', 'Trendds/1.0');

require_once 'config.php';
require_once 'functions.php';
require_once 'Services/Digg.php';

Services_Digg::$appKey = 'http://www.ja.meswilson.com/blog/Services_Digg_Proxy.php';
Services_Digg::$uri = 'http://services.digg.com';

// just the first page of popular stories; new promotions show up here
$params = array('count' => 100);
$request = Services_Digg::factory('Stories')->popular($params);
//print_r($request);
//exit;
foreach ($request->stories as $story) {
	
	$userinfo = mysql_query("SELECT `user_id`, `profileviews` FROM `diggusers` WHERE `username` = '".sqlquote($story->user->name)."' LIMIT 1");
	if(mysql_numrows($userinfo) != 1) {
		mysql_query("INSERT INTO `diggusers`
			( `user_id` , `username`, `icon`, `registered`, `profileviews` )
			VALUES ( 
				NULL, 
				'".sqlquote($story->user->name)."', 
				'".sqlquote($story->user->icon)."', 
				'".sqlquote($story->user->registered)."',
				'".sqlquote($story->user->profileviews)."'
			) ") or die("Inserting user: ".mysql_error());
		$user_id = mysql_insert_id();
	}
	else {
		$user_id = mysql_result($userinfo,0,"user_id");
	}
	mysql_query("INSERT INTO `diggstories` ( 
	`digg_id`, `title`, `description`, `diggs`, `comments`, `id`, `link`, `submitted`, `promoted`, `href`, `status`, 
		`user_id` , `username`, `icon`, `registered`, `profileviews`, `topic_long`, `topic_short`, `container_long`, `container_short`, `host`
	)
	VALUES (
		NULL, 
		'".sqlquote($story->title)."', 
		'".sqlquote($story->description)."', 
		'".sqlquote($story->diggs)."',
		'".sqlquote($story->comments)."',
		'".sqlquote($story->id)."',
		'".sqlquote($story->link)."', 
		'".sqlquote($story->submit_date)."',
		'".sqlquote($story->promote_date)."',
		'".sqlquote($story->href)."',
		'".sqlquote($story->status)."',
		'".sqlquote($user_id)."',
		'".sqlquote($story->user->name)."',
		'".sqlquote($story->user->icon)."',
		'".sqlquote($story->user->registered)."',
		'".sqlquote($story->user->profileviews)."',
		'".sqlquote($story->topic->name)."',
		'".sqlquote($story->topic->short_name)."',
		'".sqlquote($story->container->name)."',
		'".sqlquote($story->container->short_name)."',
		'".sqlquote(host($story->link))."'
	) ") or mysql_query("UPDATE `diggstories` 
		SET `diggs` = '".sqlquote($story->diggs)."', `comments` = '".sqlquote($story->comments)."'
		WHERE `id` = '".sqlquote($story->id)."' 
		LIMIT 1
		") or die(mysql_error());
}




?>

In order to not refresh by hand or set up a cron job (though if you plan on using this long-term, a cron job would be better), you can just run crondigg.py, or crondigg.pyw if you don’t want to see a window, to keep the database updated. You will need to edit the url value to point to the location of your update.php.

crondigg.py:

#! /usr/bin/env python

url = 'http://localhost/digg/update.php'
wait = 300

import urllib,time

while 1:
	try:
		print 'Requesting %s \r' % url,
		urllib.urlopen(url).read()
		print " " * (len(url)+12),'\r',
		count = 0
		while count < wait:
			print 'Waiting %d seconds \r' % (wait-count),
			count+=1
			time.sleep(1)
	except KeyboardInterrupt:
		import sys
		sys.exit(2)

Note: This python script was written on Windows, where it works fine. On linux, the script itself still works; it's just the in-place status output that doesn't.
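The likely culprit is output buffering: on Linux the trailing-comma prints sit in the buffer until a newline arrives. Flushing stdout after each print should fix it (a sketch, not tested on every terminal):

import sys
import time

def status(msg):
	# \r returns to the start of the line so the next status overwrites it
	sys.stdout.write('\r' + msg)
	sys.stdout.flush()

status('Waiting 30 seconds')
time.sleep(1)
status('Waiting 29 seconds')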

Getting diggs

Next, we're going to get individual diggs. We're not going to go back in time and grab every digg ever; the database gets really big pretty quickly as it is, so don't worry about it.

updatediggs.php:

<?php
ini_set('user_agent', 'Trendds/1.0');

require_once 'config.php';
require_once 'functions.php';
require_once 'Services/Digg.php';

Services_Digg::$appKey = 'http://www.ja.meswilson.com/blog/Services_Digg_Proxy.php';
Services_Digg::$uri = 'http://services.digg.com';

$api = Services_Digg::factory('Stories');

// One less than the time of the last digg we saw, so we also catch diggs
// that landed in the same second (see below)
$mindate = trim(file_get_contents('lastdiggs.txt')) - 1;

$offset = 0;
$total = 100; // replaced with the real total after the first request
$runs = 0;
while($offset < $total) {
	$params = array('count' => 100,'min_date' => $mindate,'offset'=>$offset);
	$diggs = $api->diggs($params);
	// Go through each digg
	//print_r($diggs);
	foreach ($diggs->diggs as $digg) {
		// This is pretty annoying. It's true if the insert failed. It might not be the most sensible thing, but it's about 2 in the morning, and I like it
		$inserted = false;
		// Insert the digg
		mysql_query("INSERT INTO `diggslive` 
			(`digging_id`, `story_id`, `id`, `username`, `time`, `status`)
			VALUES (
				NULL,
				'".sqlquote($digg->story)."',
				'".sqlquote($digg->id)."',
				'".sqlquote($digg->user)."',
				'".sqlquote($digg->date)."',
				'".sqlquote($digg->status)."'
			)") or setcheck(true);
		if(!$inserted) {
			// Increase the digg count of the story, since it just got dugg.
			// (Match on the story id; `id` in diggstories is the story id,
			// while $digg->id is the digg's own id.)
			mysql_query("UPDATE `diggstories` SET `diggs` = `diggs`+1 WHERE `id` = '".sqlquote($digg->story)."' LIMIT 1") or die(mysql_error());
		}
	}
	// If this is the first run, update the timestamp. We could use time(), but it might be slightly off. If we take the timestamp of a later run, it'll be later than what we got back.
	if($runs == 0 AND $diggs->timestamp != "") {
		$timestamp = $diggs->timestamp;
		$runs = 1;
	}
	$total = $diggs->total;
	print $offset."
"; $offset += 100; } // If the timestamp isn't foobared, update the file. If there are network problems or something, this can be wrong. I don't think I've still truely fixed that, but whatever. if(isset($timestamp) AND $timestamp != "" AND is_numeric($timestamp) AND $timestamp > 1000000) { $file = fopen('lastdiggs.txt','w'); fwrite($file,$timestamp); fclose($file); } // This will change the variable to say if it inserted successfully or not, because doing ... or $inserted = true; doesn't work. -_- function setcheck($bool) { global $inserted; $inserted = $bool; } ?>

Before you can use this, you need to seed `diggslive` with a single row whose `time` is set to the current timestamp. After a real digg gets inserted, you can get rid of this entry.
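Something like this works as the seed (every value besides `time` is a throwaway):

INSERT INTO `diggslive` (`story_id`, `id`, `username`, `time`, `status`)
VALUES (0, 0, 'seed', UNIX_TIMESTAMP(), 'popular');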

This just gets the diggs on all stories since the last digg we recorded. The timestamp we request from is one less than the time of the last digg, to make sure we catch any diggs that occurred in the same second as the previous one.

This also writes a file called lastdiggs.txt, which is just a cheaper alternative to querying the table for the last digg's time.

As I said earlier, the table gets big. Mine was at around 330k entries before I started running deletes. Let's get rid of any diggs older than a day; we don't really need them.

removediggs.php:


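It only needs to be a one-liner. A minimal sketch, assuming `time` holds Unix timestamps as above:

<?php
require_once 'config.php';
// drop anything dugg more than a day ago
mysql_query("DELETE FROM `diggslive` WHERE `time` < '".(time() - 86400)."'") or die(mysql_error());
?>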
To keep this updated, you can run crondiggs.py. This requests your updatediggs.php file every 30 seconds, and removediggs.php every so often.

crondiggs.py:

#! /usr/bin/env python

url = 'http://localhost/digg/updatediggs.php'
url2 = 'http://localhost/digg/removediggs.php'
wait = 30

import urllib,time

while 1:
	try:
		print 'Requesting %s \r' % url,
		urllib.urlopen(url).read()
		if int(time.time()) % 10 == 0:
			print 'Requesting %s \r' % url2,
			urllib.urlopen(url2).read()
		print " " * (len(url)+12),'\r',
		count = 0
		while count < wait:
			print 'Waiting %d seconds \r' % (wait-count),
			count+=1
			time.sleep(1)
	except KeyboardInterrupt:
		import sys
		sys.exit(2)

Part Gamma: Tag Clouds

Now it's time to use all this newfangled data we're collecting, and what's more Web 2.0 than tag clouds? And since we're lazy, we're going to use this tag cloud script.

First cloud: the container cloud. Containers are Digg's top-level sections, like Technology, Science, Sports, etc.

concloud.png

concloud.php:

<?php
require_once 'config.php';

// One tag per container, weighted by how many popular stories it has.
// (The query is reconstructed; the scaffolding below comes straight from
// the tag cloud script linked above.)
$result = mysql_query("SELECT `container_long` AS tag, COUNT(*) AS quantity
	FROM `diggstories`
	GROUP BY `container_long`
	ORDER BY tag ASC") or die(mysql_error());

while ($row = mysql_fetch_array($result)) {
	$tags[$row['tag']] = $row['quantity'];
}

$max_size = 250; // max font size in %
$min_size = 100; // min font size in %

// largest and smallest tag counts
$max_qty = max(array_values($tags));
$min_qty = min(array_values($tags));

// find the range of values, avoiding division by zero
$spread = $max_qty - $min_qty;
if ($spread == 0) { $spread = 1; }

// set the font-size increment
$step = ($max_size - $min_size) / $spread;

foreach ($tags as $key => $value) {

    // calculate CSS font-size
    // find the $value in excess of $min_qty
    // multiply by the font-size increment ($size)
    // and add the $min_size set above
    $size = $min_size + (($value - $min_qty) * $step);
    // uncomment if you want sizes in whole %:
    // $size = ceil($size);

    // you'll need to put the link destination in place of the #
    // (assuming your tag links to some sort of details page)
    echo '<a href="#" style="font-size: '.$size.'%">'.$key.'</a> ';
    // notice the space at the end of the link
}



?>

That's pretty boring though. Maybe regular topics, like Apple, Linux/Unix, Playable Web Games, etc., will be more interesting.

catcloud.png

catcloud.php

<?php
require_once 'config.php';

// Same scaffolding as concloud.php; only the (reconstructed) query changes,
// grouping by topic instead of container
$result = mysql_query("SELECT `topic_long` AS tag, COUNT(*) AS quantity
	FROM `diggstories`
	GROUP BY `topic_long`
	ORDER BY tag ASC") or die(mysql_error());

while ($row = mysql_fetch_array($result)) {
	$tags[$row['tag']] = $row['quantity'];
}

$max_size = 250; $min_size = 100;
$max_qty = max(array_values($tags));
$min_qty = min(array_values($tags));
$spread = $max_qty - $min_qty;
if ($spread == 0) { $spread = 1; }
$step = ($max_size - $min_size) / $spread;

foreach ($tags as $key => $value) {

    // calculate CSS font-size
    // find the $value in excess of $min_qty
    // multiply by the font-size increment ($size)
    // and add the $min_size set above
    $size = $min_size + (($value - $min_qty) * $step);
    // uncomment if you want sizes in whole %:
    // $size = ceil($size);

    // you'll need to put the link destination in place of the #
    // (assuming your tag links to some sort of details page)
    echo '<a href="#" style="font-size: '.$size.'%">'.$key.'</a> ';
    // notice the space at the end of the link
}



?>

A little better. How about a user cloud based on submissions?

usercloud.png

usercloud.php:

<?php
require_once 'config.php';

// Same scaffolding again; the (reconstructed) query weights each user by
// how many of their submissions went popular
$result = mysql_query("SELECT `username` AS tag, COUNT(*) AS quantity
	FROM `diggstories`
	GROUP BY `username`
	ORDER BY tag ASC") or die(mysql_error());

while ($row = mysql_fetch_array($result)) {
	$tags[$row['tag']] = $row['quantity'];
}

$max_size = 250; $min_size = 100;
$max_qty = max(array_values($tags));
$min_qty = min(array_values($tags));
$spread = $max_qty - $min_qty;
if ($spread == 0) { $spread = 1; }
$step = ($max_size - $min_size) / $spread;

foreach ($tags as $key => $value) {

    // calculate CSS font-size
    // find the $value in excess of $min_qty
    // multiply by the font-size increment ($size)
    // and add the $min_size set above
    $size = $min_size + (($value - $min_qty) * $step);
    // uncomment if you want sizes in whole %:
    // $size = ceil($size);

    // you'll need to put the link destination in place of the #
    // (assuming your tag links to some sort of details page)
    echo '<a href="#" style="font-size: '.$size.'%">'.$key.'</a> ';
    // notice the space at the end of the link
}



?>

Pretty big, but let's try bigger. A user cloud based on number of diggs.

diggusercloud.png

diggusercloud.php:

<?php
require_once 'config.php';

// (reconstructed; my best guess at the original) weight each user by how
// many diggs we've recorded for them in diggslive
$result = mysql_query("SELECT `username` AS tag, COUNT(*) AS quantity
	FROM `diggslive`
	GROUP BY `username`
	ORDER BY tag ASC") or die(mysql_error());

while ($row = mysql_fetch_array($result)) {
	$tags[$row['tag']] = $row['quantity'];
}

$max_size = 250; $min_size = 100;
$max_qty = max(array_values($tags));
$min_qty = min(array_values($tags));
$spread = $max_qty - $min_qty;
if ($spread == 0) { $spread = 1; }
$step = ($max_size - $min_size) / $spread;

foreach ($tags as $key => $value) {

    // calculate CSS font-size
    // find the $value in excess of $min_qty
    // multiply by the font-size increment ($size)
    // and add the $min_size set above
    $size = $min_size + (($value - $min_qty) * $step);
    // uncomment if you want sizes in whole %:
    // $size = ceil($size);

    // you'll need to put the link destination in place of the #
    // (assuming your tag links to some sort of details page)
    echo '<a href="#" style="font-size: '.$size.'%">'.$key.'</a> ';
    // notice the space at the end of the link
}



?>

Great. How about a story cloud based on the number of diggs?

diggcloud.png

diggcloud.php:

<?php
require_once 'config.php';

// Weight each story by its digg count... or so it seems. This query is
// reconstructed, and it's the flawed one: see the note after the listing.
$result = mysql_query("SELECT `title` AS tag, COUNT(*) AS quantity
	FROM `diggstories`
	GROUP BY `diggs`
	ORDER BY tag ASC") or die(mysql_error());

while ($row = mysql_fetch_array($result)) {
	$tags[$row['tag']] = $row['quantity'];
}

$max_size = 250; $min_size = 100;
$max_qty = max(array_values($tags));
$min_qty = min(array_values($tags));
$spread = $max_qty - $min_qty;
if ($spread == 0) { $spread = 1; }
$step = ($max_size - $min_size) / $spread;

foreach ($tags as $key => $value) {

    // calculate CSS font-size
    // find the $value in excess of $min_qty
    // multiply by the font-size increment ($size)
    // and add the $min_size set above
    $size = $min_size + (($value - $min_qty) * $step);
    // uncomment if you want sizes in whole %:
    // $size = ceil($size);

    // you'll need to put the link destination in place of the #
    // (assuming your tag links to some sort of details page)
    echo '<a href="#" style="font-size: '.$size.'%">'.$key.'</a> ';
    // notice the space at the end of the link
}



?>

But wait, that doesn't look right. That's because we can't just reuse the same kind of SQL query as before: it groups the stories by their digg count, so if there are 3 stories with 750 diggs and only 1 story with 751 diggs, the 3 stories come out larger. Since each story's weight is already stored in its row, we don't need to group at all, so we can just use this SQL query:

SELECT title AS tag, diggs, href
  FROM diggstories
  ORDER BY title ASC

realdiggcloud.png

realdiggcloud.php:

<?php
require_once 'config.php';

// Each story's own digg count is its weight; keep the href around so the
// title can link to its Digg page (reconstructed)
$result = mysql_query("SELECT title AS tag, diggs, href
	FROM diggstories
	ORDER BY title ASC") or die(mysql_error());

while ($row = mysql_fetch_array($result)) {
	$tags[$row['tag']] = $row['diggs'];
	$hrefs[$row['tag']] = $row['href'];
}

$max_size = 250; $min_size = 100;
$max_qty = max(array_values($tags));
$min_qty = min(array_values($tags));
$spread = $max_qty - $min_qty;
if ($spread == 0) { $spread = 1; }
$step = ($max_size - $min_size) / $spread;

foreach ($tags as $key => $value) {

    // calculate CSS font-size
    // find the $value in excess of $min_qty
    // multiply by the font-size increment ($size)
    // and add the $min_size set above
    $size = $min_size + (($value - $min_qty) * $step);
    // uncomment if you want sizes in whole %:
    // $size = ceil($size);

    // you'll need to put the link destination in place of the #
    // (assuming your tag links to some sort of details page)
    echo '<a href="'.$hrefs[$key].'" style="font-size: '.$size.'%">'.$key.'</a> ';
    // notice the space at the end of the link
}



?>

If you noticed, in insert.php and update.php we stored a column called `host`, which is just the hostname of the story's link. Let's make a host cloud based on the number of stories.

hostcloud.png

hostcloud.php:

<?php
require_once 'config.php';

// One tag per hostname, weighted by story count (reconstructed query)
$result = mysql_query("SELECT `host` AS tag, COUNT(*) AS quantity
	FROM `diggstories`
	GROUP BY `host`
	ORDER BY tag ASC") or die(mysql_error());

while ($row = mysql_fetch_array($result)) {
	$tags[$row['tag']] = $row['quantity'];
}

$max_size = 250; $min_size = 100;
$max_qty = max(array_values($tags));
$min_qty = min(array_values($tags));
$spread = $max_qty - $min_qty;
if ($spread == 0) { $spread = 1; }
$step = ($max_size - $min_size) / $spread;

foreach ($tags as $key => $value) {

    // calculate CSS font-size
    // find the $value in excess of $min_qty
    // multiply by the font-size increment ($size)
    // and add the $min_size set above
    $size = $min_size + (($value - $min_qty) * $step);
    // uncomment if you want sizes in whole %:
    // $size = ceil($size);

    // you'll need to put the link destination in place of the #
    // (assuming your tag links to some sort of details page)
    echo '<a href="#" style="font-size: '.$size.'%">'.$key.'</a> ';
    // notice the space at the end of the link
}



?>

Where to go: Tag clouds are fun. You can easily restrict one to a certain time period, like the past day or week, or use both the diggstories and diggslive tables to see what people are digging right now. You could also combine one with the search that we'll talk about next.
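For example, limiting any of the cloud queries to the past week is just one extra WHERE clause:

SELECT `topic_long` AS tag, COUNT(*) AS quantity
  FROM `diggstories`
 WHERE `promoted` > UNIX_TIMESTAMP() - 7*86400
 GROUP BY `topic_long`
 ORDER BY tag ASC;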

Part Dogma - Improved Search

As I said earlier, the `diggstories` table has 4 FULLTEXT indices, so let's use those to create a better search. It's not necessarily all that much better than Digg's own, and it won't search upcoming stories, but to some, it might be better.

This will allow you to search using multiple queries at once. Let's say you want to find all of the stories dealing with Nintendo from arstechnica.com; this search can easily handle that.

It uses MySQL's FULLTEXT search to handle it. This has the basic format of WHERE MATCH(`columns`) AGAINST ('query') and returns the results ordered by relevance.
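The Nintendo-on-arstechnica.com example boils down to two MATCHes ANDed together, one against the (title, description) index and one against the link index:

SELECT `title`, `diggs`, `comments`
  FROM `diggstories`
 WHERE MATCH(`title`,`description`) AGAINST ('nintendo')
   AND MATCH(`link`) AGAINST ('arstechnica.com');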

Better Search

search1.png

search2.png

search.php:

<form action="search.php" method="get">
Search title and desc: <input type="text" name="q" />
<br />OR<br />
Search in title: <input type="text" name="title" />
Search in desc: <input type="text" name="desc" />
Search in links: <input type="text" name="link" />
<input type="submit" value="Search" />
</form>

<?php
require_once 'config.php';
require_once 'functions.php';

// (reconstructed; the field names are guesses) AND together one MATCH()
// per filled-in field
$parts = array();
if(!empty($_GET['q']))     $parts[] = "MATCH(`title`,`description`) AGAINST ('".sqlquote($_GET['q'])."')";
if(!empty($_GET['title'])) $parts[] = "MATCH(`title`) AGAINST ('".sqlquote($_GET['title'])."')";
if(!empty($_GET['desc']))  $parts[] = "MATCH(`description`) AGAINST ('".sqlquote($_GET['desc'])."')";
if(!empty($_GET['link']))  $parts[] = "MATCH(`link`) AGAINST ('".sqlquote($_GET['link'])."')";

if(count($parts) > 0) {
	$result = mysql_query("SELECT * FROM `diggstories` WHERE ".implode(" AND ",$parts)) or die(mysql_error());
	$num = mysql_numrows($result);
	echo "<table><tr><th>Story</th><th>Diggs</th><th>Comments</th></tr>";
	for($i=0;$i<$num;$i++) {
		echo "<tr><td>".dehtml(mysql_result($result,$i,"title"))." [<a href=\"".mysql_result($result,$i,"href")."\">More</a>]</td>
			<td>".mysql_result($result,$i,"diggs")."</td>
			<td>".mysql_result($result,$i,"comments")."</td></tr>";
	}
	echo "</table>";
	echo "$num results";
}
?>

Where to go: There are a lot of things that could be done to improve this. Maybe add a feature to only return stories with over a certain number of diggs or comments, or that were submitted between certain dates.

Part Alien - Reggit/Dreddit - Reddit clone

Homepage

Everyone loves the clean, sexy layout of reddit, so let's take Digg's data and throw it into a reddit layout.

Pretty easy to make. Just copy their page layout and CSS, and throw the Digg story information in there. I removed the header and footer, since neither would really work, and we don't have some badass logo. It's just sorted by promoted time, and not popularity, though it could probably be modded to take that into account.

reddit-digg ss

Reddit Clone

reddit.php:

<html>
<head>
<title>digg + reddit = reggit... or dreddit</title>
<link rel="stylesheet" href="reddit.css" type="text/css" />
</head>
<body>

<?php
require_once 'config.php';
require_once 'functions.php';

// (markup and class names reconstructed; the CSS is reddit's, copied over)
$offset = (isset($_GET['offset']) AND is_numeric($_GET['offset'])) ? (int)$_GET['offset'] : 0;

$result = mysql_query("SELECT * FROM `diggstories` ORDER BY `promoted` DESC LIMIT $offset, 25") or die(mysql_error());
$num = mysql_numrows($result);

echo "<table>";
for($i=0;$i<$num;$i++) {
	// alternate the row class for reddit-style shading
	echo ($i % 2 == 0) ? "<tr class=\"evenrow\">" : "<tr class=\"oddrow\">";
	echo "<td>".($i+$offset+1).".</td>";
	echo "<td><a href=\"".mysql_result($result,$i,"link")."\">".dehtml(mysql_result($result,$i,"title"))."</a> (".dehtml(mysql_result($result,$i,"host")).")<br />";
	echo mysql_result($result,$i,"diggs")." diggs posted ".( floor( (time() - mysql_result($result,$i,"submitted")) / 3600) )." hours ago by ".mysql_result($result,$i,"username")." | <a href=\"".mysql_result($result,$i,"href")."\">".mysql_result($result,$i,"comments")." comments</a></td>";
	echo "</tr>";
}
echo "</table>";
?>

</body>
</html>

Where to go: Subreddits, like technology.redditclone to only show stories from the container 'Technology', or apple.redditclone or apple.technology.redditclone to only show stories from the topic 'Apple'. Sort by popularity or "hotness", like reddit's homepage. Maybe something like ORDER BY `diggs` / (".time()." - `promoted`), but better.
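As a starting point, that hotness ordering can live entirely in the query (UNIX_TIMESTAMP() stands in for PHP's time()):

SELECT `title`, `diggs`, `href`
  FROM `diggstories`
 ORDER BY `diggs` / (UNIX_TIMESTAMP() - `promoted`) DESC
 LIMIT 0, 25;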

RSS

Another great part of reddit is their RSS feed. No fluff, no mess, just links, so let's rip that off too. This is also really easy to make. If you've ever had to make an RSS feed from a DB, you'll have no trouble.

The RSS feed consists of a title that links directly to the article, with the content holding a [link] link that goes to the article, a [more] link that goes to the Digg page, and a [dugg] link that goes to the Duggmirror copy.

reddit digg rss

Reddit / Better RSS Feed

redditrss.php:

";
	?>
	
        reggit/dreddit: what's new on digg
        http://digg.com/
        The latest stories, voted on by users like you.
	
	
	\n\t\t".dehtml($title)."\n\t\t".dehtml($link)."\n\t\t".gmdate("T",$time)."\n\t\t
	\t[link] [more] [dugg]]]>
	";
}

Where to go: The RSS feed can be customized further to serve feeds of only certain categories, or of only stories containing certain phrases.

Part Fun - Live Diggs - A real application

After screwing with pointless tag clouds and a neat, although not important, reddit clone, we can now build an actual application.

This app will allow you to enter the title of a story and watch as people digg it. It will display the username and time of each digg, and refresh all ajaxily, because it's got to be Web 2.0, right?

First, you enter the title of the story, and it tries to find the story id, first by searching our database, then, if that fails, by using the Digg API story search.

Once it has the story id, it displays at most the last 30 diggs, refreshing every 30 seconds to add new ones. The diggs come from our database; using the Digg API to fetch the new diggs would work just as well, but we have a table full of diggs, and we're going to use it, dammit! Plus, I think it's a little nicer to Digg: instead of a lot of users (or your server on their behalf) requesting diggs every 30 seconds, there's just one consistent request every 30 seconds.

Anyways, the javascript in this is just hacked together using prototype.js, meaning there are probably better ways to do this, but whatever. This works... for the most part. Sometimes there might be a repeat here and there, but nothing devastating.
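A sketch of the kind of prototype.js polling glue this needs (assumptions: a storyid variable, a table with id "diggs", and 1.5-era prototype.js):

var timestamp = 0;
new PeriodicalExecuter(function() {
	new Ajax.Request('livediggs.php', {
		method: 'get',
		parameters: 'storyid=' + storyid + '&mode=ajax&time=' + timestamp,
		onSuccess: function(transport) {
			var text = transport.responseText;
			// livediggs.php ends every reply with &&&timestamp=NNN&&&
			var match = text.match(/&&&timestamp=(\d+)&&&/);
			if (match) timestamp = match[1];
			var rows = text.replace(/&&&timestamp=\d+&&&/, '');
			if (rows != '') new Insertion.Top('diggs', rows);
		}
	});
}, 30);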

livediggs.png

livediggs1.png

Here it is:

Live Diggs

livediggs.php:

<?php
require_once 'config.php';
require_once 'functions.php';
require_once 'Services/Digg.php';

Services_Digg::$appKey = 'http://www.ja.meswilson.com/blog/Services_Digg_Proxy.php';
Services_Digg::$uri = 'http://services.digg.com';

if(isset($_GET['title']) AND $_GET['title'] != "") {

	// (reconstructed head: the database lookup would go here; if it misses,
	// fall back to the API's story-by-title lookup)
	$story = Services_Digg::factory('Stories')->getStoryByTitle(preg_replace("/[^a-z0-9]/ix","_",$_GET['title']));

	//print_r($story);
	if(isset($story->id) AND $story->id != "" AND is_numeric($story->id)) {

		if($_GET['mode'] == "ajax") {

			echo "storyid=".$story->id;

			exit;

		}

		else {

			header("Location: livediggs.php?storyid=".$story->id);

			exit;

		}

	}
	
	echo "storynotfound";
	

}

elseif(isset($_GET['storyid']) AND $_GET['storyid'] != "" AND is_numeric($_GET['storyid'])) {

	if($_GET['mode'] == "ajax") {

		if(isset($_GET['time']) AND is_numeric($_GET['time'])) {
			if($_GET['time'] == 0) {
				echo "&&×tamp=".time()."&&&";
				exit;
			}
			$where = " AND `digging_id` > '".(sqlquote($_GET['time']))."' ";
			
			$query = "SELECT * FROM `diggslive` WHERE `story_id` = '".sqlquote($_GET['storyid'])."' $where ORDER BY `time` DESC";
			//echo $query;
			$result = mysql_query($query);
			
			$num = mysql_numrows($result);
			
			for($i=0;$i<$num;$i++) {
				//echo "digger=".mysql_result($result,$i,"user")."&time=".timeparse(mysql_result($result,$i,"time"))."\n";
				echo "".mysql_result($result,$i,"username")."".timeparse(mysql_result($result,$i,"time"))."";
			}
			
			echo "&&×tamp=".( ($num == 0) ? $_GET['time'] : mysql_result($result,$num-1,"digging_id"))."&&&";
			exit;
		}
		else {
			$result = mysql_query("SELECT * FROM `diggslive` WHERE `story_id` = '".sqlquote($_GET['storyid'])."' ORDER BY `time` DESC LIMIT 0, 30");

			$num = mysql_numrows($result);

			if($num < 1) {

				echo "No diggs :(";

				std_stop();

			}

			echo "";

			for($i=0;$i<$num;$i++) {

				echo "";

				

			}

			echo "
".mysql_result($result,$i,"username")."".timeparse(mysql_result($result,$i,"time"))."
"; echo ""; } } else { std_start(); $result = mysql_query("SELECT * FROM `diggslive` WHERE `story_id` = '".sqlquote($_GET['storyid'])."' ORDER BY `time` LIMIT 0, 30"); $num = mysql_numrows($result); if($num < 1) { echo "No diggs :("; std_stop(); } echo ""; for($i=0;$i<$num;$i++) { echo ""; } //$lastdiggid = mysql_result($result,$num-1,"digging_id"); //echo $lastdiggid."[email protected]#[email protected]#"; echo "
".mysql_result($result,$i,"username")."".timeparse(mysql_result($result,$i,"time"))."
"; std_stop(); } } else { std_start(); form(); std_stop(); } function std_start() { ?> Live Diggs "; exit; } function form() { echo "
This will allow you to watch people digg your story, live!

"; }

Conclusion


If you want to download all of these scripts, you can do so here:
Download - ZIP
Download - Gzip

They're licensed under the GPL. If you have any questions, suggestions, or comments, just leave a comment or contact me.

7 thoughts on “Fun with Digg’s API”

  1. Just a little note that Services_Digg was accepted into the PEAR repository so you’ll be able to install it via the normal PEAR procedures Real Soon Now[tm].

  2. hi, thanks for this code,
    when i do the update.php i got a blank page. how can i fix this? thanks
