Just wondering if anyone can help me with the following. I want to extract the URLs from this page: http://www.directorycritic.com/free-directory-list.html?pg=1&sort=pr
I have the following code:
<?php
$url = "http://www.directorycritic.com/free-directory-list.html?pg=1&sort=pr";
$input = @file_get_contents($url) or die("Could not access file: $url");
$regexp = "<a\s[^>]*href=(\"??)([^\" >]*?)\\1[^>]*>(.*)<\/a>";
if(preg_match_all("/$regexp/siU", $input, $matches)) {
// $matches[2] = array of link addresses
// $matches[3] = array of link text - including HTML code
}
?>
At the moment this does nothing visible, and what I actually need is to grab every URL in the table across all 16 pages and write them out to a text file. I'd really appreciate some help on how to change the code above to do this.
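Roughly, what I'm aiming for is something like the sketch below (the function names are my own placeholders, and I'm assuming the query string stays `?pg=N&sort=pr` for pages 1 through 16 and that the hrefs are double-quoted):

```php
<?php
// Pull every href value out of an HTML string.
// Sketch only: assumes double-quoted href attributes.
function extract_links($html) {
    preg_match_all('/<a\s[^>]*href="([^"]*)"/si', $html, $matches);
    return $matches[1]; // array of link addresses
}

// Fetch all 16 pages and append each extracted URL to a text
// file, one per line. Pages that fail to load are skipped.
function scrape_all_pages($pages = 16, $outfile = "links.txt") {
    $base = "http://www.directorycritic.com/free-directory-list.html?pg=%d&sort=pr";
    $out = fopen($outfile, "w") or die("Could not open $outfile");
    for ($pg = 1; $pg <= $pages; $pg++) {
        $html = @file_get_contents(sprintf($base, $pg));
        if ($html === false) {
            continue; // skip pages that could not be fetched
        }
        foreach (extract_links($html) as $link) {
            fwrite($out, $link . "\n");
        }
    }
    fclose($out);
}

// scrape_all_pages(); // uncomment to run the crawl
?>
```

I'm not sure this is the right approach (for example, whether a regex is robust enough here versus a proper HTML parser), so corrections welcome.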