When you create a WebRequest, you are asking the server for the raw HTML of the page. That response has not been parsed or executed by a browser, so any JavaScript on the page has not run yet.
If you want to see the page as it looks after the browser has run its JavaScript, you need a tool that actually executes it. One option is the built-in .NET WebBrowser control: http://msdn.microsoft.com/en-au/library/aa752040(v=vs.85).aspx
The WebBrowser control can navigate to and load the page; you can then read its DOM, which will reflect any changes made by the page's JavaScript.
EDIT (example):
Uri uri = new Uri("http://www.somewebsite.com/somepage.htm");

webBrowserControl.AllowNavigation = true;

// optional, but I use this because it stops javascript errors breaking your scraper
webBrowserControl.ScriptErrorsSuppressed = true;

// you want to start scraping after the document has finished loading,
// so do it in the handler you attach here
webBrowserControl.DocumentCompleted += new WebBrowserDocumentCompletedEventHandler(webBrowserControl_DocumentCompleted);

webBrowserControl.Navigate(uri);

private void webBrowserControl_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
{
    HtmlElementCollection divs = webBrowserControl.Document.GetElementsByTagName("div");
    foreach (HtmlElement div in divs)
    {
        // the DOM here includes any changes made by the page's JavaScript
        string text = div.InnerText;
        // ... do your scraping here
    }
}
Pandepic