Get Bitcoin Historical Data

I want to make my own bitcoin chart.

Do you know of a reliable way to get historical Bitcoin price data? Is there any way to retrieve it over REST? I looked at Bitfloor, which supports REST, but it does not return any useful values; it responds with an "internal server error".

I have also seen Bitcoincharts, but I think it is limited to 2,000 data points.

Could you suggest any framework or system for working on this?

+84
bitcoin
Apr 22 '13 at 9:08
7 answers

In fact, you can download the entire Bitcoin trading history from Bitcoincharts in CSV format here: http://api.bitcoincharts.com/v1/csv/

It is updated twice a day for active exchanges, and several dead exchanges are included as well.

EDIT: since the CSV files have no column headers, here is what the columns are: column 1) timestamp, column 2) price, column 3) trade volume
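Given that column layout, loading one of the dumps is straightforward. A minimal Python sketch (the parsing function and sample rows are illustrative, not part of the original answer):

```python
import csv
from io import StringIO

def parse_trades(csv_text):
    """Parse Bitcoincharts-style rows of `unixtime,price,amount`
    into a list of (timestamp, price, volume) tuples."""
    trades = []
    for row in csv.reader(StringIO(csv_text)):
        if not row:
            continue
        trades.append((int(row[0]), float(row[1]), float(row[2])))
    return trades

# Tiny inline sample in the same format as the dumps.
sample = "1325317920,4.39,0.45558087\n1325317921,4.39,48.0"
print(parse_trades(sample))
```

For a real dump you would read the (gzipped) file for the exchange you care about instead of the inline sample.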

+96
Jan 27 '14 at 21:29

In case you want to collect Bitstamp trade data from their websocket at higher resolution over a longer period of time, you can use the script log_bitstamp_trades.py below.

The script uses the Python websocket-client and pusher_client_python libraries, so install them first.

 #!/usr/bin/python
 import pusherclient
 import time
 import logging
 import sys
 import datetime
 import signal
 import os

 logging.basicConfig()

 log_file_fd = None

 def sigint_and_sigterm_handler(signal, frame):
     global log_file_fd
     log_file_fd.close()
     sys.exit(0)

 class BitstampLogger:
     def __init__(self, log_file_path, log_file_reload_path, pusher_key, channel, event):
         self.channel = channel
         self.event = event
         self.log_file_path = log_file_path
         self.log_file_fd = open(log_file_path, "a")
         self.log_file_reload_path = log_file_reload_path
         self.pusher = pusherclient.Pusher(pusher_key)
         self.pusher.connection.logger.setLevel(logging.WARNING)
         self.pusher.connection.bind('pusher:connection_established', self.connect_handler)
         self.pusher.connect()

     def callback(self, data):
         utc_timestamp = time.mktime(datetime.datetime.utcnow().timetuple())
         line = str(utc_timestamp) + " " + data + "\n"
         # Reopen the log file when logrotate has signalled via the reload file.
         if os.path.exists(self.log_file_reload_path):
             os.remove(self.log_file_reload_path)
             self.log_file_fd.close()
             self.log_file_fd = open(self.log_file_path, "a")
         self.log_file_fd.write(line)

     def connect_handler(self, data):
         channel = self.pusher.subscribe(self.channel)
         channel.bind(self.event, self.callback)

 def main(log_file_path, log_file_reload_path):
     global log_file_fd
     bitstamp_logger = BitstampLogger(
         log_file_path,
         log_file_reload_path,
         "de504dc5763aeef9ff52",
         "live_trades",
         "trade")
     log_file_fd = bitstamp_logger.log_file_fd
     signal.signal(signal.SIGINT, sigint_and_sigterm_handler)
     signal.signal(signal.SIGTERM, sigint_and_sigterm_handler)
     while True:
         time.sleep(1)

 if __name__ == '__main__':
     log_file_path = sys.argv[1]
     log_file_reload_path = sys.argv[2]
     main(log_file_path, log_file_reload_path)

and a logrotate configuration file

 /mnt/data/bitstamp_logs/bitstamp-trade.log {
     rotate 10000000000
     minsize 10M
     copytruncate
     missingok
     compress
     postrotate
         touch /mnt/data/bitstamp_logs/reload_log > /dev/null
     endscript
 }

then you can run it in the background

 nohup ./log_bitstamp_trades.py /mnt/data/bitstamp_logs/bitstamp-trade.log /mnt/data/bitstamp_logs/reload_log & 
+10
Oct 11 '14 at 23:41

Bitstamp provides real-time Bitcoin data as JSON at https://www.bitstamp.net/api/ticker/. Do not query it more than 600 times in ten minutes, or they will block your IP address (besides, polling faster is unnecessary anyway; read more here). The following is a C# approach for getting the real-time data:

 using (var webClient = new System.Net.WebClient())
 {
     var json = webClient.DownloadString("https://www.bitstamp.net/api/ticker/");
     // Parse/use the JSON string from here
 }

From here you can parse the JSON and save it in a database (or insert it directly into MongoDB), and then query it from there.
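As a language-neutral illustration of the parsing step, here is a Python sketch working on a payload shaped like the Bitstamp ticker response (the field names follow the public ticker; the values here are made up):

```python
import json

# A payload shaped like the Bitstamp ticker response (values invented).
payload = ('{"last": "318.58", "high": "320.00", "low": "310.00",'
           ' "volume": "1234.56", "bid": "318.50", "ask": "318.70"}')

ticker = json.loads(payload)
last_price = float(ticker["last"])               # most recent trade price
spread = float(ticker["ask"]) - float(ticker["bid"])  # bid/ask spread
print(last_price, round(spread, 2))
```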

For historical data, insert from a flat file, which most databases support (for example, with SQL Server you can do a BULK INSERT from a CSV file), depending on which database suits you.
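If a full database server is overkill, the same flat-file-to-table idea works with an embedded database. A sketch using Python's built-in sqlite3 (the table and column names are made up for illustration):

```python
import sqlite3

# Rows in the (timestamp, price, volume) shape used by the CSV dumps.
rows = [
    (1325317920, 4.39, 0.45558087),
    (1325317921, 4.39, 48.0),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (ts INTEGER, price REAL, volume REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)

count, avg_price = conn.execute(
    "SELECT COUNT(*), AVG(price) FROM trades").fetchone()
print(count, avg_price)
```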

+4
Nov 27 '13 at 14:23

I wrote a Java example for this case:

Use the json.org library to extract JSONObjects and JSONArrays. The example below uses data from blockchain.info, which can be retrieved as a JSONObject.

  import java.io.BufferedReader;
  import java.io.IOException;
  import java.io.InputStreamReader;
  import java.net.MalformedURLException;
  import java.net.URL;
  import java.net.URLConnection;
  import java.nio.charset.Charset;
  import org.json.JSONArray;
  import org.json.JSONObject;

  public class Main {
      public static void main(String[] args) throws MalformedURLException, IOException {
          JSONObject data = getJSONfromURL("https://blockchain.info/charts/market-price?format=json");
          JSONArray dataArray = data.getJSONArray("values");
          for (int i = 0; i < dataArray.length(); i++) {
              JSONObject pricePoint = dataArray.getJSONObject(i);
              int x = pricePoint.getInt("x");       // Unix time
              double y = pricePoint.getDouble("y"); // Bitcoin price at that time
              // Do something with x and y.
          }
      }

      public static JSONObject getJSONfromURL(String url) {
          try {
              URLConnection uc = new URL(url).openConnection();
              uc.setConnectTimeout(10000);
              uc.addRequestProperty("User-Agent",
                  "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)");
              uc.connect();
              BufferedReader rd = new BufferedReader(
                  new InputStreamReader(uc.getInputStream(), Charset.forName("UTF-8")));
              StringBuilder sb = new StringBuilder();
              int cp;
              while ((cp = rd.read()) != -1) {
                  sb.append((char) cp);
              }
              return new JSONObject(sb.toString());
          } catch (IOException ex) {
              return null;
          }
      }
  }
+4
Dec 03 '13 at 16:22

Scraping it into JSON using Node.js would be fun :)

https://github.com/f1lt3r/bitcoin-scraper


 [
   [
     1419033600,   // Timestamp (1 for each minute of entire history)
     318.58,       // Open
     318.58,       // High
     318.58,       // Low
     318.58,       // Close
     0.01719605,   // Volume (BTC)
     5.478317609,  // Volume (Currency)
     318.58        // Weighted Price (USD)
   ]
 ]
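Since the rows are positional, it can help to map them to named fields before charting. A Python sketch of that mapping (the field names just restate the comments above):

```python
# Field names corresponding to the positional columns above.
FIELDS = ["timestamp", "open", "high", "low", "close",
          "volume_btc", "volume_currency", "weighted_price"]

row = [1419033600, 318.58, 318.58, 318.58, 318.58,
       0.01719605, 5.478317609, 318.58]

candle = dict(zip(FIELDS, row))
print(candle["timestamp"], candle["close"])
```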
+3
Dec 20 '14 at 7:13

Coinbase has a REST API that gives you access to historical prices from their website. The data appears to show the Coinbase spot price (in US dollars) every ten minutes.

Results are returned in CSV format. You request the page number you want through the API. There are 1,000 results (price points) per page, which is about seven days of data.
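With 1,000 points per page at ten-minute intervals, you can estimate which page holds data from a given time in the past. A sketch (this assumes page 1 is the most recent, which is an assumption about the API, not stated in the answer):

```python
POINTS_PER_PAGE = 1000
SECONDS_PER_POINT = 600  # ten-minute intervals

def page_for_age(seconds_ago):
    """Estimate which page (1-based, newest first) holds a point
    `seconds_ago` in the past, assuming page 1 is the most recent."""
    return seconds_ago // (POINTS_PER_PAGE * SECONDS_PER_POINT) + 1

print(page_for_age(0))           # most recent data -> page 1
print(page_for_age(14 * 86400))  # roughly two weeks back -> page 3
```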

+2
Feb 09 '14 at 22:32
