How to load a large table into Tableau for data visualization?

I can connect Tableau to my database, but the tables there are really big. Every time I try to load a table into Tableau, it crashes or the load never finishes. The tables range from 10 million to 400 million rows. How should I approach this problem? Any proposals?

+4
5 answers

I found a simple solution to optimize Tableau for working with very large data sets (1 billion + rows): Google BigQuery, which is essentially a managed data warehouse.

  • Upload data to BigQuery (you can add multiple files to one table).

  • Connect Tableau to that BigQuery table as an external data source.

Tableau then sends SQL-like commands to BigQuery whenever a new "view" is requested. The queries are processed quickly on Google's hardware, which sends only a small amount of data back to Tableau.

This approach let me visualize a 100-gigabyte call-record dataset of ~1 billion rows on a MacBook.
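The core of this answer is query push-down: the warehouse does the aggregation and only a tiny result travels back to the client, no matter how many rows are stored. A minimal sketch of that idea, using sqlite3 as a stand-in for BigQuery (the table and column names are invented for illustration):

```python
# Query push-down: the viz tool sends one aggregate query; only a
# handful of summary rows come back, never the raw table.
# sqlite3 stands in for BigQuery here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE calls (region TEXT, duration REAL)")
conn.executemany(
    "INSERT INTO calls VALUES (?, ?)",
    [("east", 1.5), ("east", 2.5), ("west", 4.0)],  # imagine ~1 billion rows
)

# A new "view" becomes a single aggregate query on the server side...
rows = conn.execute(
    "SELECT region, COUNT(*), AVG(duration) "
    "FROM calls GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # ...and only one small row per region is returned
```

The same pattern is what makes the BigQuery setup fast: the laptop never sees the billion rows, only the per-view summaries.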

+6

First, understand how Tableau talks to your database. Tableau does not need to read the whole table: when you build a view, it generates a query for just the data that view requires, which is usually a small aggregate rather than the raw rows.

You can see this by starting simple: put CNT (the number of records) on a view and look at the query Tableau issues. It will be roughly "select count (*) from xxx".

In other words, Tableau pushes aggregation (and filtering) down to the database wherever it can, and only a compact summary comes back over the wire. The same holds for grouping, filtering, and similar operations.

So with very large tables, the performance you see is mostly your database's performance, not Tableau's.

Tune the database accordingly: if those aggregate queries run slowly in the database itself, every Tableau view built on them will be slow too.

If a live connection is still too slow, consider a Tableau extract, and in particular an aggregated extract: roll the data up to the dimensions you actually visualize, keeping min, max, sum, avg and so on. This can shrink hundreds of millions of rows to a tiny fraction while still answering most views.

Finally, take the time to learn how Tableau generates its queries; once you understand that, it becomes much easier to build data sources that stay responsive.
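The aggregated-extract idea in this answer can be sketched in a few lines: collapse the raw rows to one row per visible dimension, keeping only min, max, sum, and count (avg is then sum/count). The field names below are invented for illustration:

```python
# Aggregated extract sketch: hundreds of millions of raw rows shrink to
# one summary row per dimension value, while min/max/sum/avg still work.
from collections import defaultdict

raw = [  # (day, amount) pairs -- imagine 400 million of these
    ("2023-01-01", 10.0),
    ("2023-01-01", 30.0),
    ("2023-01-02", 5.0),
]

agg = defaultdict(
    lambda: {"min": float("inf"), "max": float("-inf"), "sum": 0.0, "cnt": 0}
)
for day, amount in raw:
    a = agg[day]
    a["min"] = min(a["min"], amount)
    a["max"] = max(a["max"], amount)
    a["sum"] += amount
    a["cnt"] += 1

# avg is recoverable as sum/cnt, so the rollup still supports min/max/sum/avg.
print(dict(agg))
```

An extract built this way answers any view over the kept dimensions without ever touching the raw rows again.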

+11

It depends on your setup:

  • If your database (Oracle, SQL Server, etc.) can aggregate 10M to 400M rows quickly, use a live connection and let Tableau push its queries down to it. Keep the heavy work in the database, not in Tableau.

  • If it cannot (or the data sits in flat files), reduce the data first: aggregate or sample it down to some manageable number of rows before it reaches Tableau. A 10M to 400M row table often shrinks by orders of magnitude once you roll it up to the dimensions Tableau will actually show (dates, categories/regions, and so on).
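One common way to cap a huge table at a fixed N rows (not necessarily what this answer's author used, but a standard technique) is reservoir sampling: a single pass with bounded memory, which works fine on a cursor streaming rows out of the database. The size N and the row source below are made up for illustration:

```python
# Reservoir sampling: keep a uniform random sample of n rows from a
# stream of unknown length, in one pass and O(n) memory.
import random

def reservoir_sample(rows, n, seed=0):
    """Return a uniform random sample of n items from any iterable."""
    rng = random.Random(seed)
    sample = []
    for i, row in enumerate(rows):
        if i < n:
            sample.append(row)          # fill the reservoir first
        else:
            j = rng.randrange(i + 1)    # replace with decreasing probability
            if j < n:
                sample[j] = row
    return sample

# Pretend range(1_000_000) is a cursor over a huge table:
sample = reservoir_sample(range(1_000_000), n=1000)
print(len(sample))  # always exactly 1000, however many rows stream past
```

The sampled subset is small enough to load into Tableau directly, at the cost of showing estimates rather than exact figures.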

+1

In short, don't try to pull the raw table straight into Tableau.

Instead, do the heavy lifting in SQL before the data reaches Tableau: aggregate, filter, and join on the database side, and expose only the reduced result.

Then, if needed, turn that reduced result into a Tableau extract so Tableau works from a local, optimized copy.
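The "do it in SQL first" advice can be sketched as materializing a small summary table in the database and pointing Tableau at that instead of the raw table. sqlite3 and the schema below are stand-ins for illustration:

```python
# Pre-aggregate in SQL: build a small summary table once, and let the
# viz tool connect to the summary rather than the raw events.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, clicks INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("mon", 3), ("mon", 7), ("tue", 1)],  # imagine hundreds of millions
)

# One-time rollup; this is the table Tableau would connect to.
conn.execute("""
    CREATE TABLE events_daily AS
    SELECT day, SUM(clicks) AS clicks, COUNT(*) AS n
    FROM events GROUP BY day
""")
daily = conn.execute("SELECT * FROM events_daily ORDER BY day").fetchall()
print(daily)
```

In a real warehouse the same effect comes from a materialized view or a scheduled `CREATE TABLE ... AS SELECT` job.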

0

