Can I use TensorBoard with Google Colab?

Is there a way to use TensorBoard when training a TensorFlow model in Google Colab?

+60
tensorflow tensorboard google-colaboratory
14 answers

UPDATE: You probably want to try the official %tensorboard magic, available from TensorFlow 1.13 onwards.


Before the %tensorboard magic existed, the standard way to achieve this was to proxy network traffic to the Colab VM using ngrok. An example Colab notebook can be found here.

Here are the steps (each code fragment represents a "code" cell in Colab):

  1. Launch TensorBoard in the background.
    Inspired by this answer.

    LOG_DIR = '/tmp/log'
    get_ipython().system_raw(
        'tensorboard --logdir {} --host 0.0.0.0 --port 6006 &'.format(LOG_DIR)
    )
  2. Download and unzip ngrok.
    Replace the link passed to wget with the correct download link for your OS.

    ! wget https://bin.equinox.io/c/4VmDzA7iaHb/ngrok-stable-linux-amd64.zip
    ! unzip ngrok-stable-linux-amd64.zip
  3. Launch the ngrok background process ...

     get_ipython().system_raw('./ngrok http 6006 &') 

    ... and retrieve the public URL. Source

    ! curl -s http://localhost:4040/api/tunnels | python3 -c \
        "import sys, json; print(json.load(sys.stdin)['tunnels'][0]['public_url'])"
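The one-liner above simply reads ngrok's local inspection API and pulls the first tunnel's public URL out of the returned JSON. The same parsing can be sketched in plain Python; note the sample payload below is illustrative, mimicking only the relevant shape of ngrok's response:

```python
import json

def first_public_url(payload):
    """Extract the first tunnel's public URL from ngrok's /api/tunnels JSON."""
    return json.loads(payload)["tunnels"][0]["public_url"]

# Illustrative payload mimicking the shape of ngrok's response (not real output)
sample = json.dumps({"tunnels": [{"public_url": "https://a1b2c3d4.ngrok.io"}]})
print(first_public_url(sample))  # https://a1b2c3d4.ngrok.io
```

If ngrok has opened several tunnels (e.g. both http and https), index `[0]` just takes whichever is listed first.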
+69

Here's an easier way to do the same ngrok tunneling in Google Colab.

 !pip install tensorboardcolab 

then,

    from tensorboardcolab import TensorBoardColab, TensorBoardColabCallback
    tbc = TensorBoardColab()

Assuming you are using Keras:

    model.fit(......, callbacks=[TensorBoardColabCallback(tbc)])

You can read the original post here.

+19

TensorBoard for TensorFlow can be run on Google Colab using the tensorboardcolab library, which uses ngrok for tunneling.

  1. Install tensorboardcolab

!pip install tensorboardcolab

  2. Create a TensorBoardColab object

tbc = TensorBoardColab()

This automatically creates a TensorBoard link that you can use. This TensorBoard reads data from './Graph'.

  3. Create a FileWriter pointing to this location

summary_writer = tbc.get_writer()

The tensorboardcolab library has a method that returns a FileWriter object pointing to the './Graph' location above.

  4. Start adding summary information to the event files in './Graph' using the summary_writer object

You can add scalar information, graphs, or histogram data.

Link: https://github.com/taomanwai/tensorboardcolab

+11

I tried the above but did not get results; using the approach below, I did:

    import tensorboardcolab as tb
    tbc = tb.TensorBoardColab()

After that, open the link from the output.

    import tensorflow as tf
    import numpy as np

Explicitly create a Graph object

    graph = tf.Graph()
    with graph.as_default():

Full example:

    with tf.name_scope("variables"):
        # Variable to keep track of how many times the graph has been run
        global_step = tf.Variable(0, dtype=tf.int32, name="global_step")

        # Increments the above 'global_step' Variable, should be run whenever the graph is run
        increment_step = global_step.assign_add(1)

        # Variable that keeps track of the previous output value:
        previous_value = tf.Variable(0.0, dtype=tf.float32, name="previous_value")

    # Primary transformation Operations
    with tf.name_scope("exercise_transformation"):
        # Separate input layer
        with tf.name_scope("input"):
            # Create input placeholder - takes in a Vector
            a = tf.placeholder(tf.float32, shape=[None], name="input_placeholder_a")

        # Separate middle layer
        with tf.name_scope("intermediate_layer"):
            b = tf.reduce_prod(a, name="product_b")
            c = tf.reduce_sum(a, name="sum_c")

        # Separate output layer
        with tf.name_scope("output"):
            d = tf.add(b, c, name="add_d")
            output = tf.subtract(d, previous_value, name="output")
            update_prev = previous_value.assign(output)

    # Summary Operations
    with tf.name_scope("summaries"):
        tf.summary.scalar('output', output)  # Creates summary for output node
        tf.summary.scalar('product of inputs', b)
        tf.summary.scalar('sum of inputs', c)

    # Global Variables and Operations
    with tf.name_scope("global_ops"):
        # Initialization Op
        init = tf.initialize_all_variables()
        # Collect all summary Ops in graph
        merged_summaries = tf.summary.merge_all()

    # Start a Session, using the explicitly created Graph
    sess = tf.Session(graph=graph)

    # Open a SummaryWriter to save summaries
    writer = tf.summary.FileWriter('./Graph', sess.graph)

    # Initialize Variables
    sess.run(init)

    def run_graph(input_tensor):
        """Helper function; runs the graph with the given input tensor and saves summaries"""
        feed_dict = {a: input_tensor}
        output, summary, step = sess.run([update_prev, merged_summaries, increment_step],
                                         feed_dict=feed_dict)
        writer.add_summary(summary, global_step=step)

    # Run the graph with various inputs
    run_graph([2, 8])
    run_graph([3, 1, 3, 3])
    run_graph([8])
    run_graph([1, 2, 3])
    run_graph([11, 4])
    run_graph([4, 1])
    run_graph([7, 3, 1])
    run_graph([6, 3])
    run_graph([0, 2])
    run_graph([4, 5, 6])

    # Write the summaries to disk
    writer.flush()

    # Flush the summaries to disk and close the SummaryWriter
    writer.close()

    # Close the session
    sess.close()

    # To start TensorBoard after running this file, execute the following command:
    # $ tensorboard --logdir='./improved_graph'
+6

Here's how you can display your models inline on Google Colab. Below is a very simple example showing a placeholder:

    from IPython.display import clear_output, Image, display, HTML
    import tensorflow as tf
    import numpy as np
    from google.colab import files

    def strip_consts(graph_def, max_const_size=32):
        """Strip large constant values from graph_def."""
        strip_def = tf.GraphDef()
        for n0 in graph_def.node:
            n = strip_def.node.add()
            n.MergeFrom(n0)
            if n.op == 'Const':
                tensor = n.attr['value'].tensor
                size = len(tensor.tensor_content)
                if size > max_const_size:
                    tensor.tensor_content = "<stripped %d bytes>" % size
        return strip_def

    def show_graph(graph_def, max_const_size=32):
        """Visualize TensorFlow graph."""
        if hasattr(graph_def, 'as_graph_def'):
            graph_def = graph_def.as_graph_def()
        strip_def = strip_consts(graph_def, max_const_size=max_const_size)
        code = """
            <script>
              function load() {{
                document.getElementById("{id}").pbtxt = {data};
              }}
            </script>
            <link rel="import" href="https://tensorboard.appspot.com/tf-graph-basic.build.html" onload=load()>
            <div style="height:600px">
              <tf-graph-basic id="{id}"></tf-graph-basic>
            </div>
        """.format(data=repr(str(strip_def)), id='graph' + str(np.random.rand()))

        iframe = """
            <iframe seamless style="width:1200px;height:620px;border:0" srcdoc="{}"></iframe>
        """.format(code.replace('"', '&quot;'))
        display(HTML(iframe))

    # Create a sample tensor
    sample_placeholder = tf.placeholder(dtype=tf.float32)

    # Show it
    graph_def = tf.get_default_graph().as_graph_def()
    show_graph(graph_def)

Currently, you cannot start the TensorBoard service in Google Colab the way you start it locally. Also, you cannot export your entire log to your drive via something like summary_writer = tf.summary.FileWriter('./logs', graph_def=sess.graph_def) so that you could then download it and look at it locally.

+4

I use Google Drive's Backup and Sync ( https://www.google.com/drive/download/backup-and-sync/ ). The event files saved to my Google Drive during training are automatically synced to a folder on my computer. Let this folder be called logs. To access the visualizations in TensorBoard, I open the command line, navigate to the synced Google Drive folder, and run: tensorboard --logdir=logs .

So, by automatically syncing my drive with my computer (using Backup and Sync), I can use TensorBoard as if I were training on my own computer.

Edit: Here is a notebook that might be useful: https://colab.research.google.com/gist/MartijnCa/961c5f4c774930f4bdd32d51829da6f6/tensorboard-with-google-drive-backup-and-sync.ipynb

+2

There is an alternative solution, but it requires the TFv2.0 preview. So, if you have no migration problems, try this:

Install tfv2.0 for GPU or CPU (TPU is not yet available):

CPU
tf-nightly-2.0-preview
GPU
tf-nightly-gpu-2.0-preview

    %%capture
    !pip install -q tf-nightly-gpu-2.0-preview
    # Load the TensorBoard notebook extension
    %load_ext tensorboard.notebook

Import TensorBoard as usual:

from tensorflow.keras.callbacks import TensorBoard

Clear or create the folder where you want to save the logs (run these lines before running fit() training):

    # Clear any logs from previous runs
    import time

    !rm -rf ./logs/

    log_dir = "logs/fit/{}".format(time.strftime("%Y%m%d-%H%M%S", time.gmtime()))
    tensorboard = TensorBoard(log_dir=log_dir, histogram_freq=1)

Have fun with TensorBoard! :)

%tensorboard --logdir logs/fit

Here's the official Colab notebook and repo on GitHub

New TFv2.0 alpha release:

CPU
!pip install -q tensorflow==2.0.0-alpha0
GPU
!pip install -q tensorflow-gpu==2.0.0-alpha0

+1

To add to @solver149's answer, here is a simple example of using TensorBoard in Google Colab.

1. Create a graph, for example:

    a = tf.constant(3.0, dtype=tf.float32)
    b = tf.constant(4.0)
    total = a + b

2. Install Tensorboard

    !pip install tensorboardcolab  # install tensorboardcolab if it does not already exist

==> The result in my case:

 Requirement already satisfied: tensorboardcolab in /usr/local/lib/python3.6/dist-packages (0.0.22) 

3. Use it :)

First of all, import TensorBoard from tensorboardcolab (you can use import * to import everything at once), then create your TensorBoardColab object and attach a writer to it as follows:

    from tensorboardcolab import *

    tbc = TensorBoardColab()  # Create a tensorboardcolab object; it automatically creates a link
    writer = tbc.get_writer()  # Create a FileWriter
    writer.add_graph(tf.get_default_graph())  # Add the graph
    writer.flush()

==> Result

    Using TensorFlow backend.
    Wait for 8 seconds...
    TensorBoard link: http://cf426c39.ngrok.io

4. Check this link :D

Tensorboard_Result_Graph_Image

This example was taken from the TensorFlow TensorBoard guide.

+1

I tried to show TensorBoard on Google Colab today. Here is what worked:

    # in case of CPU, you can use this line
    # !pip install -q tf-nightly-2.0-preview

    # in case of GPU, you can use this line
    !pip install -q tf-nightly-gpu-2.0-preview

    # %load_ext tensorboard.notebook  # not working on 22 Apr
    %load_ext tensorboard  # you need to use this line instead

    import tensorflow as tf

"#################
do workouts
"#################

    # show tensorboard
    %tensorboard --logdir logs/fit

Here is an actual example made by Google: https://colab.research.google.com/github/tensorflow/tensorboard/blob/master/docs/r2/get_started.ipynb

+1

TensorBoard works with Google Colab and TensorFlow 2.0:

    !pip install tensorflow==2.0.0-alpha0
    %load_ext tensorboard.notebook
+1

A simple and easy way that I have found so far:

Get the setup_google_colab.py file with wget:

    !wget https://raw.githubusercontent.com/hse-aml/intro-to-dl/master/setup_google_colab.py -O setup_google_colab.py
    import setup_google_colab

To start TensorBoard in the background, expose the port, and click on the link.
I assume that you are already writing the summary values you want to visualize and merging all summaries.

    import os

    os.system("tensorboard --logdir=./logs --host 0.0.0.0 --port 6006 &")
    setup_google_colab.expose_port_on_colab(6006)

After running the above statements, you will be presented with a link like:

 Open https://a1b2c34d5.ngrok.io to access your 6006 port 
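The os.system call above relies on the trailing & to detach TensorBoard from the cell so the notebook keeps running. The same backgrounding idea can be sketched with subprocess, which also gives you a handle to check on the process later; the sleep command below is a harmless stand-in, since tensorboard itself may not be installed:

```python
import subprocess

def launch_in_background(cmd):
    """Start a command without blocking the current cell; return the process handle."""
    return subprocess.Popen(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)

# Placeholder command standing in for:
#   tensorboard --logdir=./logs --host 0.0.0.0 --port 6006
proc = launch_in_background(["sleep", "1"])
print(proc.poll())  # None while the process is still running
```

Unlike a bare os.system("... &"), the Popen handle lets you poll, wait for, or terminate the background server when you are done with it.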

For more help, check out the following git:

 https://github.com/MUmarAmanat/MLWithTensorflow/blob/master/colab_tensorboard.ipynb 
0

Yes, of course, using TensorBoard in Google Colab is quite simple. Follow these steps:

1) Load the TensorBoard extension

 %load_ext tensorboard.notebook 

2) Add it to the Keras callbacks

 tensorboard_callback = tf.keras.callbacks.TensorBoard(logdir, histogram_freq=1) 

3) Run TensorBoard

    %tensorboard --logdir logs

Hope it helps.

0

Many of the answers here are already out of date, and I am sure mine will be too in a few weeks. But at the time of writing, all I had to do was run these lines of code in Colab, and TensorBoard opened just fine.

    %load_ext tensorboard
    %tensorboard --logdir logs
0

Using a summary_writer to write the log for each epoch to a folder, and then running the following magic, worked for me.

    %load_ext tensorboard
    %tensorboard --logdir=./logs
0
