## Series Introduction

This post is part of "A Guide To TensorFlow", a series in which we explore the TensorFlow API and use it to build machine learning models for real-life examples. In this post we will build a complete TensorFlow graph and visualize it in TensorBoard.
Check out the other parts of the series: Part 1, Part 2 and Part 3

## The Graph

The following is the graph we hope to build. We will use name scopes, try multiple summaries, and bring together variables, Ops and everything else we have covered so far.

### Components

- Transformation: This block consists of the graph we've been using in our examples: an input placeholder feeds both a product node and a sum node, their outputs are then added together, and the result is the output of the transformation block.
- Update: This block updates the values of the variables and passes the value from the transformation on to the summaries.
- Variables: This block consists of two variables, one to store the accumulated sum of our outputs, and the other to keep track of how many times we have run the graph.
- Summaries: This block contains all the summaries we will log in TensorBoard. (More on summaries later.)
- Global Ops: This part is responsible for initializing the session and aggregating the summaries.
- Helper Function: This is the function we will use to feed input to our graph and run it.
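Before writing any TensorFlow, the data flow described above can be mirrored in plain Python. This sketch only illustrates the arithmetic the graph will perform; the names are ours, not part of the TensorFlow API:

```python
import math

# State held by the "variables" block
total_output = 0.0
global_step = 0

def run_once(vec):
    """Mirror one run of the graph: transformation, then update."""
    global total_output, global_step
    # Transformation: a sum node and a product node, whose outputs are added
    output = sum(vec) + math.prod(vec)
    # Update: accumulate the output and count the run
    total_output += output
    global_step += 1
    # Summaries: output, running total, and running average
    return output, total_output, total_output / global_step

print(run_once([2, 8]))  # sum = 10, product = 16, so output = 26
```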

## The Code

Let's start by importing TensorFlow and declaring our graph.

    import tensorflow as tf

    graph = tf.Graph()
    with graph.as_default():
        ...


Each of the components mentioned above will be placed under its own name scope. The following template gives the gist of the overall code skeleton of our graph.

    import tensorflow as tf

    # Explicitly create a Graph object
    graph = tf.Graph()

    with graph.as_default():

        with tf.name_scope("variables"):
            ...

        with tf.name_scope("transformation"):
            ...

        with tf.name_scope("update"):
            ...

        with tf.name_scope("summaries"):
            ...

        with tf.name_scope("global_ops"):
            ...

    sess = tf.Session(graph=graph)
    writer = tf.summary.FileWriter('./improved_graph', graph)
    sess.run(init)


### Variables

As mentioned previously, we have two variables to declare: global_step to keep track of how many times the graph has been run, and total_output to keep track of the sum of all output values over time. Note that we set the trainable parameter to False, since we are going to update these variables manually.

    with tf.name_scope("variables"):
        # Variable to keep track of how many times the graph has been run
        global_step = tf.Variable(0, dtype=tf.int32, trainable=False, name="global_step")

        # Variable that keeps track of the sum of all output values over time
        total_output = tf.Variable(0.0, dtype=tf.float32, trainable=False, name="total_output")


### Transformation

The transformation block consists of the input placeholder and the other nodes we need for the computation.

    with tf.name_scope("transformation"):

        # Separate input layer
        with tf.name_scope("input"):
            # Create input placeholder - takes in a vector
            a = tf.placeholder(tf.float32, shape=[None], name="input_placeholder_a")

        # Separate middle layer
        with tf.name_scope("intermediate_layer"):
            b = tf.reduce_sum(a, name="sum_b")
            c = tf.reduce_prod(a, name="product_c")

        # Separate output layer
        with tf.name_scope("output"):
            # Add the sum and the product to produce the block's output
            output = tf.add(b, c, name="output")


### Update

The update block will be responsible for updating the total value and incrementing the value of steps taken.

    with tf.name_scope("update"):
        # Increments the total_output Variable by the latest output
        update_total = total_output.assign_add(output)

        # Increments the global_step Variable; should be run whenever the graph is run
        increment_step = global_step.assign_add(1)


### Summary

The summaries we wish to show in TensorBoard are the output, the sum of the outputs and the average value. These are scalar values, and to log them we need to create operations which we can then execute in our session. Such an operation is created with the tf.summary.scalar() function.
To calculate the average value, we use the tf.div() Op to divide update_total by the increment_step we created in the update block. Note that we use tf.cast() to cast increment_step to the tf.float32 dtype, so that it matches the dtype of total_output.
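The cast is needed because global_step is an integer while total_output is a float; the arithmetic itself is just a division. As a plain-Python sketch (not TensorFlow code):

```python
total = 63.0   # running total after two hypothetical runs
steps = 2      # value of the step counter (an int)

avg = total / float(steps)  # the explicit conversion mirrors tf.cast
print(avg)  # 31.5
```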

We will also create the global ops section, which initializes all the variables and creates a merged summary of the scalars we just created. We will then create a session, open a summary writer and initialize the variables of our graph.

    with tf.name_scope("summaries"):
        # Calculate the average (avg = total / steps)
        avg = tf.div(update_total, tf.cast(increment_step, tf.float32), name="average")

        # Create summaries for the output node
        tf.summary.scalar("output_summary", output)
        tf.summary.scalar("total_summary", update_total)
        tf.summary.scalar("average_summary", avg)

    # Global Variables and Operations
    with tf.name_scope("global_ops"):
        # Initialization Op
        init = tf.global_variables_initializer()
        # Merge all summaries
        merged_summaries = tf.summary.merge_all()

    # Start a Session, using the explicitly created Graph
    sess = tf.Session(graph=graph)

    # Open a SummaryWriter to save summaries
    writer = tf.summary.FileWriter('./improved_graph', graph)

    # Initialize Variables
    sess.run(init)


We use the tf.summary.merge_all() function to merge all the summaries collected in the default graph.

### Helper Function

Thus far we have almost all the code necessary for performing the computation and producing the TensorBoard output. However, we still need a helper function that runs the graph with a given input tensor and saves the summaries.

We need a dictionary to use as the feed_dict for our input placeholder a. Next, we tell the Session to run the graph using our feed_dict, making sure to run the output, increment_step, and merged_summaries Ops.

We need the global_step and merged_summaries values in order to write our summaries, so we save them to the step and summary Python variables. We won't actually need the output value in this function, so we use an underscore _ to indicate that we don't care about storing it; it is a "throwaway" variable.

Finally, we add the summaries to our summary.FileWriter using writer.add_summary().

    def run_graph(input_tensor):
        """
        Helper function; runs the graph with the given input tensor and saves summaries
        """
        feed_dict = {a: input_tensor}
        _, step, summary = sess.run([output, increment_step, merged_summaries],
                                    feed_dict=feed_dict)
        writer.add_summary(summary, global_step=step)


### Almost Done

Now we can run the graph with various inputs by calling our helper function, for example run_graph([2,8]). Once that is done, we must not forget to flush the writer and close it along with our session.

    # Run the graph with various inputs
    run_graph([2, 8])
    run_graph([3, 1, 3, 3])
    run_graph([8])
    run_graph([1, 2, 3])
    run_graph([11, 4])
    run_graph([4, 1])
    run_graph([7, 3, 1])
    run_graph([6, 3])
    run_graph([0, 2])
    run_graph([4, 5, 6])

    # Write the summaries to disk
    writer.flush()

    # Close the SummaryWriter
    writer.close()

    # Close the session
    sess.close()
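As a sanity check on the numbers TensorBoard will display, the same ten inputs can be replayed in plain Python (math.prod stands in for reduce_prod; this checks the arithmetic only, it is not TensorFlow code):

```python
import math

inputs = [[2, 8], [3, 1, 3, 3], [8], [1, 2, 3], [11, 4],
          [4, 1], [7, 3, 1], [6, 3], [0, 2], [4, 5, 6]]

total = 0.0
for vec in inputs:
    # Each run's output is sum + product, accumulated into the total
    total += sum(vec) + math.prod(vec)

print(total, total / len(inputs))  # 355.0 35.5
```

So after all ten runs, the total summary should read 355 and the average summary 35.5.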


Once you run this graph, all the summaries are saved to disk, and they can then be viewed in TensorBoard. Run the following command in your terminal or command prompt: `tensorboard --logdir='./improved_graph'`

You can then view your summaries in the browser.

## Interpreting TensorBoard

We can open TensorBoard by typing localhost:6006 (TensorBoard's default port) in the browser. You can view the graph under the 'Graphs' tab. On this page, each block corresponds to a name scope, and you can expand these blocks to see the Ops under the name scopes.

In the 'Scalars' tab you will see multiple charts.

Notice that the smoothing is set to 0.6 by default; you can reduce it to 0 for a clearer chart that shows the precise value of each instance of our scalar summaries. If you expand these charts, you can see each one individually, and hovering over a chart shows the values that correspond to each point.
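The smoothing slider applies, roughly, an exponential moving average over the logged points, with the slider value as the weight. A rough sketch of the idea (TensorBoard's exact implementation also debiases the early points, which this omits):

```python
def smooth(values, weight):
    """Approximate TensorBoard-style smoothing: an exponential moving average."""
    smoothed, last = [], values[0]
    for v in values:
        last = last * weight + v * (1 - weight)
        smoothed.append(last)
    return smoothed

outputs = [26, 37, 16, 12, 59]   # the first few output_summary values
print(smooth(outputs, 0.0))      # weight 0 reproduces the raw values
print(smooth(outputs, 0.6))      # weight 0.6 damps the swings
```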

### Sum of Results

I hope that you are now fairly comfortable with the TensorFlow API. Now that we are confident about building, running and logging graphs, we can start working with simple machine learning models. In future blogs, we shall work on linear regression, explore optimizers in TensorFlow and even build neural networks.