Hyperparameter optimization in neural networks is generally done heuristically, by varying each individual parameter — such as the learning rate, batch size and number of training steps — one at a time. Sklearn automates this search with its GridSearchCV class, which exhaustively evaluates every combination in a parameter grid.
Usually Sklearn’s examples and documentation are spot-on, and copy/pasting an example works with minimal changes. That wasn’t quite the case, however, when using skflow and sklearn in conjunction.
Below is an example of a TensorFlow neural network, implemented with skflow, undergoing hyperparameter optimization via sklearn’s GridSearchCV:
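Since skflow has long since been folded into TensorFlow and is no longer installable as a standalone package, here is a minimal runnable sketch of the same pattern that substitutes scikit-learn's MLPClassifier for skflow's TensorFlowDNNClassifier; the GridSearchCV wiring (explicit scoring, n_jobs=-1, cv=2) is the part being illustrated, and the dataset and parameter grid are illustrative assumptions, not the original code:

```python
# Sketch of GridSearchCV over a neural-network estimator. MLPClassifier
# stands in for skflow's TensorFlowDNNClassifier, which would slot into
# the same position; the grid below is an illustrative assumption.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

# Hyperparameters to sweep -- analogous to skflow's learning_rate,
# batch_size and steps parameters.
param_grid = {
    "learning_rate_init": [0.01, 0.1],
    "hidden_layer_sizes": [(10,), (20,)],
}

grid = GridSearchCV(
    MLPClassifier(max_iter=200, random_state=0),
    param_grid,
    scoring="accuracy",  # specified explicitly, as skflow estimators require
    n_jobs=-1,           # one job per processor on the host
    cv=2,                # 2-fold cross-validation instead of the default
)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```

After fitting, `grid.best_params_` holds the winning combination and `grid.best_estimator_` is a model refit on the full training set with those parameters.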
In the GridSearchCV call, note that, in contrast to the examples we normally run across, the scoring method needs to be specified manually, since skflow estimators don’t provide one intrinsically. Also note that the n_jobs parameter is set to -1 to run N jobs in parallel, with N being the number of processors on your host. Last, the cv parameter is set to 2, meaning GridSearchCV judges each candidate with 2-fold cross-validation rather than the default of 3 folds. This last option obviously varies on a case-by-case basis.