Fix dict unpacking lint error #356
Conversation
|
Test FAILed. |
|
@dcrankshaw Do you know which version of yapf the test container is using? I pulled the latest yapf and its output disagrees with the one on Jenkins:
Latest yapf: Found 2 Python PEP8 format violations
--- ./integration-tests/deploy_tensorflow_models.py (original)
+++ ./integration-tests/deploy_tensorflow_models.py (reformatted)
@@ -139,9 +139,11 @@
sess.run(train, feed_dict={x: X_train, y_labels: y_train})
if i % 1000 == 0:
print('Cost , Accuracy')
- print(sess.run(
- [loss, accuracy], feed_dict={x: X_train,
- y_labels: y_train}))
+ print(
+ sess.run(
+ [loss, accuracy],
+ feed_dict={x: X_train,
+ y_labels: y_train}))
return sess
--- ./integration-tests/deploy_pyspark_pipeline_models.py (original)
+++ ./integration-tests/deploy_pyspark_pipeline_models.py (reformatted)
@@ -85,8 +85,9 @@
prediction))
# test predict function
- print(predict(spark, model,
- [json.dumps((np.random.randint(1000), "spark abcd"))]))
+ print(
+ predict(spark, model,
+ [json.dumps((np.random.randint(1000), "spark abcd"))]))
try:
clipper_conn = create_docker_connection(
Jenkins: Found 3 Python PEP8 format violations
--- ./clipper_admin/setup.py (original)
+++ ./clipper_admin/setup.py (reformatted)
@@ -26,14 +26,8 @@
package_data={'clipper_admin': ['*.txt', '*/*.yaml']},
keywords=['clipper', 'prediction', 'model', 'management'],
install_requires=[
- 'requests',
- 'subprocess32',
- 'pyyaml',
- 'docker',
- 'kubernetes',
- 'prometheus_client',
- 'six',
- 'cloudpickle>=0.5.2'
+ 'requests', 'subprocess32', 'pyyaml', 'docker', 'kubernetes',
+ 'prometheus_client', 'six', 'cloudpickle>=0.5.2'
],
extras_require={
'PySpark': ['pyspark'],
--- ./integration-tests/deploy_tensorflow_models.py (original)
+++ ./integration-tests/deploy_tensorflow_models.py (reformatted)
@@ -139,11 +139,9 @@
sess.run(train, feed_dict={x: X_train, y_labels: y_train})
if i % 1000 == 0:
print('Cost , Accuracy')
- print(
- sess.run(
- [loss, accuracy],
- feed_dict={x: X_train,
- y_labels: y_train}))
+ print(sess.run(
+ [loss, accuracy], feed_dict={x: X_train,
+ y_labels: y_train}))
return sess
--- ./integration-tests/deploy_pyspark_pipeline_models.py (original)
+++ ./integration-tests/deploy_pyspark_pipeline_models.py (reformatted)
@@ -85,9 +85,8 @@
prediction))
# test predict function
- print(
- predict(spark, model,
- [json.dumps((np.random.randint(1000), "spark abcd"))]))
+ print(predict(spark, model,
+ [json.dumps((np.random.randint(1000), "spark abcd"))]))
try:
clipper_conn = create_docker_connection( |
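(For illustration: the two yapf versions disagree only on line wrapping. A quick sanity check, using simplified snippets modeled on the diff above, confirms that both renderings parse to the same AST, so the disagreement is purely cosmetic:)

```python
import ast

# Two line-wrapping styles for the same call, as produced by the
# two yapf versions in the diffs above (simplified for illustration).
v1 = """print(sess.run(
    [loss, accuracy], feed_dict={x: X_train,
                                 y_labels: y_train}))
"""
v2 = """print(
    sess.run(
        [loss, accuracy],
        feed_dict={x: X_train,
                   y_labels: y_train}))
"""

# ast.dump (without attributes) ignores line numbers and columns,
# so equality here means the code is semantically identical.
same = ast.dump(ast.parse(v1)) == ast.dump(ast.parse(v2))
print(same)  # prints True
```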
|
I formatted the code by hand to comply with the Jenkins version for now. |
|
You can update the yapf submodule to the latest version as part of this PR. It's definitely several months out of date. |
|
Test FAILed. |
|
Test FAILed. |
|
Test FAILed. |
|
Test FAILed. |
|
@dcrankshaw I finally figured it out. It turns out to be a Python 2 vs. Python 3 issue. I should resolve this soon. |
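(A guess at the root cause, based only on the PR title: dict unpacking with `**` inside a dict literal is Python 3.5+ syntax (PEP 448), so a formatter running under Python 2 raises a SyntaxError when it tries to parse such code:)

```python
# Python 3-only dict unpacking in a literal; a Python 2 parser
# (e.g. yapf invoked under Python 2) cannot parse this form.
defaults = {"host": "localhost", "port": 1337}
overrides = {"port": 7000}
merged = {**defaults, **overrides}  # later keys win on conflict
print(merged)
```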
|
Test FAILed. |
|
Test PASSed. |
Fixes #355