RocketCE for Power

How to use Container Images of Tensorflow from RocketCE v1.8.0

    Posted 03-16-2023 02:49

    Overview 

    TensorFlow is an open source library to help you develop and train machine learning models. 

    RocketCE includes container images for TensorFlow v2.10.1, which are based on the packages available in the rocketce channel of anaconda.org. 

    In this version of RocketCE, container images are available for both CPU and GPU. Images are provided for Python 3.8, 3.9, and 3.10, and are supported on both IBM Power9 and Power10 systems running linux-ppc64le.

    CPU Images

    CPU-based images are available in rocketce/tensorflow-cpu and can be pulled using the following command; a worked example with a specific tag follows the list of available tags below.

      • docker pull quay.io/rocketce/tensorflow-cpu

    Available tags

      • py38-rce180 - Based on Python 3.8
      • py39-rce180 - Based on Python 3.9
      • py310-rce180 - Based on Python 3.10
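
    For example, to pull the Python 3.10 CPU image using one of the tags listed above:

      • docker pull quay.io/rocketce/tensorflow-cpu:py310-rce180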

    GPU Images

    GPU-based images are available in rocketce/tensorflow and can be pulled using the following command; an example with a specific tag follows the list of available tags below.

      • docker pull quay.io/rocketce/tensorflow

    Available tags

      • py38-rce180 - Based on Python 3.8
      • py39-rce180 - Based on Python 3.9
      • py310-rce180 - Based on Python 3.10
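
    Similarly, to pull the Python 3.10 GPU image:

      • docker pull quay.io/rocketce/tensorflow:py310-rce180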

    Instructions to run a RocketCE container

    RocketCE container images can be run using the following commands; a concrete example follows the argument descriptions below.

    CPU based

      • docker run -it --memory="<memory>" --volume <path>:/shared-volume <imageid>

    GPU based

      • docker run --gpus <gpus> -it --memory="<memory>" --volume <path>:/shared-volume <imageid>

    • Arguments
      • --gpus   - GPU devices to make available inside the container; use "all" to pass all GPUs, 1 to pass one GPU, and so on
      • --memory  - Memory limit for the container, set based on the memory the application requires; for example, "5g" allocates 5 GB of memory
      • --volume, -v   - Bind mount a volume; <path> is the host folder to be mounted, available inside the container as "/shared-volume"
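
    As a concrete example, the following command runs the GPU image with all GPUs, a 5 GB memory limit, and a host folder mounted at /shared-volume (the host path and image tag here are illustrative; substitute your own):

      • docker run --gpus all -it --memory="5g" --volume /home/user/data:/shared-volume quay.io/rocketce/tensorflow:py310-rce180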

    For more information on docker run arguments, refer to the docker run reference in the Docker documentation.

    The container can be verified with the following command (the same check works in both the CPU and GPU containers):

      • echo 'import tensorflow as tf; x = tf.constant([[1., 2., 3.], [4., 5., 6.]]); print(x)' | python
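
    In the GPU container, you can additionally confirm that TensorFlow detects the GPUs passed through with --gpus; a minimal check is:

      • echo 'import tensorflow as tf; print(tf.config.list_physical_devices("GPU"))' | python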

    Important packages available as part of these containers are:

    • CPU container
      • tensorflow-cpu - 2.10.1
      • tensorflow-base - 2.10.1
      • tensorboard - 2.10.1
      • keras - 2.10.0
      • tensorflow-estimator - 2.10
      • tensorflow-data-server - 0.6.1
      • openblas - 0.3.21
      • numpy - 1.23.4

    • GPU container
      • tensorflow - 2.10.1
      • tensorflow-base - 2.10.1
      • tensorboard - 2.10.1
      • keras - 2.10.0
      • tensorflow-estimator - 2.10
      • tensorflow-data-server - 0.6.1
      • cudatoolkit - 11.4.4
      • openblas - 0.3.21
      • numpy - 1.23.4
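
    To confirm which versions are actually installed in a running container, a quick check along the same lines as the verification command above is:

      • echo 'import tensorflow as tf, numpy, keras; print(tf.__version__, numpy.__version__, keras.__version__)' | python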


    ------------------------------
    Rajesh Nukala
    Rocket Internal - All Brands
    ------------------------------