Implementation of ONNX operators in SOFIE


In this post I walk through all the steps required to implement an operator in SOFIE, using the Leaky ReLU operator as an example.

Definition of Leaky ReLU

The definition is available in the ONNX documentation here.
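
For reference, Leaky ReLU is an element-wise operation with a single attribute, alpha, whose default value is 0.01; instead of clamping negative inputs to zero, it scales them by alpha:

f(x) = alpha * x   for x < 0
f(x) = x           for x >= 0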

Working and Implementation

  • Register the operator in OperatorList.hxx, located here:
#include "TMVA/ROperator_LeakyRelu.hxx"
  • Add the operator header to CMakeLists.txt, located here:
TMVA/ROperator_LeakyRelu.hxx
  • Implement a new C++ class for the operator here:
#ifndef TMVA_SOFIE_ROPERATOR_LeakyRelu
#define TMVA_SOFIE_ROPERATOR_LeakyRelu

#include "TMVA/SOFIE_common.hxx"
#include "TMVA/ROperator.hxx"
#include "TMVA/RModel.hxx"

#include <sstream>

namespace TMVA{
namespace Experimental{
namespace SOFIE{

template <typename T>
class ROperator_LeakyRelu final : public ROperator
{

private:

   /* Attributes*/
   float falpha=0.01; //default value
   std::string fNX;
   std::string fNY;
   std::vector<size_t> fShape;
   std::string fType;

Here we have five attributes:

  1. falpha is the alpha attribute mentioned in the ONNX definition of Leaky ReLU; its default value is 0.01.
  2. fNX is the name of the input tensor.
  3. fNY is the name of the resulting output tensor.
  4. fShape is the shape of the output tensor, which is the same as that of the input tensor.
  5. fType records the data type and is used to check whether the type is supported by SOFIE.
public:
   ROperator_LeakyRelu(){}
   ROperator_LeakyRelu(float alpha,std::string nameX, std::string nameY):
   falpha(alpha),fNX(UTILITY::Clean_name(nameX)), fNY(UTILITY::Clean_name(nameY))
   {
      if(std::is_same<T, float>::value){
         fType = "float";
      }
      else {
         throw std::runtime_error("TMVA SOFIE Encountered unsupported type parsing a Leaky Relu operator");
      }
   }
  1. We declare two constructors here: a default constructor, and a parameterised constructor whose arguments are alpha, nameX (the input tensor name) and nameY (the output tensor name).
  2. We also check the data type: if it is float we set fType to "float", otherwise we throw an exception about an unsupported type while parsing the Leaky ReLU operator.
   std::vector<ETensorType> TypeInference(std::vector<ETensorType> input){
      return input;
   }

The TypeInference method returns the type of the output tensor, which is the same as the input type.

   std::vector<std::vector<size_t>> ShapeInference(std::vector<std::vector<size_t>> input){
      auto ret = input; //suggest copy to compiler
      return ret;
   }

The ShapeInference method returns the shape of the output tensor, which is the same as the input shape.

   void Initialize(RModel& model){
      if (model.CheckIfTensorAlreadyExist(fNX) == false){   //input must be a graph input, or already initialized intermediate tensor
         throw std::runtime_error("TMVA SOFIE Leaky Relu Op Input Tensor is not found in model");
      }
      fShape = model.GetTensorShape(fNX);
      model.AddIntermediateTensor(fNY, model.GetTensorType(fNX), fShape);
   }
  1. In the Initialize method we check whether the input is a graph input or an already initialised intermediate tensor; if not, we throw an exception.
  2. We then add the output tensor to the model. AddIntermediateTensor is defined in RModel.cxx and takes the tensor name, type and shape as arguments; here the output tensor has the same type and shape as the input tensor.
   std::string Generate(std::string OpName){
      OpName = "op_" + OpName;
      if (fShape.empty()) {
         throw std::runtime_error("TMVA SOFIE Transpose Leaky Relu called to Generate without being initialized first");
      }
      std::stringstream out;
      size_t length = ConvertShapeToLength(fShape);

      out << SP << "float " << OpName << "_alpha = " << std::setprecision(std::numeric_limits<float>::max_digits10) << falpha << ";\n";

      out << "\n//------ LEAKY RELU\n";
      out << SP << "for (int id = 0; id < " << length << " ; id++){\n";
      out << SP << SP << "tensor_" << fNY << "[id] = ((tensor_" << fNX << "[id] > 0 )? tensor_" << fNX << "[id] : "<< OpName << "_alpha * tensor_"<< fNX<<"[id]);\n";
      out << SP << "}\n";
      return out.str();
   }

};

}//SOFIE
}//Experimental
}//TMVA


#endif //TMVA_SOFIE_ROPERATOR_LeakyRelu

In the Generate function we emit the actual definition of the operator, as specified in the ONNX documentation.

  1. First we check whether the shape of the operator has been set; if not, we throw an exception.
  2. We convert the shape to a length using the ConvertShapeToLength function defined in SOFIE_common.cxx, which multiplies together all the dimensions of the tensor to obtain its total number of elements.
  3. We emit the alpha attribute as a float with full precision (std::numeric_limits<float>::max_digits10); its default value is 0.01 if no value is provided.
  4. Finally, we implement the Leaky ReLU operator according to the definition below; a sketch of the generated code follows it.
f(x) = alpha * x   for x < 0
f(x) = x           for x >= 0
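
To make this concrete, this is roughly what the string returned by Generate looks like for a hypothetical operator named "0" with an input tensor X, an output tensor Y and a length of 4 (the tensor names and the length are made up for this example); note that alpha is printed with max_digits10 precision, so 0.01 appears as 0.00999999978:

   float op_0_alpha = 0.00999999978;

//------ LEAKY RELU
   for (int id = 0; id < 4 ; id++){
      tensor_Y[id] = ((tensor_X[id] > 0 )? tensor_X[id] : op_0_alpha * tensor_X[id]);
   }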

Note:

I faced an error regarding the length, which is shown below:

/TMVA/ROperator_LeakyRelu.hxx:69:48: error: use of undeclared identifier 'length'

My mentor Lorenzo Moneta helped me fix the error by adding a line here. It was a general error affecting all SOFIE operators that do not have a weight tensor.

  • Declare the function make_ROperator_LeakyRelu in RModelParser_ONNX.hxx, which can be found here:
std::unique_ptr<ROperator> make_ROperator_LeakyRelu(const onnx::NodeProto& nodeproto, const onnx::GraphProto& graphproto, std::unordered_map<std::string, ETensorType>& tensor_type);

Then add the operator to the unordered map factoryMethodMap:

   {"LeakyRelu", &make_ROperator_LeakyRelu}
  • Define the function make_ROperator_LeakyRelu in RModelParser_ONNX.cxx, as mentioned here. How the ONNX parser works is covered in my earlier blog; here is its documentation. A rough sketch of the function is shown below.
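
Since the parser source is only linked above, here is a rough sketch of what make_ROperator_LeakyRelu can look like, following the pattern used by the other make_ROperator_* functions; the exact error messages and details are my own paraphrase, so please check the linked file for the real implementation.

std::unique_ptr<ROperator> make_ROperator_LeakyRelu(const onnx::NodeProto& nodeproto, const onnx::GraphProto& /*graphproto*/, std::unordered_map<std::string, ETensorType>& tensor_type){

   // The input tensor must already have a registered type.
   auto input_name = nodeproto.input(0);
   auto it = tensor_type.find(input_name);
   if (it == tensor_type.end()){
      throw std::runtime_error("TMVA::SOFIE ONNX Parser LeakyRelu op has input tensor " + input_name + " but its type is not yet registered");
   }
   ETensorType input_type = it->second;

   // Read the optional alpha attribute; the default of 0.01 matches the ONNX spec.
   float attr_alpha = 0.01;
   for (int i = 0; i < nodeproto.attribute_size(); i++){
      if (nodeproto.attribute(i).name() == "alpha")
         attr_alpha = nodeproto.attribute(i).f();
   }

   // Instantiate the operator for the supported type(s).
   std::unique_ptr<ROperator> op;
   std::string output_name = nodeproto.output(0);
   switch(input_type){
   case ETensorType::FLOAT:
      op.reset(new ROperator_LeakyRelu<float>(attr_alpha, input_name, output_name));
      break;
   default:
      throw std::runtime_error("TMVA::SOFIE ONNX Parser LeakyRelu does not yet support the input type of tensor " + input_name);
   }

   // Register the output tensor type so that downstream operators can look it up.
   tensor_type[output_name] = input_type;
   return op;
}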

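Once the operator is registered with the parser, the usual SOFIE workflow picks it up automatically when an ONNX model contains a LeakyRelu node. A minimal usage sketch (the file names are hypothetical):

   using namespace TMVA::Experimental;
   SOFIE::RModelParser_ONNX parser;
   SOFIE::RModel model = parser.Parse("model.onnx");   // LeakyRelu nodes become ROperator_LeakyRelu instances
   model.Generate();                                   // runs each operator's Initialize and Generate
   model.OutputGenerated("model.hxx");                 // writes the generated inference code to a header
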
I hope this gives you a general idea of how ONNX operators are implemented in SOFIE! I will be back with more interesting content soon. Until then, goodbye!

Thanks and Regards, Neel Shah
