Object detection with TensorFlow.js: model not able to detect


I'm trying to do detection in a web application with a custom YOLOv4 Darknet model, which has been converted to TensorFlow following the steps here: https://github.com/hunglc007/tensorflow-yolov4-tflite

The model has then been converted to TensorFlow.js following the steps here: https://github.com/tensorflow/tfjs/tree/master/tfjs-converter

My problem is that the model is not able to predict. When I try making a prediction, I get this output tensor:

Tensor {kept: false, isDisposedInternal: false, shape: Array(3), type: 'float32', size: 0, …}
dataId: {id: 1725}
dtype: "float32"
id: 947
isDisposedInternal: false
kept: false
rankType: "3"
scopeId: 1213
shape: Array(3)
0: 1
1: 14
2: 5
length: 3
[[Prototype]]: Array(0)
size: 70
strides: (2) [70, 5]
isDisposed: (...)
rank: (...)
[[Prototype]]: Object
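
To see the actual numbers rather than just the tensor metadata, the result of executeAsync can be read back into plain arrays. Below is a minimal sketch (the helper name is mine; note that executeAsync may return a single tensor or an array of tensors depending on the model):

// Hypothetical helper: dump the values of every output tensor.
// Assumes `model` is the loaded GraphModel and `input` the preprocessed image tensor.
const dumpPrediction = async (model: tf.GraphModel, input: tf.Tensor) => {
  const result = await model.executeAsync(input)
  // executeAsync returns tf.Tensor | tf.Tensor[]
  const tensors = Array.isArray(result) ? result : [result]
  for (const t of tensors) {
    console.log(t.shape, await t.array()) // actual values, not just metadata
    t.dispose()
  }
}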

I do not really know what the problem is and would really appreciate some help! My code for loading the model and making the prediction is provided below. I'm not sure if the shape of the image tensor is wrong, because when I look into my model.json file, this is the start of the file:

{
  "format": "graph-model",
  "generatedBy": "2.3.0-rc0",
  "convertedBy": "TensorFlow.js Converter v3.14.0",
  "signature": {
    "inputs": {
      "input_1": {
        "name": "input_1:0",
        "dtype": "DT_FLOAT",
        "tensorShape": {
          "dim": [
            { "size": "-1" },
            { "size": "416" },
            { "size": "416" },
            { "size": "3" }
          ]
        }
      }
    },

Here the tensor shape is [-1, 416, 416, 3], but the shape of my image tensor is [1, 416, 416, 3]. I do not know how to change this, or if it is even possible to change.
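
For what it's worth, a size of -1 in the signature usually just denotes a dynamic dimension (here the batch size), so a [1, 416, 416, 3] input should already be compatible. The declared input and output shapes can also be checked at runtime; a minimal sketch, assuming the loaded GraphModel exposes its signature via the inputs/outputs properties as in recent tf.js versions:

// Sketch: inspect the signature the converter produced on the loaded model.
const inspectModel = (model: tf.GraphModel) => {
  console.log('inputs:', model.inputs.map(i => ({ name: i.name, shape: i.shape, dtype: i.dtype })))
  console.log('outputs:', model.outputs.map(o => ({ name: o.name, shape: o.shape, dtype: o.dtype })))
}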

import * as tf from '@tensorflow/tfjs'
import { useEffect, useRef, useState } from 'react'

const WEIGHTS = '/model/model.json'
const [model, setModel] = useState<tf.GraphModel | null>(null)
const imageRef = useRef<HTMLImageElement>(null)

// Load the converted graph model once and keep it in component state
const loadModel = async () => {
  const model = await tf.loadGraphModel(WEIGHTS)
  setModel(model)
}

useEffect(() => {
  loadModel()
}, [])

const predict = async (model: tf.GraphModel) => {
  if (model && imageRef.current) {
    const image = imageRef.current
    // Convert the image element to a tensor and add a batch dimension
    const img = tf.browser.fromPixels(image)
    const resized = img.cast('float32').expandDims(0)
    console.log(resized.shape) // shape is [1, 416, 416, 3]
    const prediction = await model.executeAsync(resized)
    console.log(prediction)
  }
}
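
One thing that may also be worth checking (this is an assumption on my part, not something confirmed above) is the preprocessing: the yolov4-tflite export pipeline typically expects the input resized to 416×416 with pixel values scaled to [0, 1], whereas the code above feeds raw 0–255 values. A hedged preprocessing sketch could look like this:

// Sketch: resize and normalize before running the model.
// Assumes the exported model expects inputs in [0, 1]; adjust if the
// conversion used a different preprocessing.
const preprocess = (image: HTMLImageElement) =>
  tf.tidy(() => {
    const pixels = tf.browser.fromPixels(image)
    // Match the 416x416 input signature from model.json
    const resized = tf.image.resizeBilinear(pixels, [416, 416])
    // Scale pixel values from [0, 255] to [0, 1]
    const normalized = resized.div(255.0)
    // Add the batch dimension -> [1, 416, 416, 3]
    return normalized.expandDims(0)
  })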

There is 1 answer below.


I think you didn't wait for TensorFlow.js to be ready before loading the model and running the prediction.

...
...
const WEIGHTS = require('./model/model.json')
// Some other code

useEffect(() => {
  // Wait for the TensorFlow.js backend to be initialized before loading the model
  tf.ready().then(() => {
    loadModel()
  })
}, [])
...
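
As a follow-up usage sketch (an assumption about how predict is wired up, since that part isn't shown in the question), the prediction itself can also be gated on the model state so it only runs once loading has actually finished and the image is available:

// Sketch: only run predict once the model state has been set
// and the referenced image has finished loading.
useEffect(() => {
  if (model && imageRef.current?.complete) {
    predict(model)
  }
}, [model])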