Hi, I’m using the FVP Corstone 300.
I wonder if it has any file I/O support?
I saw that the TensorFlow models are encoded into a C++ file and compiled in, and I'm wondering whether the application can read from/write to a file instead?
Hi,
There is no traditional file I/O support in the bare-metal applications in this repository. However, the FVP does support loading custom data at a specified address in the system's memory map. If you need to run inferences on your custom network and do not want to convert the model file into C++ arrays (this conversion happens during the CMake configuration of the project), you can build the inference_runner use case application with dynamic model loading support. This is explained in the following sections:
- How to build: docs/use_cases/inference_runner.md#running-the-fvp-with-dynamic-model-loading
- How to execute: docs/use_cases/inference_runner.md#running-the-fvp-with-dynamic-model-loading
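As a rough sketch of what the documented flow looks like, the FVP's `--data` option injects a file into memory at a given address before the application starts. The paths, file names, and load addresses below are placeholders for illustration; the exact addresses and any additional options to use are given in the inference_runner documentation linked above.

```shell
# Launch the Corstone-300 FVP with the dynamic-model-loading inference_runner
# build, injecting a model file (and optionally an input feature map) into
# memory with --data. Paths and addresses here are illustrative placeholders;
# consult docs/use_cases/inference_runner.md for the actual values.
FVP_Corstone_SSE-300_Ethos-U55 \
    -a build/bin/ethos-u-inference_runner.axf \
    --data /path/to/custom_model_after_vela.tflite@0x90000000 \
    --data /path/to/custom_ifm.bin@0x92000000
```

The application then reads the model from that fixed address at runtime instead of relying on a model baked into the binary at build time.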
Hope this helps.
Oh, thanks!
I think that’s exactly what I’m looking for.