
Searched refs:inference (Results 1 – 11 of 11) sorted by relevance

/openbmc/linux/net/ipv4/
tcp_lp.c:87   u32 inference; (member)
tcp_lp.c:110  lp->inference = 0; in tcp_lp_init()
tcp_lp.c:284  lp->inference = 3 * delta; in tcp_lp_pkts_acked()
tcp_lp.c:287  if (lp->last_drop && (now - lp->last_drop < lp->inference)) in tcp_lp_pkts_acked()
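The tcp_lp.c matches above show how TCP-LP (the Low Priority congestion control module) uses its `inference` field: the window is set to three times a measured delay sample, and a timestamp falling within that window of the last drop is treated as part of the same congestion episode. The following is a minimal standalone sketch of that logic, not the kernel implementation; the struct and helper names are hypothetical stand-ins for the kernel's `struct lp` fields.

```c
#include <stdint.h>

/* Simplified model of the inference-window logic matched above.
 * Units of `now`, `last_drop`, and `inference` are whatever clock
 * the caller uses (the kernel uses its own timestamp units). */
struct lp_state {
    uint32_t inference;  /* inference window length; 0 until a sample arrives */
    uint32_t last_drop;  /* timestamp of the last detected drop; 0 = none yet */
};

static void lp_init(struct lp_state *lp)
{
    lp->inference = 0;
    lp->last_drop = 0;
}

/* On each delay sample, the window becomes 3x the sample,
 * mirroring `lp->inference = 3 * delta` in tcp_lp_pkts_acked(). */
static void lp_update_inference(struct lp_state *lp, uint32_t delta)
{
    lp->inference = 3 * delta;
}

/* Returns 1 while `now` is still inside the inference window after a
 * drop, mirroring the condition at tcp_lp.c:287. */
static int lp_within_inference(const struct lp_state *lp, uint32_t now)
{
    return lp->last_drop && (now - lp->last_drop < lp->inference);
}
```

With a delay sample of 10 the window is 30, so a timestamp 20 units after the last drop is still inside the window while one 40 units after is not.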
/openbmc/linux/drivers/accel/ivpu/
Kconfig:14  inference accelerator for Computer Vision and Deep Learning applications.
/openbmc/openbmc/meta-openembedded/meta-python/recipes-devtools/python/
python3-astroid_3.1.0.bb:1  SUMMARY = "An abstract syntax tree for Python with inference support."
/openbmc/linux/drivers/accel/qaic/
Kconfig:15  designed to accelerate Deep Learning inference workloads.
/openbmc/linux/drivers/accel/habanalabs/
Kconfig:18  designed to accelerate Deep Learning inference and training workloads.
/openbmc/openbmc/poky/meta/recipes-support/db/db/
0001-configure-Add-explicit-tag-options-to-libtool-invoca.patch:6  This helps cross compile when tag inference via heuristics
/openbmc/openbmc/meta-openembedded/meta-oe/recipes-support/opencv/
opencv_4.9.0.bb:113  …envino] = "-DWITH_OPENVINO=ON,-DWITH_OPENVINO=OFF,openvino-inference-engine,openvino-inference-eng…
/openbmc/linux/Documentation/accel/
introduction.rst:19  - Edge AI - doing inference at an edge device. It can be an embedded ASIC/FPGA,
/openbmc/linux/Documentation/accel/qaic/
aic100.rst:13  inference workloads. They are AI accelerators.
/openbmc/openbmc/poky/scripts/esdk-tools/
runqemu:33  Dynamic path inference can be avoided by passing a *.qemuboot.conf to\n \
/openbmc/openbmc/poky/scripts/
runqemu:33  Dynamic path inference can be avoided by passing a *.qemuboot.conf to\n \