
Searched refs:attentions (Results 1 – 7 of 7) sorted by relevance

/openbmc/openpower-hw-diags/test/
test-end2end.cpp
52 std::vector<attn::Attention> attentions; in main() local
54 attentions.emplace_back(attn::Attention::AttentionType::Special, in main()
57 attentions.emplace_back(attn::Attention::AttentionType::Checkstop, in main()
60 attentions.emplace_back(attn::Attention::AttentionType::Vital, in main()
63 std::for_each(std::begin(attentions), std::end(attentions), in main()
/openbmc/openpower-hw-diags/attn/
Attention_Handler.md
27 The attention handler services four types of attentions namely, and in order of
29 Attention (BP) and Checkstop Attention (checkstop). TI and BP attentions are
34 The general handling of attentions is as follows:
84 enabled processor will be queried and a map of active attentions will be
88 used to determine active attentions.
135 These attentions are used to signal to the attention handler that it should
142 attentions.
155 These attentions indicate that a hardware error has occurred and further
181 handler will return to listening for attentions. In most cases no more
182 attentions will be detected unless the currently active attentions are cleared.
[all …]
/openbmc/openbmc/meta-openpower/recipes-phosphor/logging/
openpower-libhei_git.bb
4 "The library provides a set of tools to isolate hardware attentions driven \
/openbmc/linux/Documentation/ABI/testing/
sysfs-devices-platform-ipmi
118 What: /sys/devices/platform/ipmi_si.*/attentions
150 attentions (RO) Number of times the driver got an
/openbmc/openpower-hw-diags/analyzer/ras-data/
ras-data-definition.md
4 return a list of active attentions in the hardware, referred to as `signatures`.
/openbmc/linux/drivers/char/ipmi/
ipmi_si_intf.c
807 smi_inc_stat(smi_info, attentions); in smi_event_handler()
1636 IPMI_SI_ATTR(attentions);
/openbmc/linux/drivers/scsi/
st.c
971 int attentions, waits, max_wait, scode; in test_ready() local
979 for (attentions=waits=0; ; ) { in test_ready()
996 if (attentions < MAX_ATTENTIONS) { in test_ready()
997 attentions++; in test_ready()