What is preventing compile time evaluation of this constexpr function?


I'm working on a class for representing a set of hardware pins of a microcontroller (STM32). The selected pins may be discontinuous on the port, but they are assumed to be ordered. For example, if this PortSegment object is created to represent PA2, PA3 & PA6 pins, I want to be able to make an assignment like segment = 0b101u , which sets PA2 & PA6 and resets PA3.

I haven't implemented a ctor for discontinuous pins yet. The current one can only represent contiguous pins like PA2, PA3 & PA4. However, the logic for mapping compressed bits (like 0b101u in the above example) to actual hardware bits is already implemented for the discontinuous case.

I thought an assignment like segment = 0b101u could be computed mostly at compile time, so that only the load of the actual hardware register (BSRR for STM32, which handles atomic set & reset of hardware pins) would take place at run time, using a pre-calculated value. Unfortunately that isn't what happens, and the value to be loaded into BSRR is also calculated at run time.

Here is a somewhat simplified and half-baked version of the code I'm testing. The port selection (GPIOA, GPIOB etc.) code is omitted.

#include <cstdint>

volatile uint32_t BSRR {0}; // Assume it's a HW register for atomic pin access.

class PortSegment {
public:

    constexpr PortSegment(uint8_t start, uint8_t end)
    : selection{calculateSelection(start, end)} {}

    uint16_t operator=(uint16_t setVal) const;
//  operator uint16_t() const; // to be implemented later

private:

    static constexpr uint16_t calculateSelection(uint8_t start, uint8_t end);
    static constexpr uint16_t mapBits(uint16_t val, uint16_t selection);

    uint16_t selection; // Table of used bits in the port

};

// Used in ctor
constexpr uint16_t PortSegment::calculateSelection(uint8_t start, uint8_t end)
{
    uint16_t result {0};
    for (unsigned i = start; i <= end; ++i) result |= (1u << i);
    return result;
}

// static function
constexpr uint16_t PortSegment::mapBits(uint16_t val, uint16_t selection)
{
    uint16_t result {0};
    for (unsigned i = 0; i < 16; ++i) {
        if (selection & 1u)  {
            if (val & (1u << i)) {
                result |= (1u << i);
            }
        }
        else {
            val <<= 1;
        }
        selection >>= 1;
    }
    return result;
}

inline uint16_t PortSegment::operator=(uint16_t setVal) const
{
    uint32_t mapped {mapBits(setVal, selection)};
    BSRR = ((~mapped << 16) | mapped)
            & ((static_cast<uint32_t>(selection) << 16) | selection);
    return setVal;
}

int main()
{
    constexpr PortSegment segment {2,5}; // Use port pins 2,3,4,5
    segment = 0b1010u;
}

The selection member variable represents the pins used in the port. For example, 0b111100 means use PA2, PA3, PA4, PA5. The problem is that the mapBits() function is not evaluated at compile time. I also tried making it a non-static member function, but nothing changed. According to my logic, when the segment object of the PortSegment class is created, everything is already known at compile time, so the value to be loaded into BSRR could be known as well. But it seems I'm missing something.

Another strange thing I discovered is that if I change selection >>= 1; in the mapBits() function to selection <<= 1; (which makes no sense for the algorithm), mapBits() is evaluated at compile time.

Here is the code in Godbolt.

1 Answer

Accepted answer:

You have set optimisation to level 1 in Godbolt! Try -O3 instead of -O1. A constexpr function is only required to be evaluated at compile time when its result is used in a constant expression; in a plain run-time call like segment = 0b1010u, constant folding is merely an optimisation, and -O1 doesn't perform it here.