[Dwarf-Discuss] EXTERNAL: Corner-cases with bitfields

Pedro Alves alves.ped at gmail.com
Tue May 10 02:15:38 PDT 2022

On 2022-05-09 22:41, Robinson, Paul wrote:
>> Pedro Alves wrote:
>> On 2022-05-09 16:48, Ron Brender via Dwarf-Discuss wrote:
>>> So my suggestion is to file a bug report with CLANG, requesting they
>> correct their DWARF output to reflect all details needed
>>> by your language.
>> An issue here is that DWARF does say this, in (DWARF 5, 5.7.6 Data Member
>> Entries, page 119):
>>
>>  "If the size of a data member is not the same as the size of the type
>>   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>>  given for the data member, the data member has either a DW_AT_byte_size
>>  ^^^^^^^^^^^^^^^^^^^^^^^^^^
>>  or a DW_AT_bit_size attribute whose integer constant value (see
>>  Section 2.19 on page 55) is the amount of storage needed to hold the
>>  value of the data member."
>> Note the part I underlined.  In Lancelot's case, the size of the data
>> member IS the same as the size of the type given for the data member.
>> So Clang could well pedantically claim that they _are_ following the
>> spec.  Shouldn't the spec be clarified here?
> What the spec says is that a producer isn't _required_ to emit the
> DW_AT_bit_size attribute.  But, given that DWARF is a permissive
> standard, the producer is certainly _allowed_ to emit the attribute.  
> If this is a hint that the target debugger will understand, regarding
> the ABI, it seems okay to me for the producer to do that.
>> This then raises the question of whether a debugger can assume that the
>> presence of a DW_AT_bit_size attribute indicates that the field is a
>> bit field at the C/C++ source level.  GDB is assuming that today, as
>> there's really no other way to tell, but I don't think the spec
>> explicitly says so.
> GDB is choosing to make that interpretation, which it's allowed to do.
> The DWARF spec just doesn't promise that interpretation is correct.

OOC, do you know of any consumer that makes a different interpretation?

> You can propose to standardize that interpretation by filing an issue
> with the DWARF committee at https://dwarfstd.org/Comment.php and it might
> or might not become part of DWARF v6.  

We're just discussing potential approaches here on the open dwarf-discuss
list, and hearing about others' experience and guidance, before deciding
whether an issue should be filed at all and, if one is needed, which
approach or approaches would have the best chances of being accepted.

> It might be tricky because you'd
> be generalizing something very specific to your environment.

I actually don't see why this is specific to one environment.  See below.

> You can also, separately, try to get Clang to emit the DW_AT_bit_size
> attribute in these cases for the AMDGPU target(s).  This seems more
> likely to work, especially as there's an ABI requirement involved, and
> (given that GDB makes this interpretation) I assume gcc already does this.

GCC emits DW_AT_bit_size for bitfields for all targets and ABIs AFAICT.  At:


we have:

  if (DECL_BIT_FIELD_TYPE (decl))
    {
      add_byte_size_attribute (decl_die, decl);
      add_bit_size_attribute (decl_die, decl);
      add_bit_offset_attribute (decl_die, decl);
    }
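
Concretely, for a member declared `char a : 8;` those calls produce a member
DIE along these lines (a hand-written sketch in `readelf --debug-dump=info`
style; the offsets and abbrev numbers are made up for illustration, and the
exact offset attribute depends on the DWARF version requested):

 <2><3a>: Abbrev Number: 4 (DW_TAG_member)
    <3b>   DW_AT_name              : a
    <3f>   DW_AT_type              : <0x55>    (char)
    <43>   DW_AT_byte_size         : 1
    <44>   DW_AT_bit_size          : 8
    <45>   DW_AT_data_bit_offset   : 0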

I don't see why this needs to be restricted to some ABIs.  I mean, with
DWARF emitted by GCC, with code like this:

 struct Foo
 {
   char a : 8;
   char b : 8;
 };
 Foo foo;
 int main () {}

you get, on x86:

 (gdb) ptype foo
 type = struct Foo {
     char a : 8;
     char b : 8;
 }

and with DWARF emitted by Clang, you get:

 (gdb) ptype foo
 type = struct Foo {
     char a;
     char b;
 }

Sure, on x86, the bitfield and non-bitfield types are laid out in memory the
same.  But still, at the language level, the types ARE different.  And DWARF
is also about mapping the language source to the machine code.  C/C++ don't
let you take the address of a bitfield, for example, so it's possible to
write type expressions in the debugger that behave differently depending on
what the DWARF described.  From the perspective of GDB, IMO, this is a bug,
even on x86.

So in my view, you'd want a producer to always indicate that the field
is a bitfield somehow, even if the memory layout wouldn't change, i.e.,
regardless of ABI.  

Maybe we can convince the Clang folks to do the same as GCC; that certainly
sounds simpler.  But I figured there might be agreement that the spec itself
should be tweaked in this direction, if every producer would come to the
same conclusion anyway.

Pedro Alves
