Re: [sv-bc] confusion in determining the type of a self-determined binary expression during evaluation of the type operator

From: Greg Jaxon <Greg.Jaxon_at_.....>
Date: Tue Oct 23 2007 - 17:57:40 PDT
I think I'm in violent agreement with Yulik and Steven that the non-selected
dimensions should remain as declared, and that the selected dimension either
vanishes (bit-select) or is normalized (part-select).

Preserving the declared bounds of selected elements is not optional, because
it affects the interpretation of assignment patterns such as a[5] = '{3: ..., 4: ...}.
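A small SystemVerilog sketch of this point, using the declaration from Yulik's example later in the thread:

```systemverilog
// Sketch: the index keys of an assignment pattern are interpreted
// against the declared bounds of the target, so those bounds must be
// preserved when an element is selected.
bit [2:1] a [7:5][4:3];

initial begin
  a[5] = '{3: 2'b01, 4: 2'b10};  // keys 3 and 4 match the declared [4:3]
  // If the bounds of a[5] were normalized to [1:0], keys 3 and 4 would
  // be out of range and this pattern would be rejected.
end
```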

Greg

Steven Sharp wrote:
>> From: "Feldman, Yulik" <yulik.feldman@intel.com>
> 
>> As a rule of thumb, I think that the type of a part select should be the same
>> as the type of the data selected by it. I.e., if we have "bit [2:1] a
>> [7:5][4:3];", the type of "a[5]" should be "bit [2:1] (imaginary
>> placeholder) [4:3]", without further normalization. 
> 
> I don't agree about the type of a part select.  However, the example
> you give here is not a part select.  It selects part of an object,
> but "part select" is a technical term for a specific kind of select,
> which you are not using here.
> 
> I can accept making a select of an element of an array have the type
> of that element.  Clearly this must be done for arrays of classes or
> unpacked structs.  It seems reasonable for an array of unpacked arrays
> also.  This is not really essential, though.  I don't think those are
> very strongly typed.  Since such arrays may be assigned freely to any
> array with the same "shape", even with different bounds, it doesn't
> seem like it would cause major problems if the bounds on the result
> type were normalized.
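A sketch of the shape-based compatibility rule Steven describes (declarations are illustrative):

```systemverilog
// Sketch: unpacked arrays with different bounds but the same "shape"
// (here 3 x 2 elements of the same type) are assignment compatible.
bit [2:1] a [7:5][4:3];
bit [2:1] b [2:0][1:0];

initial b = a;  // legal: same element type and dimension sizes
```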
> 
> For elements that are packed types, this gets even less clear.  The
> reference is an integral value and can be operated on as an integral
> value.  So it is harder to argue that it is really the packed type.
> 
> 
>> There is one situation where the type may be normalized without any bad
>> consequences, which is when a part select selects more than one element
>> of an array, like in "a[6:5]".
> 
> This starts getting us into the area of true part selects (when it is
> a vector) or slices (for arrays).  Here it sounds like you are more in
> agreement with me (and Gord).  I would adjust your description slightly
> though.  It is not the selection of more than one element that should
> matter here; it is the use of the range (or part select) syntax, which
> can select more than one element.  The slice "a[6:6]" only selects one
> element, but it is still a slice.  
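The syntactic distinction can be sketched like this (declaration assumed from Yulik's example):

```systemverilog
bit [2:1] a [7:5][4:3];

// a[6]   -- bit-select syntax: yields a single element, of the element type
// a[6:6] -- slice (range) syntax: yields an array type, even though it
//           happens to cover exactly one element
bit [2:1] elem  [4:3];
bit [2:1] slice [0:0][4:3];  // bounds shown normalized here as one possibility

initial begin
  elem  = a[6];
  slice = a[6:6];  // legal by shape, whatever bounds the slice type carries
end
```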
> 
>> In this case, the topmost array type
>> (corresponding to the range of selected elements of the array) may be
>> normalized, while the type of the elements of this array should still be
>> left in their original form (I believe). I.e., the type of "a[6:5]" can
>> either be partly normalized to "bit [2:1] (placeholder) [1:0][4:3]"
>> or be left in a more-or-less "original" form like "bit [2:1]
>> (placeholder) [6:5][4:3]". I do agree that normalizing the top-most
>> array for part selects selecting more than one element of an array may
>> be preferable over not normalizing it, though I'm not sure it really
>> matters for a tool's performance.
> 
> This sounds closer to my viewpoint than I assumed from your earlier
> statements.
> 
> I am also unclear how this normalizing would affect run-time performance,
> unless non-normalized bounds were somehow expected to pass through some
> dynamic mechanism, such as ref arguments.
> 
>> If we simplify the above example and look at a simple one-dimensional
>> packed array, like "bit [7:5] b;", then, if we normalize the topmost
>> array, the type of "b[6:5]" will be "bit [1:0]". So, unless you think
>> that the whole type should be normalized, and not only that of the topmost
>> array, we may be still thinking the same. If you believe everything
>> should be normalized, then we indeed have a disagreement, and we will
>> have to discuss it. In my eyes, normalizing all types, in whatever way,
>> would make this type information mostly, if not completely, useless.
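One way to observe this with the type machinery under discussion (a sketch; the exact string $typename returns is tool-dependent):

```systemverilog
bit [7:5] b;

initial begin
  // The declared type of b keeps its bounds, e.g. "bit[7:5]".
  $display("%s", $typename(b));
  // Under the normalization Yulik describes, the part-select's type
  // would report normalized bounds, e.g. "bit[1:0]".
  $display("%s", $typename(b[6:5]));
end
```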
> 
> Gord will have to reply, but it does sound like less disagreement than
> I assumed.
> 
> I agree that normalizing types that are strictly typed, such as classes
> or unpacked structs, would cause problems.  However, I don't think that
> the array cases from your examples are as bad as you suggest.  This is
> because the assignment compatibility rules are based on "shape" rather
> than exact bounds.  So the type information can be used to declare a
> type that is compatible with the element, even if it is normalized.
> The fact that the compatibility rules are based on "shape" actually
> supports the argument that this is all that is retained about the
> type of an array expression.
> 
> I haven't fully considered whether I think that things other than the
> topmost dimension should be normalized.
> 
> Steven Sharp
> sharp@cadence.com
> 
> 

