RE: [sv-bc] confusion in determining the type of a self-determined binary expression during evaluation of type operator

From: Feldman, Yulik <yulik.feldman_at_.....>
Date: Fri Oct 19 2007 - 00:03:45 PDT
Maybe I misunderstood your statement, but this approach doesn't seem to
work, since the declared types of identifiers are not normalized to
begin with:

logic [1:0] a;
logic [2:1] b;

Using the same index on a and b, a[1] and b[1], will give you different
bits.
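The mismatch can be seen with a small model of declared ranges (a
hypothetical helper for illustration, not any tool's API): a vector
declared [msb:lsb] maps index i to physical bit offset i - lsb, so the
same index names different bits in differently declared vectors.

```python
def bit_position(msb, lsb, index):
    """Return the zero-based physical bit offset selected by `index`
    for a vector declared as [msb:lsb] with msb >= lsb."""
    if not lsb <= index <= msb:
        raise IndexError(f"index {index} outside declared range [{msb}:{lsb}]")
    return index - lsb

# logic [1:0] a;  -> a[1] is the MOST significant of the two bits
# logic [2:1] b;  -> b[1] is the LEAST significant of the two bits
print(bit_position(1, 0, 1))  # 1
print(bit_position(2, 1, 1))  # 0
```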

--Yulik.

-----Original Message-----
From: Bresticker, Shalom 
Sent: Thursday, October 18, 2007 9:57 PM
To: Feldman, Yulik
Cc: 'sv-bc@server.eda-stds.org'
Subject: RE: [sv-bc] confusion in determining the type of a
self-determined binary expression during evaluation of type operator

From a user's point of view, it seems to me that normalizing would be
easier, in that my code would not have to change depending on whether I
wanted to reference bit n of an expression or of an identifier. I would
only have to know its size, and then the bit index would be the same for
every expression of that size, even if the expression were just a
simple identifier.
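The convenience being described can be sketched with a hypothetical
helper that treats every width-w operand as if it were declared
[w-1:0], so a select depends only on the operand's size:

```python
def normalized_select(value, width, index):
    """Select bit `index` of `value` viewed as a [width-1:0] vector."""
    if not 0 <= index < width:
        raise IndexError(f"index {index} out of normalized range [{width - 1}:0]")
    return (value >> index) & 1

# Whether a 2-bit operand was declared [1:0] or [2:1], bit 0 is always
# the least significant bit under normalization:
print(normalized_select(0b10, 2, 0))  # 0
print(normalized_select(0b10, 2, 1))  # 1
```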

Shalom

> -----Original Message-----
> From: Feldman, Yulik 
> Sent: Thursday, October 18, 2007 5:12 PM
> To: Gordon Vreugdenhil
> Cc: Bresticker, Shalom; sv-bc@server.eda-stds.org
> Subject: RE: [sv-bc] confusion in determining the type of a 
> self-determined binary expression during evaluation of type operator
> 
> Generally, I understand that both definitions (normalizing 
> and not normalizing the types) can work in theory, so I 
> won't argue too much if there is substantial opposition to 
> my point of view. But I am interested in knowing what the 
> arguments for normalizing the types are.
> 
> I can understand that, for historical reasons, some 
> simulators or other software tools implemented their internal 
> data structures in such a way that only normalized types can 
> be stored, and that changing those data structures would mean 
> major updates to the tools' infrastructure, leading to 
> opposition to such changes.
> 
> However, setting historical reasons aside for a moment, 
> I'm not sure I understand what negative effect storing the 
> original, non-normalized type information would have on a 
> tool's efficiency. After all, you just need to let the 
> expression object point to the object storing the type 
> information. If the type object is the "original" one, you 
> already have its definition somewhere, so this object doesn't 
> consume additional memory. Also, you need to store the type 
> information of the expression somehow in any case, and I don't 
> think you could do that more compactly than storing a pointer 
> to the type object, even if the types were normalized. From a 
> performance point of view, following the pointer to the type 
> object and querying it when needed should be fast enough for 
> most practical purposes. So it is not clear to me where the 
> supposed deficiency lies. 
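As a rough illustration of this point (hypothetical data structures,
not any real tool's internals), an expression node can simply hold a
reference to the one type object the declaration already created, so
keeping the non-normalized type costs a single pointer per expression:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VectorType:
    """A declared vector type such as `logic [2:1]`."""
    msb: int
    lsb: int

    @property
    def width(self):
        return abs(self.msb - self.lsb) + 1

@dataclass
class Expr:
    """An expression node; `type` is a shared reference, not a copy."""
    text: str
    type: VectorType

t = VectorType(2, 1)         # created once for the declaration `logic [2:1] b;`
b_ref = Expr("b", t)
part = Expr("b[2:1]", t)     # reuses the same object: no extra type storage
print(b_ref.type is part.type)  # True
print(t.width)                  # 2
```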
> 
> Is there any other reason to consider using normalized 
> types, other than an unwillingness to change existing tool 
> implementations (assuming that some of them currently 
> implement normalization of the types)?
> 
> --Yulik.
> 
> -----Original Message-----
> From: Gordon Vreugdenhil [mailto:gordonv@model.com]
> Sent: Thursday, October 18, 2007 4:05 PM
> To: Feldman, Yulik
> Cc: Bresticker, Shalom; sv-bc@server.eda-stds.org
> Subject: Re: [sv-bc] confusion in determining the type of a 
> self-determined binary expression during evaluation of type operator
> 
> Feldman, Yulik wrote:
> > Yes, you're right. Maybe the example would be more persuasive if I 
> > used a typedef:
> > 
> > typedef logic base_type [3:1];
> > base_type a [4:1];
> > base_type b;
> > logic c;
> > assign b = a[3];
> > assign c = b[i];
> > 
> > Anyway, this example by itself is probably not a strong enough 
> > argument for not normalizing the types. Perhaps the other argument 
> > I mentioned, keeping selects with more than one selection, like 
> > a[3][1], unambiguous and semantically consistent with expressions 
> > like (a[3])[i], is a stronger one.
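The concern can be modeled with a hypothetical helper computing the bit
offset a given index selects under each interpretation: if the
intermediate result a[3] keeps the declared element range [3:1], a
direct multi-select and a select on the intermediate expression agree;
if the intermediate were normalized to [2:0], the same index would pick
a different element.

```python
def index_offset(msb, lsb, i):
    """Zero-based offset selected by index i in a declared range [msb:lsb]."""
    return i - lsb

# Element type base_type is `logic [3:1]` (declared range preserved):
preserved = index_offset(3, 1, 1)    # a[3][1] with the original range
normalized = index_offset(2, 0, 1)   # (a[3])[1] if normalized to [2:0]
print(preserved, normalized)  # 0 1  -- the two forms would diverge
```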
> > 
> > Another reason to keep the original type is that if the type is 
> > normalized, you are effectively introducing a new type into the 
> > model. For a software tool, that means keeping another object in 
> > memory, for example.
> 
> Yulik,
> 
> You are making a rather deep assumption here.  In my 
> experience with work on multiple simulator implementations, 
> the opposite is in fact true.  It is more efficient on 
> several fronts to assume normalization everywhere.  This has 
> been OK to do in Verilog, and implementations assume various 
> aspects of a normalized model in fairly deep ways.
> 
> For those reasons, I will continue to have serious 
> reservations about this direction.
> 
> There may be some cases in which it is reasonable to require 
> more specific type information.  Steven has raised some of 
> those.  But requiring fully accurate information in various 
> places is a minefield and is a position for which you'll 
> almost certainly find substantial opposition.
> 
> Gord
> --
> --------------------------------------------------------------------
> Gordon Vreugdenhil                                503-685-0808
> Model Technology (Mentor Graphics)                gordonv@model.com
> 
---------------------------------------------------------------------
Intel Israel (74) Limited


Received on Fri, 19 Oct 2007 09:03:45 +0200

This archive was generated by hypermail 2.1.8 : Fri Oct 19 2007 - 00:07:23 PDT