Hi Gord:
Thanks for the clarification. I agree that the variable references wouldn't be live. They would be sampled at covergroup creation.
My intention was to indicate that this behavior would be a departure from what is possible today. With current covergroups, another tool (e.g., a coverage management tool) can determine the coverage space from the covergroup definition alone. If constructs like item 3 are used in a covergroup, I believe a simulator (or simulator-style evaluation) would be required even to calculate the coverage space. Is that correct?
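
To make the concern concrete, here is a minimal sketch using only existing 1800-2009 features (the module, signal, and parameter names are made up for illustration): even with a plain covergroup argument, the bin boundaries depend on a value that is only known when new() runs, so a tool reading the source alone cannot enumerate the space.

  // Hypothetical example, existing 1800-2009 syntax only.
  module space_sketch;
    bit clk;
    int unsigned addr;

    covergroup addr_cg (int unsigned max_addr) @(posedge clk);
      coverpoint addr {
        // Bin boundaries are fixed from max_addr when new() runs; a
        // coverage management tool cannot enumerate them from the text.
        bins lo = {[0 : max_addr/2]};
        bins hi = {[max_addr/2 + 1 : max_addr]};
      }
    endgroup

    addr_cg cg = new(255);  // 255 is captured ("closed over") here
  endmodule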
Thanks,
Scott
> -----Original Message-----
> From: owner-sv-ec@eda.org [mailto:owner-sv-ec@eda.org] On Behalf Of
> Gordon Vreugdenhil
> Sent: Tuesday, March 29, 2011 8:52 AM
> To: Little Scott-B11206
> Cc: sv-ec@eda.org; Fais Yaniv-RM96496
> Subject: Re: [sv-ec] Outstanding covergroup filtering (2506) questions
>
> Scott, one clarification -- in (3) you suggest that a simulator would
> be
> required to calculate the with expressions at sim time for variable
> references. I don't think that is required. The simulator would
> certainly
> need to capture the values of all referenced variables at the time of
> covergroup construction and associate the fixed set of values as the
> "closure"
> (see http://en.wikipedia.org/wiki/Closure_%28computer_science%29)
> of the function along with the covergroup.
>
> I didn't hear anyone suggest that variable references should be "live"
> in the sense that changes after covergroup construction would impact
> the function behavior.
>
> So as long as the simulator captured the values, the actual filtering
> could still be done late if one wanted to do so.
>
> Gord.
>
>
> On 3/28/2011 8:00 PM, Little Scott-B11206 wrote:
> > Hi all:
> >
> > Mehdi asked me to summarize the primary outstanding issues in the
> covergroup filtering proposal (2506). I will ask the Freescale user
> community for their opinion on these issues. I hope that others will
> do the same and discuss the results on the reflector. I will be
> traveling and will only be able to catch the end of the meeting on April
> 11th. Hopefully we can resolve most of these issues over the
> reflector.
> >
> > 1. Use model for with expressions. My understanding is that the
> expected use model for the with expression is to reduce/shape the
> coverage space, as is done with ignore bins, but using more powerful
> methods than are allowed by the current ignore bins syntax. If this is
> the case, does this imply that the solution should be an enhancement to
> the ignore bins syntax and not something as general as with
> expressions? Are there additional use models envisioned by the user
> community that benefit from the power provided by with expressions?
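>
> For illustration only, a sketch of the kind of shaping a with
> expression could enable (the syntax is hypothetical since 2506 has not
> settled it, and 'len'/'clk' are made-up names):
>
>   covergroup len_cg @(posedge clk);
>     coverpoint len {
>       // keep only 4-aligned lengths without scripting ignore_bins
>       bins aligned[] = {[0:1023]} with (item % 4 == 0);
>     }
>   endgroup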
> >
> > A note on current capabilities:
> > ignore/illegal bins: these must be explicitly specified, which is
> tedious for large coverage spaces (ignore_bins are often generated via
> scripts for large spaces). Ignore bins are removed after bin
> distribution. See pg. 496 in P1800-2009 for an example.
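>
> A small fragment in the current style, with hypothetical signal and bin
> names (the ranges would typically be script-generated for large spaces):
>
>   covergroup op_cg @(posedge clk);
>     coverpoint opcode {
>       bins ops[] = {[0:63]};
>       // every unwanted value/range must be listed explicitly
>       ignore_bins reserved = {[48:63], 9, 11};
>     }
>   endgroup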
>
> >
> > iff: this does not affect the shape of the coverage space. It can
> affect when coverage is collected.
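>
> For example (hypothetical names), sampling is gated but the space is
> unchanged:
>
>   coverpoint mode iff (rst_n);  // all mode values still define the space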
> >
> > intersect: allows reduction of cross bins that contain values in a
> given open range expression. This gives some power, but again it is
> quite limited for complex coverage shaping. See pg. 501 in P1800-2009
> for an example.
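>
> For example, a fragment with hypothetical coverpoint names:
>
>   ab_x : cross cp_a, cp_b {
>     // prune cross bins whose cp_a value falls in [100:200]
>     ignore_bins mid_a = binsof(cp_a) intersect {[100:200]};
>   }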
> >
> > ||, &&: these allow a reduction of cross bins by ORing or ANDing
> combinations of the bins involved in the cross. See pg. 501 in P1800-
> 2009 for an example. Again, this is not terribly powerful.
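>
> For example, a fragment with hypothetical coverpoint and bin names
> (low/high are named bins of the coverpoints):
>
>   ab_sel : cross cp_a, cp_b {
>     bins corner      = binsof(cp_a.low)  && binsof(cp_b.high);
>     ignore_bins maxs = binsof(cp_a.high) || binsof(cp_b.high);
>   }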
> >
> > 2. Based on the use model for a constructive/generative approach, does
> it make sense to apply the expression and then do bin distribution or
> do bin distribution and then apply the filtering? In the first case a
> change of the expression will likely result in very different bin
> assignments. In the second approach there may be empty bins. How
> would these artifacts complicate the coverage merging process? Current
> covergroup constructs may exhibit these merging issues, but we don't
> want to needlessly exacerbate the problem.
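>
> A small hypothetical case may help frame the question (the with syntax
> is illustrative only and not settled in 2506):
>
>   coverpoint len {
>     bins b[4] = {[0:15]} with (item < 8);  // 8 of 16 values survive
>     // filter first, then distribute: four bins {0,1} {2,3} {4,5} {6,7};
>     //   changing the expression reshuffles the bin contents
>     // distribute first, then filter: bins {0:3} and {4:7} keep their
>     //   values, while {8:11} and {12:15} become empty bins
>   }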
> >
> > 3. The proposal adds syntax to use a function to generate a queue of
> elements that define the cross bins. The question is what will users
> want to access in these functions? There are three levels of items
> that may be accessed: 1. compile-time constants (parameters, etc.);
> 2. const variables/arguments; 3. general variable references (these
> would be "sampled" at the instantiation of the covergroup). Which
> level of "purity" do users desire in these functions? It should be
> noted that as we move down the list toward item 3, the opportunity for
> tools to pre- or post-calculate the coverage space diminishes. In
> fact, if users desire item 3, a simulator would be required to
> calculate coverage spaces of this type.
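>
> For concreteness, a compilable sketch of the kind of generating
> function being discussed, showing the three levels of references (the
> names, the queue type, and how the result would bind to cross bins are
> all hypothetical; 2506 does not fix that syntax):
>
>   module gen_sketch;
>     typedef struct { int a; int b; } bin_pair_t;
>     typedef bin_pair_t pair_q_t[$];
>
>     parameter int MAX_A = 7;        // level 1: elaboration-time constant
>     int unsigned run_limit;         // level 3: plain variable reference
>
>     // level 2: const argument; builds the list of cross bins to keep
>     function automatic pair_q_t gen_diag(const ref int unsigned lim);
>       pair_q_t q;
>       for (int i = 0; i <= MAX_A; i++)
>         if (i <= lim)
>           q.push_back(bin_pair_t'{i, i});  // keep only the diagonal
>       return q;
>     endfunction
>     // Per Gord's note, run_limit would be captured at covergroup
>     // construction, so the resulting space is known only at sim time.
>   endmodule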
> >
> > Thanks,
> > Scott
> >
> >
>
> --
> --------------------------------------------------------------------
> Gordon Vreugdenhil 503-685-0808
> Model Technology (Mentor Graphics) gordonv@model.com
>
>