Generating SRAM gets stuck in extraction process of magic #248

Open
FriedrichWu opened this issue Jul 30, 2024 · 3 comments

@FriedrichWu

Describe the bug
Hello,

I'm trying to generate a 2 kB SRAM, but the process gets stuck in run_ext for nearly a day. The issue seems to occur more often when the SRAM size is large. I can get the whole SRAM generation to finish by disabling LVS and commenting out the stimulus and measurement part, but that is of course not a good solution.
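For reference, the workaround is roughly the following kind of config change (a sketch only: the word_size/num_words split and output names are illustrative, and check_lvsdrc / analytical_delay are the knobs from the OpenRAM example configs rather than my exact file):

```python
# Illustrative OpenRAM config sketch (not the exact myconfig_sky.py);
# the 2 kB split below (2048 x 8 bits) and the output names are examples only.
word_size = 8
num_words = 2048
tech_name = "sky130"

# Workaround that lets generation finish:
check_lvsdrc = False      # skip DRC/LVS, i.e. the step where magic extraction hangs
analytical_delay = True   # analytical timing model, no SPICE stimulus/measurement runs

output_name = "sram_{0}_{1}_{2}".format(word_size, num_words, tech_name)
output_path = "temp/" + output_name
```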

This looks similar to #245.

In the .ext.err file I also find a lot of warnings. Besides "unknown layers" and "boundary was redefined", there is an error that says "Error: Asymmetric device with multiple terminals!", which is also mentioned here: https://web.open-source-silicon.dev/t/424007/i-m-trying-the-sky130-support-in-openram-using-the-example-c

Version
The latest version at the time of writing

Expected behavior
The process should finish without errors

Logs
myconfig_sky.ext.err.zip

@mguthaus
Collaborator

This sounds like a runtime bug with magic. There isn't much we can do on the OpenRAM side to speed up the extraction.

My only questions are how much memory you have and how much of it is in use, especially if you are running in Docker or a VM.
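For instance, a quick check like this (a generic Python sketch, not an OpenRAM utility) would show what the process actually has available; note that inside Docker the cgroup limit (e.g. /sys/fs/cgroup/memory.max with cgroup v2) can be lower than what /proc/meminfo reports:

```python
# Quick Linux memory check (generic sketch, not part of OpenRAM).
def mem_gb(field):
    """Return a /proc/meminfo field in GB."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith(field + ":"):
                return int(line.split()[1]) / (1024 * 1024)  # kB -> GB
    return 0.0

print(f"MemTotal:     {mem_gb('MemTotal'):6.1f} GB")
print(f"MemAvailable: {mem_gb('MemAvailable'):6.1f} GB")
print(f"SwapTotal:    {mem_gb('SwapTotal'):6.1f} GB")
```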

@mguthaus
Collaborator

The asymmetric device message is expected since the SRAM cells have an asymmetric transistor or two.

@FriedrichWu
Author

> This sounds like a runtime bug with magic. There isn't much we can do on the OpenRAM side to speed up the extraction.
>
> My only questions are how much memory you have and how much of it is in use, especially if you are running in Docker or a VM.

Thanks for your quick reply. I'm running on a Linux server; it has around 124 GB of memory available and another 37 GB of swap.
