Compare commits

..

65 Commits

Author SHA1 Message Date
jan 8f9d9d40ab [gdsii_arrow] add gdsii_arrow 2026-01-19 22:13:54 -08:00
ff148c477e [remove_duplicate_vertices] remove the last vertex rather than the first to better match docs 2026-01-19 22:13:54 -08:00
c93882e8b2 [Polygon.rect] raise a PatternError when given the wrong number of args instead of assert 2026-01-19 22:13:54 -08:00
jan fd11df382c [dxf] make sure layer tuple contents are ints 2026-01-19 22:13:54 -08:00
jan 59b871a577 [dxf] don't need to add polygon offset since it's zero 2026-01-19 22:13:54 -08:00
jan 0e109e2a89 update TODO in readme 2026-01-19 22:13:54 -08:00
jan c22fc723a7 [PortPather] generalize to multi-port functions where possible 2026-01-19 22:13:54 -08:00
jan cb99505cab [Pattern] use 1-axis instead of axis-1 2026-01-19 22:13:54 -08:00
jan 91aad3b8a3 [AutoTool] S-bend to L-bend fallback does not work yet, should throw an error 2026-01-19 22:13:54 -08:00
jan 3aa0254a47 [Pather / RenderPather] Fix handling of jog polarity 2026-01-19 22:13:54 -08:00
jan 9e1a26e293 [Port] mirror() should not mirror port position, only orientation 2026-01-19 22:13:54 -08:00
jan 2632e333c9 minor readme cleanup 2026-01-19 22:13:54 -08:00
jan 5c8bb642f5 [RenderPather.plug] fix ok_connections param 2026-01-19 22:13:54 -08:00
jan bb8a182791 cleanup 2026-01-19 22:13:54 -08:00
jan 3dac060246 [PortPather] add some more port-related convenience functions 2026-01-19 22:13:54 -08:00
jan 9384ecd211 [AutoTool / SimpleTool] allow choice between rotating or mirroring bends 2026-01-19 22:13:54 -08:00
jan 5d13269573 [Library.flatten] add dangling_ok param 2026-01-19 22:13:54 -08:00
jan 692198f7b3 [PatherMixin] add thru arg to path_into and rework portlist inheritance 2026-01-19 22:13:54 -08:00
jan d5679d60a3 [PortPather] add rename_to and rename_from 2026-01-19 22:13:54 -08:00
jan bcf2441077 [PatherMixin] add at() for generating PortPather 2026-01-19 22:13:54 -08:00
jan 623bd3781d add missing float_raster dep for manhattanize_slow 2026-01-19 22:13:54 -08:00
jan 0cb44e7ac2 [PortPather] add PortPather 2026-01-19 22:13:54 -08:00
jan 209d0cee4e [RenderPather] whitespace 2026-01-19 22:13:54 -08:00
jan c3bee9608a [plug()] rename inherit_name arg to thru and allow passing a string
Breaking change

Affects Pattern, Builder, Pather, RenderPather
2026-01-19 22:13:54 -08:00
jan 286c7fd2e8 add some whitespace 2026-01-19 22:13:54 -08:00
3f3469006b [Tool / AutoTool / Pather / RenderPather / PatherMixin] add support for S-bends 2026-01-19 22:13:54 -08:00
3533a284fb [Port] add Port.measure_travel() 2026-01-19 22:13:54 -08:00
332f5c3f56 [Tool / Pather] fix some doc typos 2026-01-19 22:13:54 -08:00
0d5fe19353 [Tool / AutoTool] clarify some docstings 2026-01-19 22:13:54 -08:00
f1f760af27 [Pather] clarify a variable name 2026-01-19 22:13:54 -08:00
e229bb4804 [AutoTool] enable S-bends 2026-01-19 22:13:54 -08:00
abd2d18bca [AutoTool / SimpleTool] remove append arg 2026-01-19 22:13:54 -08:00
1cf8f769fc [SimpleTool/AutoTool] clarify some error messages 2026-01-19 22:13:54 -08:00
1d1a4a0e32 [AutoTool/SimpleTool/BasicTool] Rename BasicTool->SimpleTool and remove transition handling. Export AutoTool and SimpleTool at top level. 2026-01-19 22:13:54 -08:00
eff538f348 [AutoTool] pass in kwargs to straight fn call 2026-01-19 22:13:54 -08:00
c65bc3bfd5 [AutoTool] consolidate duplicate code for path() and render() 2026-01-19 22:13:54 -08:00
1ebd4d5b3b [AutoTool] add add_complementary_transitions() 2026-01-19 22:13:54 -08:00
9cb51e86e5 [AutoTool] Use more dataclasses to clarify internal code 2026-01-19 22:13:54 -08:00
51651d0538 [PolyCollection] rename setter arg to placate linter 2026-01-19 22:13:54 -08:00
a7fd07ebec [format_stacktrace] suppress linter 2026-01-19 22:13:54 -08:00
15b3599b3e [AutoTool] support min/max length for straight segments 2026-01-19 22:13:54 -08:00
bb75c7addb [BasicTool/AutoTool] fix port orientation for straight segments when using RenderPather 2026-01-19 22:13:54 -08:00
b7c20c74a8 [AutoTool] Add first pass for AutoTool 2026-01-19 22:13:54 -08:00
4f6c91ebdb [RenderPather] add wrapped label/ref/polygon/rect functions 2026-01-19 22:13:54 -08:00
667885b195 [Pather/RenderPather/PatherMixin] clean up imports 2026-01-19 22:13:54 -08:00
b02ba3ef01 [Pather / RenderPather] move common functionality into PatherMixin; redo hierarchy
- (BREAKING change) Pather.mpath no longer wraps the whole bus into a
container, since this has no equivalent in RenderPather. Possible this
functionality will return in the future
- Removed `tool_port_names` arg from Pather functions
- In general RenderPather should be much closer to Pather now
2026-01-19 22:13:54 -08:00
890ba03340 [pather] code style changes 2026-01-19 22:13:54 -08:00
c753b19d69 [error] also exclude concurrent.futures.process from traces 2026-01-19 22:13:54 -08:00
7042461244 [error] also exclude frames starting with '<frozen' 2026-01-19 22:13:54 -08:00
9b95d37c8f [file.svg] use logger.warning over warnings.warn (for flexibility) 2026-01-19 22:13:54 -08:00
41c34a6e12 [ports] make port mismatch deltas more obvious 2026-01-19 22:13:54 -08:00
abac12b991 [error, ports] Make stack traces more directly reflect teh location of the issue 2026-01-19 22:13:54 -08:00
612c757bd4 misc cleanup: variable naming, typing, comments 2026-01-19 22:13:54 -08:00
f19806e280 [Path / PolyCollection / Polygon] fix order of rotation/offset 2026-01-19 22:13:54 -08:00
8b53f816df [Polygon / Path / PolyCollection] Force polygon/path offset to (0, 0)
And disallow setting it.

This offset was basically just a footgun.
2026-01-19 22:13:54 -08:00
d4035ec8af [Polygon.rect] use floats more explicitly 2026-01-19 22:13:54 -08:00
9484cf7423 Various type-checking improvements 2026-01-19 22:13:54 -08:00
97734297ab [file.gdsii] enable wider annotation key range (to 126 inclusive) 2026-01-19 22:13:54 -08:00
e20bcdc79f [BasicTool] enable straight to handle trees (not just flat patterns) 2026-01-19 22:13:54 -08:00
a16deaf0cf [builder.tools] Handle in_ptype=None 2026-01-19 22:13:54 -08:00
ade8544aca [traits.annotatable] Don't break when setting annotations to None 2026-01-19 22:13:54 -08:00
b226d0421f [shapes] Don't create empty dicts for annotations 2026-01-19 22:13:54 -08:00
jan e41e91f6e0 [PolyCollection] add PolyCollection shape based on ndarrays of vertices and offsets 2026-01-19 22:13:09 -08:00
dea3eca178 [utils.curves] ignore re-import of trapeziod 2026-01-19 21:25:30 -08:00
9869f12475 allow annotations to be None
breaking change, but properties are seldom used by anyone afaik
2026-01-19 21:25:30 -08:00
7 changed files with 461 additions and 57 deletions

View File

@ -37,55 +37,6 @@ A layout consists of a hierarchy of `Pattern`s stored in a single `Library`.
Each `Pattern` can contain `Ref`s pointing at other patterns, `Shape`s, `Label`s, and `Port`s.
Library / Pattern hierarchy:
```
+-----------------------------------------------------------------------+
| Library |
| |
| Name: "MyChip" ...> Name: "Transistor" |
| +---------------------------+ : +---------------------------+ |
| | [Pattern] | : | [Pattern] | |
| | | : | | |
| | shapes: {...} | : | shapes: { | |
| | ports: {...} | : | "Si": [<Polygon>, ...] | |
| | | : | "M1": [<Polygon>, ...]}| |
| | refs: | : | ports: {G, S, D} | |
| | "Transistor": [Ref, Ref]|..: +---------------------------+ |
| +---------------------------+ |
| |
| # (`refs` keys resolve to Patterns within the Library) |
+-----------------------------------------------------------------------+
```
Pattern internals:
```
+---------------------------------------------------------------+
| [Pattern] |
| |
| shapes: { |
| (1, 0): [Polygon, Circle, ...], # Geometry by layer |
| (2, 0): [Path, ...] |
| "M1" : [Path, ...] |
| "M2" : [Polygon, ...] |
| } |
| |
| refs: { # Key sets target name, Ref sets transform |
| "my_cell": [ |
| Ref(offset=(0,0), rotation=0), |
| Ref(offset=(10,0), rotation=R90, repetition=Grid(...)) |
| ] |
| } |
| |
| ports: { |
| "in": Port(offset=(0,0), rotation=0, ptype="M1"), |
| "out": Port(offset=(10,0), rotation=R180, ptype="wg") |
| } |
| |
+---------------------------------------------------------------+
```
`masque` departs from several "classic" GDSII paradigms:
- A `Pattern` object does not store its own name. A name is only assigned when the pattern is placed
into a `Library`, which is effectively a name->`Pattern` mapping.
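
To make the hierarchy described above concrete, here is a minimal sketch using the constructors that appear elsewhere in this changeset (`Library`, `Pattern`, `Ref`, `Port`, `Polygon`); the exact import paths and keyword names are assumptions, not verified against the released API:

```
from masque import Library, Pattern, Ref, Port
from masque.shapes import Polygon

lib = Library()

# A leaf pattern has no name of its own; it only holds shapes, ports, and refs.
transistor = Pattern()
transistor.shapes[(1, 0)].append(Polygon(vertices=[(0, 0), (1, 0), (1, 1), (0, 1)]))
transistor.ports['G'] = Port(offset=(0.5, 1.0), rotation=0, ptype='M1')

# The name "Transistor" exists only as a Library key.
lib['Transistor'] = transistor

# A parent pattern refers to it by that key; the Ref stores only the transform.
chip = Pattern()
chip.refs['Transistor'].append(Ref(offset=(10, 0), rotation=0))
lib['MyChip'] = chip
```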
@ -285,4 +236,3 @@ my_pattern.ref(_make_my_subpattern(), offset=..., ...)
* Better interface for polygon operations (e.g. with `pyclipper`)
- de-embedding
- boolean ops
* tuple / string layer auto-translation

View File

@ -24,7 +24,6 @@ class Abstract(PortList):
When snapping a sub-component to an existing pattern, only the name (not contained
in a `Pattern` object) and port info is needed, and not the geometry itself.
"""
# Alternate design option: do we want to store a Ref instead of just a name? then we can translate/rotate/mirror...
__slots__ = ('name', '_ports')
name: str
@ -49,6 +48,8 @@ class Abstract(PortList):
self.name = name
self.ports = copy.deepcopy(ports)
# TODO do we want to store a Ref instead of just a name? then we can translate/rotate/mirror...
def __repr__(self) -> str:
s = f'<Abstract {self.name} ['
for name, port in self.ports.items():
@ -87,7 +88,7 @@ class Abstract(PortList):
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self:
"""
Rotate the Abstract around a pivot point.
Args:
pivot: (x, y) location to rotate around
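
For orientation, a small sketch of how an `Abstract` might be built and transformed, using only the attributes visible in this hunk (`name`, `ports`, `rotate_around`); the constructor argument order and the `masque.abstract` import path are assumptions:

```
import numpy
from masque import Port
from masque.abstract import Abstract  # assumed import path

# Only the target pattern's name and its port map are stored -- no geometry.
proxy = Abstract(name='Transistor', ports={
    'G': Port(offset=(0, 5), rotation=numpy.pi / 2, ptype='M1'),
    'S': Port(offset=(-5, 0), rotation=numpy.pi, ptype='M1'),
    'D': Port(offset=(5, 0), rotation=0, ptype='M1'),
    })

# Port positions and orientations rotate about the pivot; there is no geometry to move.
proxy.rotate_around((0, 0), numpy.pi / 2)
```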

View File

@ -210,8 +210,7 @@ class Builder(PortList):
self.pattern.rect(*args, **kwargs)
return self
# Note: We're a superclass of `Pather`, where path() means something different,
# so we shouldn't wrap Pattern.path()
#@wraps(Pattern.path)
#def path(self, *args, **kwargs) -> Self:
# self.pattern.path(*args, **kwargs)

View File

@ -487,7 +487,7 @@ class RenderPather(PatherMixin):
# Fall back to drawing two L-bends
ccw0 = jog > 0
kwargs_no_out = (kwargs | {'out_ptype': None})
t_port0, _ = tool.planL( ccw0, length / 2, in_ptype=in_ptype, **kwargs_no_out) # TODO length/2 may fail with asymmetric ptypes
jog0 = Port((0, 0), 0).measure_travel(t_port0)[0][1]
t_port1, _ = tool.planL(not ccw0, abs(jog - jog0), in_ptype=t_port0.ptype, **kwargs)
jog1 = Port((0, 0), 0).measure_travel(t_port1)[0][1]

453
masque/file/gdsii_arrow.py Normal file
View File

@ -0,0 +1,453 @@
# ruff: noqa: ARG001, F401
"""
GDSII file format readers and writers using the `TODO` library.
Note that GDSII references follow the same convention as `masque`,
with this order of operations:
1. Mirroring
2. Rotation
3. Scaling
4. Offset and array expansion (no mirroring/rotation/scaling applied to offsets)
Scaling, rotation, and mirroring apply to individual instances, not grid
vectors or offsets.
Notes:
* absolute positioning is not supported
* PLEX is not supported
* ELFLAGS are not supported
* GDS does not support library- or structure-level annotations
* GDS creation/modification/access times are set to 1900-01-01 for reproducibility.
* Gzip modification time is set to 0 (start of current epoch, usually 1970-01-01)
TODO writing
TODO warn on boxes, nodes
"""
from typing import IO, cast, Any
from collections.abc import Iterable, Mapping, Callable
import io
import mmap
import logging
import pathlib
import gzip
import string
from pprint import pformat
import numpy
from numpy.typing import ArrayLike, NDArray
from numpy.testing import assert_equal
import pyarrow
from pyarrow.cffi import ffi
from .utils import is_gzipped, tmpfile
from .. import Pattern, Ref, PatternError, LibraryError, Label, Shape
from ..shapes import Polygon, Path, PolyCollection
from ..repetition import Grid
from ..utils import layer_t, annotations_t
from ..library import LazyLibrary, Library, ILibrary, ILibraryView
logger = logging.getLogger(__name__)
clib = ffi.dlopen('/home/jan/projects/klamath-rs/target/release/libklamath_rs_ext.so')
ffi.cdef('void read_path(char* path, struct ArrowArray* array, struct ArrowSchema* schema);')

path_cap_map = {
    0: Path.Cap.Flush,
    1: Path.Cap.Circle,
    2: Path.Cap.Square,
    4: Path.Cap.SquareCustom,
    }


def rint_cast(val: ArrayLike) -> NDArray[numpy.int32]:
    return numpy.rint(val).astype(numpy.int32)

def _read_to_arrow(
        filename: str | pathlib.Path,
        *args,
        **kwargs,
        ) -> pyarrow.Array:
    path = pathlib.Path(filename).resolve()     # resolve relative paths before handing the string to the Rust reader
    ptr_array = ffi.new('struct ArrowArray[]', 1)
    ptr_schema = ffi.new('struct ArrowSchema[]', 1)
    clib.read_path(str(path).encode(), ptr_array, ptr_schema)
    iptr_schema = int(ffi.cast('uintptr_t', ptr_schema))
    iptr_array = int(ffi.cast('uintptr_t', ptr_array))
    arrow_arr = pyarrow.Array._import_from_c(iptr_array, iptr_schema)
    return arrow_arr

def readfile(
        filename: str | pathlib.Path,
        *args,
        **kwargs,
        ) -> tuple[Library, dict[str, Any]]:
    """
    Wrapper for `read()` that takes a filename or path instead of a stream.

    Will automatically decompress gzipped files.

    Args:
        filename: Filename to read from.
        *args: passed to `read()`
        **kwargs: passed to `read()`
    """
    arrow_arr = _read_to_arrow(filename)
    assert len(arrow_arr) == 1
    results = read_arrow(arrow_arr[0])
    return results

def read_arrow(
        libarr: pyarrow.Array,
        raw_mode: bool = True,
        ) -> tuple[Library, dict[str, Any]]:
    """
    # TODO check GDSII file for cycles!
    Read a gdsii file and translate it into a dict of Pattern objects. GDSII structures are
    translated into Pattern objects; boundaries are translated into polygons, and srefs and arefs
    are translated into Ref objects.

    Additional library info is returned in a dict, containing:
      'name': name of the library
      'meters_per_unit': number of meters per database unit (all values are in database units)
      'logical_units_per_unit': number of "logical" units displayed by layout tools (typically microns)
          per database unit

    Args:
        libarr: Arrow array (as produced by `_read_to_arrow()`) containing the GDSII library data.
        raw_mode: If True, constructs shapes in raw mode, bypassing most data validation. Default True.

    Returns:
        - dict of pattern_name:Patterns generated from GDSII structures
        - dict of GDSII library info
    """
    library_info = _read_header(libarr)

    layer_names_np = libarr['layers'].values.to_numpy().view('i2').reshape((-1, 2))
    layer_tups = [tuple(pair) for pair in layer_names_np]

    cell_ids = libarr['cells'].values.field('id').to_numpy()
    cell_names = libarr['cell_names'].as_py()
    def get_geom(libarr: pyarrow.Array, geom_type: str) -> dict[str, Any]:
        el = libarr['cells'].values.field(geom_type)
        elem = dict(
            offsets = el.offsets.to_numpy(),
            xy_arr = el.values.field('xy').values.to_numpy().reshape((-1, 2)),
            xy_off = el.values.field('xy').offsets.to_numpy() // 2,
            layer_inds = el.values.field('layer').to_numpy(),
            prop_off = el.values.field('properties').offsets.to_numpy(),
            prop_key = el.values.field('properties').values.field('key').to_numpy(),
            prop_val = el.values.field('properties').values.field('value').to_pylist(),
            )
        return elem
    rf = libarr['cells'].values.field('refs')
    refs = dict(
        offsets = rf.offsets.to_numpy(),
        targets = rf.values.field('target').to_numpy(),
        xy = rf.values.field('xy').to_numpy().view('i4').reshape((-1, 2)),
        invert_y = rf.values.field('invert_y').fill_null(False).to_numpy(zero_copy_only=False),
        angle_rad = numpy.deg2rad(rf.values.field('angle_deg').fill_null(0).to_numpy()),    # GDS angles are degrees; masque rotations are radians
        scale = rf.values.field('mag').fill_null(1).to_numpy(),
        rep_valid = rf.values.field('repetition').is_valid().to_numpy(zero_copy_only=False),
        rep_xy0 = rf.values.field('repetition').field('xy0').fill_null(0).to_numpy().view('i4').reshape((-1, 2)),
        rep_xy1 = rf.values.field('repetition').field('xy1').fill_null(0).to_numpy().view('i4').reshape((-1, 2)),
        rep_counts = rf.values.field('repetition').field('counts').fill_null(0).to_numpy().view('i2').reshape((-1, 2)),
        prop_off = rf.values.field('properties').offsets.to_numpy(),
        prop_key = rf.values.field('properties').values.field('key').to_numpy(),
        prop_val = rf.values.field('properties').values.field('value').to_pylist(),
        )
    txt = libarr['cells'].values.field('texts')
    texts = dict(
        offsets = txt.offsets.to_numpy(),
        layer_inds = txt.values.field('layer').to_numpy(),
        xy = txt.values.field('xy').to_numpy().view('i4').reshape((-1, 2)),
        string = txt.values.field('string').to_pylist(),
        prop_off = txt.values.field('properties').offsets.to_numpy(),
        prop_key = txt.values.field('properties').values.field('key').to_numpy(),
        prop_val = txt.values.field('properties').values.field('value').to_pylist(),
        )

    elements = dict(
        boundaries = get_geom(libarr, 'boundaries'),
        paths = get_geom(libarr, 'paths'),
        boxes = get_geom(libarr, 'boxes'),
        nodes = get_geom(libarr, 'nodes'),
        texts = texts,
        refs = refs,
        )

    paths = libarr['cells'].values.field('paths')
    elements['paths'].update(dict(
        width = paths.values.field('width').fill_null(0).to_numpy(),
        path_type = paths.values.field('path_type').fill_null(0).to_numpy(),
        extensions = numpy.stack((
            paths.values.field('extension_start').fill_null(0).to_numpy(),
            paths.values.field('extension_end').fill_null(0).to_numpy(),
            ), axis=-1),
        ))

    global_args = dict(
        cell_names = cell_names,
        layer_tups = layer_tups,
        raw_mode = raw_mode,
        )
    mlib = Library()
    for cc in range(len(libarr['cells'])):
        name = cell_names[cell_ids[cc]]
        pat = Pattern()
        _boundaries_to_polygons(pat, global_args, elements['boundaries'], cc)
        _gpaths_to_mpaths(pat, global_args, elements['paths'], cc)
        _grefs_to_mrefs(pat, global_args, elements['refs'], cc)
        _texts_to_labels(pat, global_args, elements['texts'], cc)
        mlib[name] = pat

    return mlib, library_info

def _read_header(libarr: pyarrow.Array) -> dict[str, Any]:
    """
    Read the file header and create the library_info dict.
    """
    library_info = dict(
        name = libarr['lib_name'],
        meters_per_unit = libarr['meters_per_db_unit'],
        logical_units_per_unit = libarr['user_units_per_db_unit'],
        )
    return library_info

def _grefs_to_mrefs(
        pat: Pattern,
        global_args: dict[str, Any],
        elem: dict[str, Any],
        cc: int,
        ) -> None:
    cell_names = global_args['cell_names']
    elem_off = elem['offsets']          # which elements belong to each cell
    xy = elem['xy']
    prop_key = elem['prop_key']
    prop_val = elem['prop_val']
    targets = elem['targets']

    elem_count = elem_off[cc + 1] - elem_off[cc]
    elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1)       # +1 to capture ending location for last elem
    prop_offs = elem['prop_off'][elem_slc]      # which props belong to each element
    elem_invert_y = elem['invert_y'][elem_slc][:elem_count]
    elem_angle_rad = elem['angle_rad'][elem_slc][:elem_count]
    elem_scale = elem['scale'][elem_slc][:elem_count]
    elem_rep_xy0 = elem['rep_xy0'][elem_slc][:elem_count]
    elem_rep_xy1 = elem['rep_xy1'][elem_slc][:elem_count]
    elem_rep_counts = elem['rep_counts'][elem_slc][:elem_count]
    rep_valid = elem['rep_valid'][elem_slc][:elem_count]

    for ee in range(elem_count):
        target = cell_names[targets[ee]]
        offset = xy[ee]
        mirr = elem_invert_y[ee]
        rot = elem_angle_rad[ee]
        mag = elem_scale[ee]

        rep: None | Grid = None
        if rep_valid[ee]:
            a_vector = elem_rep_xy0[ee]
            b_vector = elem_rep_xy1[ee]
            a_count, b_count = elem_rep_counts[ee]
            rep = Grid(a_vector=a_vector, b_vector=b_vector, a_count=a_count, b_count=b_count)

        annotations: None | dict[str, list[int | float | str]] = None
        prop_ii, prop_ff = prop_offs[ee], prop_offs[ee + 1]
        if prop_ii < prop_ff:
            annotations = {str(prop_key[off]): [prop_val[off]] for off in range(prop_ii, prop_ff)}

        ref = Ref(offset=offset, mirrored=mirr, rotation=rot, scale=mag, repetition=rep, annotations=annotations)
        pat.refs[target].append(ref)

def _texts_to_labels(
        pat: Pattern,
        global_args: dict[str, Any],
        elem: dict[str, Any],
        cc: int,
        ) -> None:
    elem_off = elem['offsets']          # which elements belong to each cell
    xy = elem['xy']
    layer_tups = global_args['layer_tups']
    layer_inds = elem['layer_inds']
    prop_key = elem['prop_key']
    prop_val = elem['prop_val']

    elem_count = elem_off[cc + 1] - elem_off[cc]
    elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1)       # +1 to capture ending location for last elem
    prop_offs = elem['prop_off'][elem_slc]      # which props belong to each element
    elem_layer_inds = layer_inds[elem_slc][:elem_count]
    elem_strings = elem['string'][elem_slc][:elem_count]

    for ee in range(elem_count):
        layer = layer_tups[elem_layer_inds[ee]]
        offset = xy[ee]
        string = elem_strings[ee]

        annotations: None | dict[str, list[int | float | str]] = None
        prop_ii, prop_ff = prop_offs[ee], prop_offs[ee + 1]
        if prop_ii < prop_ff:
            annotations = {str(prop_key[off]): [prop_val[off]] for off in range(prop_ii, prop_ff)}

        mlabel = Label(string=string, offset=offset, annotations=annotations)
        pat.labels[layer].append(mlabel)

def _gpaths_to_mpaths(
        pat: Pattern,
        global_args: dict[str, Any],
        elem: dict[str, Any],
        cc: int,
        ) -> None:
    elem_off = elem['offsets']          # which elements belong to each cell
    xy_val = elem['xy_arr']
    layer_tups = global_args['layer_tups']
    layer_inds = elem['layer_inds']
    prop_key = elem['prop_key']
    prop_val = elem['prop_val']

    elem_count = elem_off[cc + 1] - elem_off[cc]
    elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1)       # +1 to capture ending location for last elem
    xy_offs = elem['xy_off'][elem_slc]          # which xy coords belong to each element
    prop_offs = elem['prop_off'][elem_slc]      # which props belong to each element
    elem_layer_inds = layer_inds[elem_slc][:elem_count]
    elem_widths = elem['width'][elem_slc][:elem_count]
    elem_path_types = elem['path_type'][elem_slc][:elem_count]
    elem_extensions = elem['extensions'][elem_slc][:elem_count]

    zeros = numpy.zeros((elem_count, 2))
    raw_mode = global_args['raw_mode']
    for ee in range(elem_count):
        layer = layer_tups[elem_layer_inds[ee]]
        vertices = xy_val[xy_offs[ee]:xy_offs[ee + 1]]
        width = elem_widths[ee]
        cap_int = elem_path_types[ee]
        cap = path_cap_map[cap_int]
        if cap_int == 4:
            cap_extensions = elem_extensions[ee]
        else:
            cap_extensions = None

        annotations: None | dict[str, list[int | float | str]] = None
        prop_ii, prop_ff = prop_offs[ee], prop_offs[ee + 1]
        if prop_ii < prop_ff:
            annotations = {str(prop_key[off]): [prop_val[off]] for off in range(prop_ii, prop_ff)}

        path = Path(vertices=vertices, offset=zeros[ee], annotations=annotations, raw=raw_mode,
                    width=width, cap=cap, cap_extensions=cap_extensions)
        pat.shapes[layer].append(path)

def _boundaries_to_polygons(
        pat: Pattern,
        global_args: dict[str, Any],
        elem: dict[str, Any],
        cc: int,
        ) -> None:
    elem_off = elem['offsets']          # which elements belong to each cell
    xy_val = elem['xy_arr']
    layer_inds = elem['layer_inds']
    layer_tups = global_args['layer_tups']
    prop_key = elem['prop_key']
    prop_val = elem['prop_val']

    elem_count = elem_off[cc + 1] - elem_off[cc]
    elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1)       # +1 to capture ending location for last elem
    xy_offs = elem['xy_off'][elem_slc]          # which xy coords belong to each element
    xy_counts = xy_offs[1:] - xy_offs[:-1]
    prop_offs = elem['prop_off'][elem_slc]      # which props belong to each element
    prop_counts = prop_offs[1:] - prop_offs[:-1]
    elem_layer_inds = layer_inds[elem_slc][:elem_count]

    order = numpy.argsort(elem_layer_inds, stable=True)
    unilayer_inds, unilayer_first, unilayer_count = numpy.unique(elem_layer_inds, return_index=True, return_counts=True)

    zeros = numpy.zeros((elem_count, 2))
    raw_mode = global_args['raw_mode']
    for layer_ind, ff, nn in zip(unilayer_inds, unilayer_first, unilayer_count, strict=True):
        ee_inds = order[ff:ff + nn]
        layer = layer_tups[layer_ind]

        propless_mask = prop_counts[ee_inds] == 0
        poly_count_on_layer = propless_mask.sum()
        if poly_count_on_layer == 1:
            propless_mask[:] = 0        # Never make a 1-element collection
        elif poly_count_on_layer > 1:
            propless_vert_counts = xy_counts[ee_inds[propless_mask]] - 1        # -1 to drop closing point
            vertex_lists = numpy.empty((propless_vert_counts.sum(), 2), dtype=numpy.float64)
            vertex_offsets = numpy.cumsum(numpy.concatenate([[0], propless_vert_counts]))
            for ii, ee in enumerate(ee_inds[propless_mask]):
                vo = vertex_offsets[ii]
                vertex_lists[vo:vo + propless_vert_counts[ii]] = xy_val[xy_offs[ee]:xy_offs[ee + 1] - 1]
            polys = PolyCollection(vertex_lists=vertex_lists, vertex_offsets=vertex_offsets, offset=zeros[ee])
            pat.shapes[layer].append(polys)

        # Handle single polygons
        for ee in ee_inds[~propless_mask]:
            layer = layer_tups[elem_layer_inds[ee]]
            vertices = xy_val[xy_offs[ee]:xy_offs[ee + 1] - 1]      # -1 to drop closing point
            annotations: None | dict[str, list[int | float | str]] = None
            prop_ii, prop_ff = prop_offs[ee], prop_offs[ee + 1]
            if prop_ii < prop_ff:
                annotations = {str(prop_key[off]): prop_val[off] for off in range(prop_ii, prop_ff)}

            poly = Polygon(vertices=vertices, offset=zeros[ee], annotations=annotations, raw=raw_mode)
            pat.shapes[layer].append(poly)


#def _properties_to_annotations(properties: pyarrow.Array) -> annotations_t:
# return {prop['key'].as_py(): prop['value'].as_py() for prop in properties}

def check_valid_names(
        names: Iterable[str],
        max_length: int = 32,
        ) -> None:
    """
    Check all provided names to see if they're valid GDSII cell names.

    Args:
        names: Collection of names to check
        max_length: Max allowed length
    """
    allowed_chars = set(string.ascii_letters + string.digits + '_?$')

    bad_chars = [
        name for name in names
        if not set(name).issubset(allowed_chars)
        ]
    bad_lengths = [
        name for name in names
        if len(name) > max_length
        ]

    if bad_chars:
        logger.error('Names contain invalid characters:\n' + pformat(bad_chars))
    if bad_lengths:
        logger.error(f'Names too long (>{max_length}):\n' + pformat(bad_lengths))
    if bad_chars or bad_lengths:
        raise LibraryError('Library contains invalid names, see log above')
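
A hypothetical usage sketch for the new reader, assuming the shared library path hard-coded above resolves on the local machine and the module is importable as `masque.file.gdsii_arrow`; the filename is illustrative:

```
from masque.file import gdsii_arrow

# readfile() drives the Rust/Arrow reader and returns (Library, header info).
lib, info = gdsii_arrow.readfile('example.gds')
print(info['name'], info['meters_per_unit'])

# Library behaves as a name -> Pattern mapping.
for name, pattern in lib.items():
    n_shapes = sum(len(shapes) for shapes in pattern.shapes.values())
    print(f'{name}: {n_shapes} shapes, {len(pattern.refs)} ref targets')
```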

View File

@ -141,6 +141,7 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
Args:
tops: Name(s) of the pattern(s) to check.
Default is all patterns in the library.
skip: Memo, set of patterns which have already been traversed.
Returns:
Set of all referenced pattern names
@ -273,7 +274,7 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
For an in-place variant, see `Pattern.flatten`.
Args:
tops: The pattern(s) to flatten.
flatten_ports: If `True`, keep ports from any referenced
patterns; otherwise discard them.
dangling_ok: If `True`, no error will be thrown if any dangling_ok: If `True`, no error will be thrown if any

View File

@ -1241,7 +1241,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
ports specified by `map_out`.
Examples:
=========
- `my_pat.plug(subdevice, {'A': 'C', 'B': 'B'}, map_out={'D': 'myport'})`
instantiates `subdevice` into `my_pat`, plugging ports 'A' and 'B'
of `my_pat` into ports 'C' and 'B' of `subdevice`. The connected ports