Compare commits


4 Commits

71 changed files with 6029 additions and 9638 deletions

README.md

@@ -3,226 +3,43 @@
Masque is a Python module for designing lithography masks.
The general idea is to implement something resembling the GDSII file-format, but
with some vectorized element types (e.g. circles, not just polygons) and the ability
to output to multiple formats.

- [Source repository](https://mpxd.net/code/jan/masque)
- [PyPI](https://pypi.org/project/masque)
- [Github mirror](https://github.com/anewusername/masque)

## Installation

Requirements:
* python >= 3.11
* numpy
* klamath (used for GDSII i/o)

Optional requirements:
* `ezdxf` (DXF i/o): ezdxf
* `oasis` (OASIS i/o): fatamorgana
* `svg` (SVG output): svgwrite
* `visualization` (shape plotting): matplotlib
* `text` (`Text` shape): matplotlib, freetype

Install with pip:
```bash
pip install 'masque[oasis,dxf,svg,visualization,text]'
```

Alternatively, install from git:
```bash
pip3 install git+https://mpxd.net/code/jan/masque.git@release
```

## Overview

A layout consists of a hierarchy of `Pattern`s stored in a single `Library`.
Each `Pattern` can contain `Ref`s pointing at other patterns, `Shape`s, `Label`s, and `Port`s.
`masque` departs from several "classic" GDSII paradigms (a short sketch tying these together follows this list):
- A `Pattern` object does not store its own name. A name is only assigned when the pattern is placed
into a `Library`, which is effectively a name->`Pattern` mapping.
- Layer info for `Shape`s and `Label`s is not stored in the individual shape and label objects.
Instead, the layer is determined by the key for the container dict (e.g. `pattern.shapes[layer]`).
* This simplifies many common tasks: filtering `Shape`s by layer, remapping layers, and checking if
a layer is empty.
* Technically, this allows reusing the same shape or label object across multiple layers. This isn't
part of the standard workflow since a mixture of single-use and multi-use shapes could be confusing.
* This is similar to the approach used in [KLayout](https://www.klayout.de)
- `Ref` target names are also determined in the key of the container dict (e.g. `pattern.refs[target_name]`).
* This similarly simplifies filtering `Ref`s by target name, updating to a new target, and checking
if a given `Pattern` is referenced.
- `Pattern` names are set by their containing `Library` and are not stored in the `Pattern` objects.
* This guarantees that there are no duplicate pattern names within any given `Library`.
* Likewise, enumerating all the names (and all the `Pattern`s) in a `Library` is straightforward.
- Each `Ref`, `Shape`, or `Label` can be repeated multiple times by attaching a `repetition` object to it.
    * This is similar to how OASIS repetitions are handled, and provides extra flexibility over the GDSII
approach of only allowing arrays through AREF (`Ref` + `repetition`).
- `Label`s do not have an orientation or presentation
* This is in line with how they are used in practice, and how they are represented in OASIS.
- Non-polygonal `Shape`s are allowed. For example, elliptical arcs are a basic shape type.
* This enables compatibility with OASIS (e.g. circles) and other formats.
* `Shape`s provide a `.to_polygons()` method for GDSII compatibility.
- Most coordinate values are stored as 64-bit floats internally.
    * 1 earth radius in nanometers (~6e15) is still represented without approximation (53 bit mantissa -> 2^53 > 9e15)
    * Operations whose results would otherwise be clipped or rounded (e.g. to fit integer coordinates) are still represented, just approximately.
* Memory usage is usually dominated by other Python overhead.
- `Pattern` objects also contain `Port` information, which can be used to "snap" together
multiple sub-components by matching up the requested port offsets and rotations.
* Port rotations are defined as counter-clockwise angles from the +x axis.
* Ports point into the interior of their associated device.
* Port rotations may be `None` in the case of non-oriented ports.
* Ports have a `ptype` string which is compared in order to catch mismatched connections at build time.
* Ports can be exported into/imported from `Label`s stored directly in the layout,
editable from standard tools (e.g. KLayout). A default format is provided.
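
As a rough sketch, these conventions combine as shown below. The layer tuples, the `some_subcell` target, and the cell/port names are arbitrary placeholders, and constructor arguments are abbreviated; the same calls appear in the shorthand sections further down.

```python3
from numpy import pi
from masque import Library, Pattern, Ref, Polygon, Label, Port
from masque.repetition import Grid

lib = Library()
pat = Pattern()

# Shapes and labels are keyed by layer...
pat.shapes[(1, 0)].append(Polygon(vertices=[(0, 0), (1000, 0), (1000, 1000), (0, 1000)]))
pat.labels[(10, 0)].append(Label('alignment note', offset=(500, 500)))

# ...while refs are keyed by the *name* of the target pattern. The target
# doesn't have to exist yet -- only its name must already be decided.
pat.refs['some_subcell'].append(Ref(offset=(2000, 0), rotation=pi / 2))

# Any ref (or shape/label) can carry a `repetition`, OASIS-style.
rep = Grid(a_vector=[3000, 0], b_vector=[0, 3000], a_count=4, b_count=4)
pat.refs['some_subcell'].append(Ref(offset=(0, 5000), repetition=rep))

# Ports live on the pattern; rotations are CCW from +x and point into the device.
pat.ports['west'] = Port((0, 500), rotation=0, ptype='wg')

# The pattern only receives a name once it is placed into a `Library`.
lib['my_cell'] = pat
```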
In one important way, `masque` stays very orthodox:
References are accomplished by listing the target's name, not its `Pattern` object
(a short example follows this list).
- The main downside of this is that any operations that traverse the hierarchy require
    both the `Pattern` and the `Library` which contains its reference targets.
- This guarantees that names within a `Library` remain unique at all times.
* Since this can be tedious in cases where you don't actually care about the name of a
pattern, patterns whose names start with `SINGLE_USE_PREFIX` (default: an underscore)
may be silently renamed in order to maintain uniqueness.
See `masque.library.SINGLE_USE_PREFIX`, `masque.library._rename_patterns()`,
and `ILibrary.add()` for more details.
- Having all patterns accessible through the `Library` avoids having to perform a
tree traversal for every operation which needs to touch all `Pattern` objects
(e.g. deleting a layer everywhere or scaling all patterns).
- Since `Pattern` doesn't know its own name, you can't create a reference by passing in
a `Pattern` object -- you need to know its name.
- You *can* reference a `Pattern` before it is created, so long as you have already decided
on its name.
- Functions like `Pattern.place()` and `Pattern.plug()` need to receive a pattern's name
in order to create a reference, but they also need to access the pattern's ports.
* One way to provide this data is through an `Abstract`, generated via
`Library.abstract()` or through a `Library.abstract_view()`.
    * Another way is to use `Builder.place()` or `Builder.plug()`, which automatically creates
an `Abstract` from its internally-referenced `Library`.
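
A condensed sketch of that workflow, reusing calls shown in the sections below (`'waveguide'` and `'top'` are placeholder names, and exact signatures may differ slightly):

```python3
lib = Library()

# `mkpat` registers an empty pattern under the given name and hands back (name, pattern).
wg_name, wg_pat = lib.mkpat('waveguide')

top = Pattern()
lib['top'] = top

# A plain reference only needs the target's *name*...
top.ref(wg_name, offset=(0, 0))

# ...but port-aware placement also needs the target's ports, which an
# `Abstract` (name + ports, no geometry) provides:
top.place(lib.abstract(wg_name), offset=(5000, 0))

# A `Builder` wraps the library, so the `lib.abstract(...)` lookup becomes implicit:
builder = Builder(library=lib, name='top2')
builder.place(wg_name, offset=(0, 0))
```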
## Glossary
- `Library`: A collection of named cells. OASIS or GDS "library" or file.
- `Tree`: Any `{name: pattern}` mapping which has only one topcell.
- `Pattern`: A collection of geometry, text labels, and references to other patterns.
OASIS or GDS "Cell", DXF "Block".
- `Ref`: A reference to another pattern. GDS "AREF/SREF", OASIS "Placement".
- `Shape`: Individual geometric entity. OASIS or GDS "Geometry element", DXF "LWPolyline" or "Polyline".
- `repetition`: Repetition operation. OASIS "repetition".
GDS "AREF" is a `Ref` combined with a `Grid` repetition.
- `Label`: Text label. Not rendered into geometry. OASIS, GDS, DXF "Text".
- `annotation`: Additional metadata. OASIS or GDS "property".
## Syntax, shorthand, and design patterns
Most syntax and behavior should follow normal python conventions.
There are a few exceptions, either meant to catch common mistakes or to provide a shorthand for common operations:
### `Library` objects don't allow overwriting already-existing patterns
```python3
library['mycell'] = pattern0
library['mycell'] = pattern1 # Error! 'mycell' already exists and can't be overwritten
del library['mycell'] # We can explicitly delete it
library['mycell'] = pattern1 # And now it's ok to assign a new value
library.delete('mycell') # This also deletes all refs pointing to 'mycell' by default
```

### Insert a newly-made hierarchical pattern (with children) into a layout
```python3
# Let's say we have a function which returns a new library containing one topcell (and possibly children)
tree = make_tree(...)

# To reference this cell in our layout, we have to add all its children to our `library` first:
top_name = tree.top()                            # get the name of the topcell
name_mapping = library.add(tree)                 # add all patterns from `tree`, renaming eligible conflicting patterns
new_name = name_mapping.get(top_name, top_name)  # get the new name for the cell (in case it was auto-renamed)
my_pattern.ref(new_name, ...) # instantiate the cell
# This can be accomplished as follows
new_name = library << tree # Add `tree` into `library` and return the top cell's new name
my_pattern.ref(new_name, ...) # instantiate the cell
# In practice, you may do lots of
my_pattern.ref(lib << make_tree(...), ...)
# With a `Builder` and `place()`/`plug()` the `lib <<` portion can be implicit:
my_builder = Builder(library=lib, ...)
...
my_builder.place(make_tree(...))
```
We can also use this shorthand to quickly add and reference a single flat (as yet un-named) pattern:
```python3
anonymous_pattern = Pattern(...)
my_pattern.ref(lib << {'_tentative_name': anonymous_pattern}, ...)
```
### Place a hierarchical pattern into a layout, preserving its port info
```python3
# As above, we have a function that makes a new library containing one topcell (and possibly children)
tree = make_tree(...)
# We need to go get its port info to `place()` it into our existing layout,
new_name = library << tree # Add the tree to the library and return its name (see `<<` above)
abstract = library.abstract(tree) # An `Abstract` stores a pattern's name and its ports (but no geometry)
my_pattern.place(abstract, ...)
# With shorthand,
abstract = library <= tree
my_pattern.place(abstract, ...)
# or
my_pattern.place(library << make_tree(...), ...)
```
### Quickly add geometry, labels, or refs:
The long form for adding elements can be overly verbose:
```python3
my_pattern.shapes[layer].append(Polygon(vertices, ...))
my_pattern.labels[layer] += [Label('my text')]
my_pattern.refs[target_name].append(Ref(offset=..., ...))
```
There is shorthand for the most common elements:
```python3
my_pattern.polygon(layer=layer, vertices=vertices, ...)
my_pattern.rect(layer=layer, xctr=..., xmin=..., ymax=..., ly=...) # rectangle; pick 4 of 6 constraints
my_pattern.rect(layer=layer, ymin=..., ymax=..., xctr=..., lx=...)
my_pattern.path(...)
my_pattern.label(layer, 'my_text')
my_pattern.ref(target_name, offset=..., ...)
```
### Accessing ports
```python3
# Square brackets pull from the underlying `.ports` dict:
assert pattern['input'] is pattern.ports['input']
# And you can use them to read multiple ports at once:
assert pattern[('input', 'output')] == {
'input': pattern.ports['input'],
'output': pattern.ports['output'],
}
# But you shouldn't use them for anything except reading
pattern['input'] = Port(...) # Error!
has_input = ('input' in pattern) # Error!
```
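
To modify ports (or test membership), presumably the underlying `.ports` mapping is used directly, as the pather example further below does (`'unused'` is just a placeholder name here):

```python3
pattern.ports['input'] = Port((0, 0), rotation=0, ptype='pcwg')  # add or overwrite a port
has_input = 'input' in pattern.ports                             # membership check
del pattern.ports['unused']                                      # remove a port
```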
### Building patterns
```python3
library = Library(...)
my_pattern_name, my_pattern = library.mkpat(some_name_generator())
...
def _make_my_subpattern() -> str:
# This function can draw from the outer scope (e.g. `library`) but will not pollute the outer scope
# (e.g. the variable `subpattern` will not be accessible from outside the function; you must load it
# from within `library`).
subpattern_name, subpattern = library.mkpat(...)
subpattern.rect(...)
...
return subpattern_name
my_pattern.ref(_make_my_subpattern(), offset=..., ...)
```
## TODO

@@ -230,8 +47,5 @@ my_pattern.ref(_make_my_subpattern(), offset=..., ...)

* Better interface for polygon operations (e.g. with `pyclipper`)
    - de-embedding
    - boolean ops
* Tests tests tests
* check renderpather
* pather and renderpather examples
* context manager for retool
* allow a specific mismatch when connecting ports


@@ -2,33 +2,29 @@
import numpy import numpy
from masque.file import gdsii import masque
from masque import Arc, Pattern import masque.file.klamath
from masque import shapes
def main(): def main():
pat = Pattern() pat = masque.Pattern(name='ellip_grating')
layer = (0, 0) for rmin in numpy.arange(10, 15, 0.5):
pat.shapes[layer].extend([ pat.shapes.append(shapes.Arc(
Arc(
radii=(rmin, rmin), radii=(rmin, rmin),
width=0.1, width=0.1,
angles=(-numpy.pi/4, numpy.pi/4), angles=(-numpy.pi/4, numpy.pi/4),
) layer=(0, 0),
for rmin in numpy.arange(10, 15, 0.5)] ))
)
pat.label(string='grating centerline', offset=(1, 0), layer=(1, 2)) pat.labels.append(masque.Label(string='grating centerline', offset=(1, 0), layer=(1, 2)))
pat.scale_by(1000) pat.scale_by(1000)
pat.visualize() pat.visualize()
pat2 = pat.copy()
pat2.name = 'grating2'
lib = { masque.file.klamath.writefile((pat, pat2), 'out.gds.gz', 1e-9, 1e-3)
'ellip_grating': pat,
'grating2': pat.copy(),
}
gdsii.writefile(lib, 'out.gds.gz', meters_per_unit=1e-9, logical_units_per_unit=1e-3)
if __name__ == '__main__': if __name__ == '__main__':


@@ -1,29 +0,0 @@
import numpy
from pyclipper import (
Pyclipper, PT_CLIP, PT_SUBJECT, CT_UNION, CT_INTERSECTION, PFT_NONZERO,
scale_to_clipper, scale_from_clipper,
)
p = Pyclipper()
p.AddPaths([
[(-10, -10), (-10, 10), (-9, 10), (-9, -10)],
[(-10, 10), (10, 10), (10, 9), (-10, 9)],
[(10, 10), (10, -10), (9, -10), (9, 10)],
[(10, -10), (-10, -10), (-10, -9), (10, -9)],
], PT_SUBJECT, closed=True)
#p.Execute2?
#p.Execute?
p.Execute(PT_UNION, PT_NONZERO, PT_NONZERO)    # NameError: PT_UNION / PT_NONZERO were never imported (and don't exist)
p.Execute(CT_UNION, PT_NONZERO, PT_NONZERO)    # NameError: PT_NONZERO doesn't exist (the fill type is PFT_NONZERO)
p.Execute(CT_UNION, PFT_NONZERO, PFT_NONZERO)  # works: clip type CT_UNION with PFT_NONZERO fill rules
p = Pyclipper()
p.AddPaths([
[(-10, -10), (-10, 10), (-9, 10), (-9, -10)],
[(-10, 10), (10, 10), (10, 9), (-10, 9)],
[(10, 10), (10, -10), (9, -10), (9, 10)],
[(10, -10), (-10, -10), (-10, -9), (10, -9)],
], PT_SUBJECT, closed=True)
r = p.Execute2(CT_UNION, PFT_NONZERO, PFT_NONZERO)
#r.Childs


@@ -1,43 +0,0 @@
# pip install pillow scikit-image
# or
# sudo apt install python3-pil python3-skimage
from PIL import Image
from skimage.measure import find_contours
from matplotlib import pyplot
import numpy
from masque import Pattern, Polygon
from masque.file.gdsii import writefile
#
# Read the image into a numpy array
#
im = Image.open('./Desktop/Camera/IMG_20220626_091101.jpg')
aa = numpy.array(im.convert(mode='L').getdata()).reshape(im.height, im.width)
threshold = (aa.max() - aa.min()) / 2
#
# Find edge contours and plot them
#
contours = find_contours(aa, threshold)
pyplot.imshow(aa)
for contour in contours:
pyplot.plot(contour[:, 1], contour[:, 0], linewidth=2)
pyplot.show(block=False)
#
# Create the layout from the contours
#
pat = Pattern()
pat.shapes[(0, 0)].extend([
Polygon(vertices=vv) for vv in contours if len(vv) < 1_000
])
lib = {}
lib['my_mask_name'] = pat
writefile(lib, 'test_contours.gds', meters_per_unit=1e-9)


@@ -1,138 +1,103 @@
from pprint import pprint
from pathlib import Path
import numpy import numpy
from numpy import pi from numpy import pi
import masque import masque
from masque import Pattern, Ref, Arc, Library import masque.file.gdsii
import masque.file.klamath
import masque.file.dxf
import masque.file.oasis
from masque import shapes, Pattern, SubPattern
from masque.repetition import Grid from masque.repetition import Grid
from masque.file import gdsii, dxf, oasis
from pprint import pprint
def main(): def main():
lib = Library() pat = masque.Pattern(name='ellip_grating')
cell_name = 'ellip_grating'
pat = masque.Pattern()
layer = (0, 0)
for rmin in numpy.arange(10, 15, 0.5): for rmin in numpy.arange(10, 15, 0.5):
pat.shapes[layer].append(Arc( pat.shapes.append(shapes.Arc(
radii=(rmin, rmin), radii=(rmin, rmin),
width=0.1, width=0.1,
angles=(0 * -pi/4, pi/4), angles=(0*-numpy.pi/4, numpy.pi/4),
annotations={'1': ['blah']}, annotations={'1': ['blah']},
)) ))
pat.scale_by(1000) pat.scale_by(1000)
# pat.visualize() # pat.visualize()
lib[cell_name] = pat pat2 = pat.copy()
print(f'\nAdded {cell_name}:') pat2.name = 'grating2'
pat3 = Pattern('sref_test')
pat3.subpatterns = [
SubPattern(pat, offset=(1e5, 3e5), annotations={'4': ['Hello I am the base subpattern']}),
SubPattern(pat, offset=(2e5, 3e5), rotation=pi/3),
SubPattern(pat, offset=(3e5, 3e5), rotation=pi/2),
SubPattern(pat, offset=(4e5, 3e5), rotation=pi),
SubPattern(pat, offset=(5e5, 3e5), rotation=3*pi/2),
SubPattern(pat, mirrored=(True, False), offset=(1e5, 4e5)),
SubPattern(pat, mirrored=(True, False), offset=(2e5, 4e5), rotation=pi/3),
SubPattern(pat, mirrored=(True, False), offset=(3e5, 4e5), rotation=pi/2),
SubPattern(pat, mirrored=(True, False), offset=(4e5, 4e5), rotation=pi),
SubPattern(pat, mirrored=(True, False), offset=(5e5, 4e5), rotation=3*pi/2),
SubPattern(pat, mirrored=(False, True), offset=(1e5, 5e5)),
SubPattern(pat, mirrored=(False, True), offset=(2e5, 5e5), rotation=pi/3),
SubPattern(pat, mirrored=(False, True), offset=(3e5, 5e5), rotation=pi/2),
SubPattern(pat, mirrored=(False, True), offset=(4e5, 5e5), rotation=pi),
SubPattern(pat, mirrored=(False, True), offset=(5e5, 5e5), rotation=3*pi/2),
SubPattern(pat, mirrored=(True, True), offset=(1e5, 6e5)),
SubPattern(pat, mirrored=(True, True), offset=(2e5, 6e5), rotation=pi/3),
SubPattern(pat, mirrored=(True, True), offset=(3e5, 6e5), rotation=pi/2),
SubPattern(pat, mirrored=(True, True), offset=(4e5, 6e5), rotation=pi),
SubPattern(pat, mirrored=(True, True), offset=(5e5, 6e5), rotation=3*pi/2),
]
pprint(pat3)
pprint(pat3.subpatterns)
pprint(pat.shapes) pprint(pat.shapes)
new_name = lib.get_name(cell_name) rep = Grid(a_vector=[1e4, 0],
lib[new_name] = pat.copy()
print(f'\nAdded a copy of {cell_name} as {new_name}')
pat3 = Pattern()
pat3.refs[cell_name] = [
Ref(offset=(1e5, 3e5), annotations={'4': ['Hello I am the base Ref']}),
Ref(offset=(2e5, 3e5), rotation=pi/3),
Ref(offset=(3e5, 3e5), rotation=pi/2),
Ref(offset=(4e5, 3e5), rotation=pi),
Ref(offset=(5e5, 3e5), rotation=3*pi/2),
Ref(mirrored=True, offset=(1e5, 4e5)),
Ref(mirrored=True, offset=(2e5, 4e5), rotation=pi/3),
Ref(mirrored=True, offset=(3e5, 4e5), rotation=pi/2),
Ref(mirrored=True, offset=(4e5, 4e5), rotation=pi),
Ref(mirrored=True, offset=(5e5, 4e5), rotation=3*pi/2),
Ref(offset=(1e5, 5e5)).mirror_target(1),
Ref(offset=(2e5, 5e5), rotation=pi/3).mirror_target(1),
Ref(offset=(3e5, 5e5), rotation=pi/2).mirror_target(1),
Ref(offset=(4e5, 5e5), rotation=pi).mirror_target(1),
Ref(offset=(5e5, 5e5), rotation=3*pi/2).mirror_target(1),
Ref(offset=(1e5, 6e5)).mirror2d_target(True, True),
Ref(offset=(2e5, 6e5), rotation=pi/3).mirror2d_target(True, True),
Ref(offset=(3e5, 6e5), rotation=pi/2).mirror2d_target(True, True),
Ref(offset=(4e5, 6e5), rotation=pi).mirror2d_target(True, True),
Ref(offset=(5e5, 6e5), rotation=3*pi/2).mirror2d_target(True, True),
]
lib['sref_test'] = pat3
print('\nAdded sref_test:')
pprint(pat3)
pprint(pat3.refs)
rep = Grid(
a_vector=[1e4, 0],
b_vector=[0, 1.5e4], b_vector=[0, 1.5e4],
a_count=3, a_count=3,
b_count=2, b_count=2,)
) pat4 = Pattern('aref_test')
pat4 = Pattern() pat4.subpatterns = [
pat4.refs[cell_name] = [ SubPattern(pat, repetition=rep, offset=(1e5, 3e5)),
Ref(repetition=rep, offset=(1e5, 3e5)), SubPattern(pat, repetition=rep, offset=(2e5, 3e5), rotation=pi/3),
Ref(repetition=rep, offset=(2e5, 3e5), rotation=pi/3), SubPattern(pat, repetition=rep, offset=(3e5, 3e5), rotation=pi/2),
Ref(repetition=rep, offset=(3e5, 3e5), rotation=pi/2), SubPattern(pat, repetition=rep, offset=(4e5, 3e5), rotation=pi),
Ref(repetition=rep, offset=(4e5, 3e5), rotation=pi), SubPattern(pat, repetition=rep, offset=(5e5, 3e5), rotation=3*pi/2),
Ref(repetition=rep, offset=(5e5, 3e5), rotation=3*pi/2), SubPattern(pat, repetition=rep, mirrored=(True, False), offset=(1e5, 4e5)),
Ref(repetition=rep, mirrored=True, offset=(1e5, 4e5)), SubPattern(pat, repetition=rep, mirrored=(True, False), offset=(2e5, 4e5), rotation=pi/3),
Ref(repetition=rep, mirrored=True, offset=(2e5, 4e5), rotation=pi/3), SubPattern(pat, repetition=rep, mirrored=(True, False), offset=(3e5, 4e5), rotation=pi/2),
Ref(repetition=rep, mirrored=True, offset=(3e5, 4e5), rotation=pi/2), SubPattern(pat, repetition=rep, mirrored=(True, False), offset=(4e5, 4e5), rotation=pi),
Ref(repetition=rep, mirrored=True, offset=(4e5, 4e5), rotation=pi), SubPattern(pat, repetition=rep, mirrored=(True, False), offset=(5e5, 4e5), rotation=3*pi/2),
Ref(repetition=rep, mirrored=True, offset=(5e5, 4e5), rotation=3*pi/2), SubPattern(pat, repetition=rep, mirrored=(False, True), offset=(1e5, 5e5)),
Ref(repetition=rep, offset=(1e5, 5e5)).mirror_target(1), SubPattern(pat, repetition=rep, mirrored=(False, True), offset=(2e5, 5e5), rotation=pi/3),
Ref(repetition=rep, offset=(2e5, 5e5), rotation=pi/3).mirror_target(1), SubPattern(pat, repetition=rep, mirrored=(False, True), offset=(3e5, 5e5), rotation=pi/2),
Ref(repetition=rep, offset=(3e5, 5e5), rotation=pi/2).mirror_target(1), SubPattern(pat, repetition=rep, mirrored=(False, True), offset=(4e5, 5e5), rotation=pi),
Ref(repetition=rep, offset=(4e5, 5e5), rotation=pi).mirror_target(1), SubPattern(pat, repetition=rep, mirrored=(False, True), offset=(5e5, 5e5), rotation=3*pi/2),
Ref(repetition=rep, offset=(5e5, 5e5), rotation=3*pi/2).mirror_target(1), SubPattern(pat, repetition=rep, mirrored=(True, True), offset=(1e5, 6e5)),
Ref(repetition=rep, offset=(1e5, 6e5)).mirror2d_target(True, True), SubPattern(pat, repetition=rep, mirrored=(True, True), offset=(2e5, 6e5), rotation=pi/3),
Ref(repetition=rep, offset=(2e5, 6e5), rotation=pi/3).mirror2d_target(True, True), SubPattern(pat, repetition=rep, mirrored=(True, True), offset=(3e5, 6e5), rotation=pi/2),
Ref(repetition=rep, offset=(3e5, 6e5), rotation=pi/2).mirror2d_target(True, True), SubPattern(pat, repetition=rep, mirrored=(True, True), offset=(4e5, 6e5), rotation=pi),
Ref(repetition=rep, offset=(4e5, 6e5), rotation=pi).mirror2d_target(True, True), SubPattern(pat, repetition=rep, mirrored=(True, True), offset=(5e5, 6e5), rotation=3*pi/2),
Ref(repetition=rep, offset=(5e5, 6e5), rotation=3*pi/2).mirror2d_target(True, True),
] ]
lib['aref_test'] = pat4 folder = 'layouts/'
print('\nAdded aref_test') masque.file.klamath.writefile((pat, pat2, pat3, pat4), folder + 'rep.gds.gz', 1e-9, 1e-3)
folder = Path('./layouts/') cells = list(masque.file.klamath.readfile(folder + 'rep.gds.gz')[0].values())
folder.mkdir(exist_ok=True) masque.file.klamath.writefile(cells, folder + 'rerep.gds.gz', 1e-9, 1e-3)
print(f'...writing files to {folder}...')
gds1 = folder / 'rep.gds.gz' masque.file.dxf.writefile(pat4, folder + 'rep.dxf.gz')
gds2 = folder / 'rerep.gds.gz' dxf, info = masque.file.dxf.readfile(folder + 'rep.dxf.gz')
print(f'Initial write to {gds1}') masque.file.dxf.writefile(dxf, folder + 'rerep.dxf.gz')
gdsii.writefile(lib, gds1, 1e-9, 1e-3)
print(f'Read back and rewrite to {gds2}')
readback_lib, _info = gdsii.readfile(gds1)
gdsii.writefile(readback_lib, gds2, 1e-9, 1e-3)
dxf1 = folder / 'rep.dxf.gz'
dxf2 = folder / 'rerep.dxf.gz'
print(f'Write aref_test to {dxf1}')
dxf.writefile(lib, 'aref_test', dxf1)
print(f'Read back and rewrite to {dxf2}')
dxf_lib, _info = dxf.readfile(dxf1)
print(Library(dxf_lib))
dxf.writefile(dxf_lib, 'Model', dxf2)
layer_map = {'base': (0,0), 'mylabel': (1,2)} layer_map = {'base': (0,0), 'mylabel': (1,2)}
oas1 = folder / 'rep.oas' masque.file.oasis.writefile((pat, pat2, pat3, pat4), folder + 'rep.oas.gz', 1000, layer_map=layer_map)
oas2 = folder / 'rerep.oas' oas, info = masque.file.oasis.readfile(folder + 'rep.oas.gz')
print(f'Write lib to {oas1}') masque.file.oasis.writefile(list(oas.values()), folder + 'rerep.oas.gz', 1000, layer_map=layer_map)
oasis.writefile(lib, oas1, 1000, layer_map=layer_map) print(info)
print(f'Read back and rewrite to {oas2}')
oas_lib, oas_info = oasis.readfile(oas1)
oasis.writefile(oas_lib, oas2, 1000, layer_map=layer_map)
print('OASIS info:')
pprint(oas_info)
if __name__ == '__main__': if __name__ == '__main__':


@@ -1,39 +0,0 @@
masque Tutorial
===============
Contents
--------
- [basic_shapes](basic_shapes.py):
* Draw basic geometry
* Export to GDS
- [devices](devices.py)
* Reference other patterns
* Add ports to a pattern
* Snap ports together to build a circuit
* Check for dangling references
- [library](library.py)
* Create a `LazyLibrary`, which loads / generates patterns only when they are first used
* Explore alternate ways of specifying a pattern for `.plug()` and `.place()`
* Design a pattern which is meant to plug into an existing pattern (via `.interface()`)
- [pather](pather.py)
* Use `Pather` to route individual wires and wire bundles
* Use `BasicTool` to generate paths
* Use `BasicTool` to automatically transition between path types
- [renderpather](renderpather.py)
* Use `RenderPather` and `PathTool` to build a layout similar to the one in [pather](pather.py),
but using `Path` shapes instead of `Polygon`s.
Additionally, [pcgen](pcgen.py) is a utility module for generating photonic crystal lattices.
Running
-------
Run from inside the examples directory:
```bash
cd examples/tutorial
python3 basic_shapes.py
klayout -e basic_shapes.gds
```

basic_shapes.py

@@ -1,21 +1,21 @@
from collections.abc import Sequence from typing import Tuple, Sequence
import numpy import numpy
from numpy import pi from numpy import pi
from masque import ( from masque import layer_t, Pattern, SubPattern, Label
layer_t, Pattern, Label, Port, from masque.shapes import Circle, Arc, Polygon
Circle, Arc, Polygon, from masque.builder import Device, Port
) from masque.library import Library, DeviceLibrary
import masque.file.gdsii import masque.file.gdsii
# Note that masque units are arbitrary, and are only given # Note that masque units are arbitrary, and are only given
# physical significance when writing to a file. # physical significance when writing to a file.
GDS_OPTS = dict( GDS_OPTS = {
meters_per_unit = 1e-9, # GDS database unit, 1 nanometer 'meters_per_unit': 1e-9, # GDS database unit, 1 nanometer
logical_units_per_unit = 1e-3, # GDS display unit, 1 micron 'logical_units_per_unit': 1e-3, # GDS display unit, 1 micron
) }
def hole( def hole(
@@ -30,12 +30,11 @@ def hole(
layer: Layer to draw the circle on. layer: Layer to draw the circle on.
Returns: Returns:
Pattern containing a circle. Pattern, named `'hole'`
""" """
pat = Pattern() pat = Pattern('hole', shapes=[
pat.shapes[layer].append( Circle(radius=radius, offset=(0, 0), layer=layer)
Circle(radius=radius, offset=(0, 0)) ])
)
return pat return pat
@@ -51,7 +50,7 @@ def triangle(
layer: Layer to draw the circle on. layer: Layer to draw the circle on.
Returns: Returns:
Pattern containing a triangle Pattern, named `'triangle'`
""" """
vertices = numpy.array([ vertices = numpy.array([
(numpy.cos( pi / 2), numpy.sin( pi / 2)), (numpy.cos( pi / 2), numpy.sin( pi / 2)),
@@ -59,9 +58,8 @@
(numpy.cos( - pi / 6), numpy.sin( - pi / 6)), (numpy.cos( - pi / 6), numpy.sin( - pi / 6)),
]) * radius ]) * radius
pat = Pattern() pat = Pattern('triangle', shapes=[
pat.shapes[layer].extend([ Polygon(offset=(0, 0), layer=layer, vertices=vertices),
Polygon(offset=(0, 0), vertices=vertices),
]) ])
return pat return pat
@@ -80,40 +78,37 @@ def smile(
secondary_layer: Layer to draw eyes and smile on. secondary_layer: Layer to draw eyes and smile on.
Returns: Returns:
Pattern containing a smiley face Pattern, named `'smile'`
""" """
# Make an empty pattern # Make an empty pattern
pat = Pattern() pat = Pattern('smile')
# Add all the shapes we want # Add all the shapes we want
pat.shapes[layer] += [ pat.shapes += [
Circle(radius=radius, offset=(0, 0)), # Outer circle Circle(radius=radius, offset=(0, 0), layer=layer), # Outer circle
] Circle(radius=radius / 10, offset=(radius / 3, radius / 3), layer=secondary_layer),
Circle(radius=radius / 10, offset=(-radius / 3, radius / 3), layer=secondary_layer),
pat.shapes[secondary_layer] += [ Arc(radii=(radius * 2 / 3, radius * 2 / 3), # Underlying ellipse radii
Circle(radius=radius / 10, offset=(radius / 3, radius / 3)),
Circle(radius=radius / 10, offset=(-radius / 3, radius / 3)),
Arc(
radii=(radius * 2 / 3, radius * 2 / 3), # Underlying ellipse radii
angles=(7 / 6 * pi, 11 / 6 * pi), # Angles limiting the arc angles=(7 / 6 * pi, 11 / 6 * pi), # Angles limiting the arc
width=radius / 10, width=radius / 10,
offset=(0, 0), offset=(0, 0),
), layer=secondary_layer),
] ]
return pat return pat
def main() -> None: def main() -> None:
lib = {} hole_pat = hole(1000)
smile_pat = smile(1000)
tri_pat = triangle(1000)
lib['hole'] = hole(1000) units_per_meter = 1e-9
lib['smile'] = smile(1000) units_per_display_unit = 1e-3
lib['triangle'] = triangle(1000)
masque.file.gdsii.writefile(lib, 'basic_shapes.gds', **GDS_OPTS) masque.file.gdsii.writefile([hole_pat, tri_pat, smile_pat], 'basic_shapes.gds', **GDS_OPTS)
lib['triangle'].visualize() smile_pat.visualize()
if __name__ == '__main__': if __name__ == '__main__':

devices.py

@@ -1,14 +1,12 @@
from collections.abc import Sequence, Mapping from typing import Tuple, Sequence, Dict
import numpy import numpy
from numpy import pi from numpy import pi
from masque import ( from masque import layer_t, Pattern, SubPattern, Label
layer_t, Pattern, Ref, Label, Builder, Port, Polygon, from masque.shapes import Polygon
Library, ILibraryView, from masque.builder import Device, Port, port_utils
) from masque.file.gdsii import writefile
from masque.utils import ports2data
from masque.file.gdsii import writefile, check_valid_names
import pcgen import pcgen
import basic_shapes import basic_shapes
@@ -19,41 +17,40 @@ LATTICE_CONSTANT = 512
RADIUS = LATTICE_CONSTANT / 2 * 0.75 RADIUS = LATTICE_CONSTANT / 2 * 0.75
def ports_to_data(pat: Pattern) -> Pattern: def dev2pat(dev: Device) -> Pattern:
""" """
Bake port information into the pattern. Bake port information into the device.
This places a label at each port location on layer (3, 0) with text content This places a label at each port location on layer (3, 0) with text content
'name:ptype angle_deg' 'name:ptype angle_deg'
""" """
return ports2data.ports_to_data(pat, layer=(3, 0)) return port_utils.dev2pat(dev, layer=(3, 0))
def data_to_ports(lib: Mapping[str, Pattern], name: str, pat: Pattern) -> Pattern: def pat2dev(pat: Pattern) -> Device:
""" """
Scan the Pattern to determine port locations. Same port format as `ports_to_data` Scans the Pattern to determine port locations. Same format as `dev2pat`
""" """
return ports2data.data_to_ports(layers=[(3, 0)], library=lib, pattern=pat, name=name) return port_utils.pat2dev(pat, layers=[(3, 0)])
def perturbed_l3( def perturbed_l3(
lattice_constant: float, lattice_constant: float,
hole: str, hole: Pattern,
hole_lib: Mapping[str, Pattern], trench_dose: float = 1.0,
trench_layer: layer_t = (1, 0), trench_layer: layer_t = (1, 0),
shifts_a: Sequence[float] = (0.15, 0, 0.075), shifts_a: Sequence[float] = (0.15, 0, 0.075),
shifts_r: Sequence[float] = (1.0, 1.0, 1.0), shifts_r: Sequence[float] = (1.0, 1.0, 1.0),
xy_size: tuple[int, int] = (10, 10), xy_size: Tuple[int, int] = (10, 10),
perturbed_radius: float = 1.1, perturbed_radius: float = 1.1,
trench_width: float = 1200, trench_width: float = 1200,
) -> Pattern: ) -> Device:
""" """
Generate a `Pattern` representing a perturbed L3 cavity. Generate a `Device` representing a perturbed L3 cavity.
Args: Args:
lattice_constant: Distance between nearest neighbor holes lattice_constant: Distance between nearest neighbor holes
hole: name of a `Pattern` containing a single hole hole: `Pattern` object containing a single hole
hole_lib: Library which contains the `Pattern` object for hole. trench_dose: Dose for the trenches. Default 1.0. (Hole dose is 1.0.)
Necessary because we need to know how big it is...
trench_layer: Layer for the trenches, default `(1, 0)`. trench_layer: Layer for the trenches, default `(1, 0)`.
shifts_a: passed to `pcgen.l3_shift`; specifies lattice constant shifts_a: passed to `pcgen.l3_shift`; specifies lattice constant
(1 - multiplicative factor) for shifting holes adjacent to (1 - multiplicative factor) for shifting holes adjacent to
@@ -69,10 +66,8 @@ def perturbed_l3(
trench width: Width of the undercut trenches. Default 1200. trench width: Width of the undercut trenches. Default 1200.
Returns: Returns:
`Pattern` object representing the L3 design. `Device` object representing the L3 design.
""" """
print('Generating perturbed L3...')
# Get hole positions and radii # Get hole positions and radii
xyr = pcgen.l3_shift_perturbed_defect(mirror_dims=xy_size, xyr = pcgen.l3_shift_perturbed_defect(mirror_dims=xy_size,
perturbed_radius=perturbed_radius, perturbed_radius=perturbed_radius,
@@ -80,206 +75,188 @@
shifts_r=shifts_r) shifts_r=shifts_r)
# Build L3 cavity, using references to the provided hole pattern # Build L3 cavity, using references to the provided hole pattern
pat = Pattern() pat = Pattern(f'L3p-a{lattice_constant:g}rp{perturbed_radius:g}')
pat.refs[hole] += [ pat.subpatterns += [
Ref(scale=r, offset=(lattice_constant * x, SubPattern(hole, scale=r,
offset=(lattice_constant * x,
lattice_constant * y)) lattice_constant * y))
for x, y, r in xyr] for x, y, r in xyr]
# Add rectangular undercut aids # Add rectangular undercut aids
min_xy, max_xy = pat.get_bounds_nonempty(hole_lib) min_xy, max_xy = pat.get_bounds_nonempty()
trench_dx = max_xy[0] - min_xy[0] trench_dx = max_xy[0] - min_xy[0]
pat.shapes[trench_layer] += [ pat.shapes += [
Polygon.rect(ymin=max_xy[1], xmin=min_xy[0], lx=trench_dx, ly=trench_width), Polygon.rect(ymin=max_xy[1], xmin=min_xy[0], lx=trench_dx, ly=trench_width,
Polygon.rect(ymax=min_xy[1], xmin=min_xy[0], lx=trench_dx, ly=trench_width), layer=trench_layer, dose=trench_dose),
Polygon.rect(ymax=min_xy[1], xmin=min_xy[0], lx=trench_dx, ly=trench_width,
layer=trench_layer, dose=trench_dose),
] ]
# Ports are at outer extents of the device (with y=0) # Ports are at outer extents of the device (with y=0)
extent = lattice_constant * xy_size[0] extent = lattice_constant * xy_size[0]
pat.ports = dict( ports = {
input=Port((-extent, 0), rotation=0, ptype='pcwg'), 'input': Port((-extent, 0), rotation=0, ptype='pcwg'),
output=Port((extent, 0), rotation=pi, ptype='pcwg'), 'output': Port((extent, 0), rotation=pi, ptype='pcwg'),
) }
ports_to_data(pat) return Device(pat, ports)
return pat
def waveguide( def waveguide(
lattice_constant: float, lattice_constant: float,
hole: str, hole: Pattern,
length: int, length: int,
mirror_periods: int, mirror_periods: int,
) -> Pattern: ) -> Device:
""" """
Generate a `Pattern` representing a photonic crystal line-defect waveguide. Generate a `Device` representing a photonic crystal line-defect waveguide.
Args: Args:
lattice_constant: Distance between nearest neighbor holes lattice_constant: Distance between nearest neighbor holes
hole: name of a `Pattern` containing a single hole hole: `Pattern` object containing a single hole
length: Distance (number of mirror periods) between the input and output ports. length: Distance (number of mirror periods) between the input and output ports.
Ports are placed at lattice sites. Ports are placed at lattice sites.
mirror_periods: Number of hole rows on each side of the line defect mirror_periods: Number of hole rows on each side of the line defect
Returns: Returns:
`Pattern` object representing the waveguide. `Device` object representing the waveguide.
""" """
# Generate hole locations # Generate hole locations
xy = pcgen.waveguide(length=length, num_mirror=mirror_periods) xy = pcgen.waveguide(length=length, num_mirror=mirror_periods)
# Build the pattern # Build the pattern
pat = Pattern() pat = Pattern(f'_wg-a{lattice_constant:g}l{length}')
pat.refs[hole] += [ pat.subpatterns += [SubPattern(hole, offset=(lattice_constant * x,
Ref(offset=(lattice_constant * x,
lattice_constant * y)) lattice_constant * y))
for x, y in xy] for x, y in xy]
# Ports are at outer edges, with y=0 # Ports are at outer edges, with y=0
extent = lattice_constant * length / 2 extent = lattice_constant * length / 2
pat.ports = dict( ports = {
left=Port((-extent, 0), rotation=0, ptype='pcwg'), 'left': Port((-extent, 0), rotation=0, ptype='pcwg'),
right=Port((extent, 0), rotation=pi, ptype='pcwg'), 'right': Port((extent, 0), rotation=pi, ptype='pcwg'),
) }
return Device(pat, ports)
ports_to_data(pat)
return pat
def bend( def bend(
lattice_constant: float, lattice_constant: float,
hole: str, hole: Pattern,
mirror_periods: int, mirror_periods: int,
) -> Pattern: ) -> Device:
""" """
Generate a `Pattern` representing a 60-degree counterclockwise bend in a photonic crystal Generate a `Device` representing a 60-degree counterclockwise bend in a photonic crystal
line-defect waveguide. line-defect waveguide.
Args: Args:
lattice_constant: Distance between nearest neighbor holes lattice_constant: Distance between nearest neighbor holes
hole: name of a `Pattern` containing a single hole hole: `Pattern` object containing a single hole
mirror_periods: Minimum number of mirror periods on each side of the line defect. mirror_periods: Minimum number of mirror periods on each side of the line defect.
Returns: Returns:
`Pattern` object representing the waveguide bend. `Device` object representing the waveguide bend.
Ports are named 'left' (input) and 'right' (output). Ports are named 'left' (input) and 'right' (output).
""" """
# Generate hole locations # Generate hole locations
xy = pcgen.wgbend(num_mirror=mirror_periods) xy = pcgen.wgbend(num_mirror=mirror_periods)
# Build the pattern # Build the pattern
pat= Pattern() pat= Pattern(f'_wgbend-a{lattice_constant:g}l{mirror_periods}')
pat.refs[hole] += [ pat.subpatterns += [
Ref(offset=(lattice_constant * x, SubPattern(hole, offset=(lattice_constant * x,
lattice_constant * y)) lattice_constant * y))
for x, y in xy] for x, y in xy]
# Figure out port locations. # Figure out port locations.
extent = lattice_constant * mirror_periods extent = lattice_constant * mirror_periods
pat.ports = dict( ports = {
left=Port((-extent, 0), rotation=0, ptype='pcwg'), 'left': Port((-extent, 0), rotation=0, ptype='pcwg'),
right=Port((extent / 2, 'right': Port((extent / 2,
extent * numpy.sqrt(3) / 2), extent * numpy.sqrt(3) / 2),
rotation=pi * 4 / 3, ptype='pcwg'), rotation=pi * 4 / 3, ptype='pcwg'),
) }
ports_to_data(pat) return Device(pat, ports)
return pat
def y_splitter( def y_splitter(
lattice_constant: float, lattice_constant: float,
hole: str, hole: Pattern,
mirror_periods: int, mirror_periods: int,
) -> Pattern: ) -> Device:
""" """
Generate a `Pattern` representing a photonic crystal line-defect waveguide y-splitter. Generate a `Device` representing a photonic crystal line-defect waveguide y-splitter.
Args: Args:
lattice_constant: Distance between nearest neighbor holes lattice_constant: Distance between nearest neighbor holes
hole: name of a `Pattern` containing a single hole hole: `Pattern` object containing a single hole
mirror_periods: Minimum number of mirror periods on each side of the line defect. mirror_periods: Minimum number of mirror periods on each side of the line defect.
Returns: Returns:
`Pattern` object representing the y-splitter. `Device` object representing the y-splitter.
Ports are named 'in', 'top', and 'bottom'. Ports are named 'in', 'top', and 'bottom'.
""" """
# Generate hole locations # Generate hole locations
xy = pcgen.y_splitter(num_mirror=mirror_periods) xy = pcgen.y_splitter(num_mirror=mirror_periods)
# Build pattern # Build pattern
pat = Pattern() pat = Pattern(f'_wgsplit_half-a{lattice_constant:g}l{mirror_periods}')
pat.refs[hole] += [ pat.subpatterns += [
Ref(offset=(lattice_constant * x, SubPattern(hole, offset=(lattice_constant * x,
lattice_constant * y)) lattice_constant * y))
for x, y in xy] for x, y in xy]
# Determine port locations # Determine port locations
extent = lattice_constant * mirror_periods extent = lattice_constant * mirror_periods
pat.ports = { ports = {
'in': Port((-extent, 0), rotation=0, ptype='pcwg'), 'in': Port((-extent, 0), rotation=0, ptype='pcwg'),
'top': Port((extent / 2, extent * numpy.sqrt(3) / 2), rotation=pi * 4 / 3, ptype='pcwg'), 'top': Port((extent / 2, extent * numpy.sqrt(3) / 2), rotation=pi * 4 / 3, ptype='pcwg'),
'bot': Port((extent / 2, -extent * numpy.sqrt(3) / 2), rotation=pi * 2 / 3, ptype='pcwg'), 'bot': Port((extent / 2, -extent * numpy.sqrt(3) / 2), rotation=pi * 2 / 3, ptype='pcwg'),
} }
return Device(pat, ports)
ports_to_data(pat)
return pat
def main(interactive: bool = True) -> None: def main(interactive: bool = True):
# Generate some basic hole patterns # Generate some basic hole patterns
shape_lib = { smile = basic_shapes.smile(RADIUS)
'smile': basic_shapes.smile(RADIUS), hole = basic_shapes.hole(RADIUS)
'hole': basic_shapes.hole(RADIUS),
}
# Build some devices # Build some devices
a = LATTICE_CONSTANT a = LATTICE_CONSTANT
wg10 = waveguide(lattice_constant=a, hole=hole, length=10, mirror_periods=5).rename('wg10')
wg05 = waveguide(lattice_constant=a, hole=hole, length=5, mirror_periods=5).rename('wg05')
wg28 = waveguide(lattice_constant=a, hole=hole, length=28, mirror_periods=5).rename('wg28')
bend0 = bend(lattice_constant=a, hole=hole, mirror_periods=5).rename('bend0')
ysplit = y_splitter(lattice_constant=a, hole=hole, mirror_periods=5).rename('ysplit')
l3cav = perturbed_l3(lattice_constant=a, hole=smile, xy_size=(4, 10)).rename('l3cav') # uses smile :)
devices = {} # Autogenerate port labels so that GDS will also contain port data
devices['wg05'] = waveguide(lattice_constant=a, hole='hole', length=5, mirror_periods=5) for device in [wg10, wg05, wg28, l3cav, ysplit, bend0]:
devices['wg10'] = waveguide(lattice_constant=a, hole='hole', length=10, mirror_periods=5) dev2pat(device)
devices['wg28'] = waveguide(lattice_constant=a, hole='hole', length=28, mirror_periods=5)
devices['wg90'] = waveguide(lattice_constant=a, hole='hole', length=90, mirror_periods=5)
devices['bend0'] = bend(lattice_constant=a, hole='hole', mirror_periods=5)
devices['ysplit'] = y_splitter(lattice_constant=a, hole='hole', mirror_periods=5)
devices['l3cav'] = perturbed_l3(lattice_constant=a, hole='smile', hole_lib=shape_lib, xy_size=(4, 10)) # uses smile :)
# Turn our dict of devices into a Library.
# This provides some convenience functions in the future!
lib = Library(devices)
# #
# Build a circuit # Build a circuit
# #
# Create a `Builder`, and add the circuit to our library as "my_circuit". circ = Device(name='my_circuit', ports={})
circ = Builder(library=lib, name='my_circuit')
# Start by placing a waveguide. Call its ports "in" and "signal". # Start by placing a waveguide. Call its ports "in" and "signal".
circ.place('wg10', offset=(0, 0), port_map={'left': 'in', 'right': 'signal'}) circ.place(wg10, offset=(0, 0), port_map={'left': 'in', 'right': 'signal'})
# Extend the signal path by attaching the "left" port of a waveguide. # Extend the signal path by attaching the "left" port of a waveguide.
# Since there is only one other port ("right") on the waveguide we # Since there is only one other port ("right") on the waveguide we
# are attaching (wg10), it automatically inherits the name "signal". # are attaching (wg10), it automatically inherits the name "signal".
circ.plug('wg10', {'signal': 'left'}) circ.plug(wg10, {'signal': 'left'})
# We could have done the following instead:
# circ_pat = Pattern()
# lib['my_circuit'] = circ_pat
# circ_pat.place(lib.abstract('wg10'), ...)
# circ_pat.plug(lib.abstract('wg10'), ...)
# but `Builder` lets us omit some of the repetition of `lib.abstract(...)`, and uses similar
# syntax to `Pather` and `RenderPather`, which add wire/waveguide routing functionality.
# Attach a y-splitter to the signal path. # Attach a y-splitter to the signal path.
# Since the y-splitter has 3 ports total, we can't auto-inherit the # Since the y-splitter has 3 ports total, we can't auto-inherit the
# port name, so we have to specify what we want to name the unattached # port name, so we have to specify what we want to name the unattached
# ports. We can call them "signal1" and "signal2". # ports. We can call them "signal1" and "signal2".
circ.plug('ysplit', {'signal': 'in'}, {'top': 'signal1', 'bot': 'signal2'}) circ.plug(ysplit, {'signal': 'in'}, {'top': 'signal1', 'bot': 'signal2'})
# Add a waveguide to both signal ports, inheriting their names. # Add a waveguide to both signal ports, inheriting their names.
circ.plug('wg05', {'signal1': 'left'}) circ.plug(wg05, {'signal1': 'left'})
circ.plug('wg05', {'signal2': 'left'}) circ.plug(wg05, {'signal2': 'left'})
# Add a bend to both ports. # Add a bend to both ports.
# Our bend's ports "left" and "right" refer to the original counterclockwise # Our bend's ports "left" and "right" refer to the original counterclockwise
@@ -288,22 +265,22 @@ def main(interactive: bool = True) -> None:
# to "signal2" to bend counterclockwise. # to "signal2" to bend counterclockwise.
# We could also use `mirrored=(True, False)` to mirror one of the devices # We could also use `mirrored=(True, False)` to mirror one of the devices
# and then use same device port on both paths. # and then use same device port on both paths.
circ.plug('bend0', {'signal1': 'right'}) circ.plug(bend0, {'signal1': 'right'})
circ.plug('bend0', {'signal2': 'left'}) circ.plug(bend0, {'signal2': 'left'})
# We add some waveguides and a cavity to "signal1". # We add some waveguides and a cavity to "signal1".
circ.plug('wg10', {'signal1': 'left'}) circ.plug(wg10, {'signal1': 'left'})
circ.plug('l3cav', {'signal1': 'input'}) circ.plug(l3cav, {'signal1': 'input'})
circ.plug('wg10', {'signal1': 'left'}) circ.plug(wg10, {'signal1': 'left'})
# "signal2" just gets a single of equivalent length # "signal2" just gets a single of equivalent length
circ.plug('wg28', {'signal2': 'left'}) circ.plug(wg28, {'signal2': 'left'})
# Now we bend both waveguides back towards each other # Now we bend both waveguides back towards each other
circ.plug('bend0', {'signal1': 'right'}) circ.plug(bend0, {'signal1': 'right'})
circ.plug('bend0', {'signal2': 'left'}) circ.plug(bend0, {'signal2': 'left'})
circ.plug('wg05', {'signal1': 'left'}) circ.plug(wg05, {'signal1': 'left'})
circ.plug('wg05', {'signal2': 'left'}) circ.plug(wg05, {'signal2': 'left'})
# To join the waveguides, we attach a second y-junction. # To join the waveguides, we attach a second y-junction.
# We plug "signal1" into the "bot" port, and "signal2" into the "top" port. # We plug "signal1" into the "bot" port, and "signal2" into the "top" port.
@@ -311,34 +288,23 @@ def main(interactive: bool = True) -> None:
# This operation would raise an exception if the ports did not line up # This operation would raise an exception if the ports did not line up
# correctly (i.e. they required different rotations or translations of the # correctly (i.e. they required different rotations or translations of the
# y-junction device). # y-junction device).
circ.plug('ysplit', {'signal1': 'bot', 'signal2': 'top'}, {'in': 'signal_out'}) circ.plug(ysplit, {'signal1': 'bot', 'signal2': 'top'}, {'in': 'signal_out'})
# Finally, add some more waveguide to "signal_out". # Finally, add some more waveguide to "signal_out".
circ.plug('wg10', {'signal_out': 'left'}) circ.plug(wg10, {'signal_out': 'left'})
# We can also add text labels for our circuit's ports.
# They will appear at the uppermost hierarchy level, while the individual
# device ports will appear further down, in their respective cells.
ports_to_data(circ.pattern)
# Check if we forgot to include any patterns... ooops!
if dangling := lib.dangling_refs():
print('Warning: The following patterns are referenced, but not present in the'
f' library! {dangling}')
print('We\'ll solve this by merging in shape_lib, which contains those shapes...')
lib.add(shape_lib)
assert not lib.dangling_refs()
# We can visualize the design. Usually it's easier to just view the GDS. # We can visualize the design. Usually it's easier to just view the GDS.
if interactive: if interactive:
print('Visualizing... this step may be slow') print('Visualizing... this step may be slow')
circ.pattern.visualize(lib) circ.pattern.visualize()
#Write out to GDS, only keeping patterns referenced by our circuit (including itself) # We can also add text labels for our circuit's ports.
subtree = lib.subtree('my_circuit') # don't include wg90, which we don't use # They will appear at the uppermost hierarchy level, while the individual
check_valid_names(subtree.keys()) # device ports will appear further down, in their respective cells.
writefile(subtree, 'circuit.gds', **GDS_OPTS) dev2pat(circ)
# Write out to GDS
writefile(circ.pattern, 'circuit.gds', **GDS_OPTS)
if __name__ == '__main__': if __name__ == '__main__':

library.py

@@ -1,83 +1,81 @@
from typing import Any from typing import Tuple, Sequence, Callable
from collections.abc import Sequence, Callable
from pprint import pformat from pprint import pformat
import numpy import numpy
from numpy import pi from numpy import pi
from masque import Pattern, Builder, LazyLibrary from masque.builder import Device
from masque.library import Library, LibDeviceLibrary
from masque.file.gdsii import writefile, load_libraryfile from masque.file.gdsii import writefile, load_libraryfile
import pcgen import pcgen
import basic_shapes import basic_shapes
import devices import devices
from devices import ports_to_data, data_to_ports from devices import pat2dev, dev2pat
from basic_shapes import GDS_OPTS from basic_shapes import GDS_OPTS
def main() -> None: def main() -> None:
# Define a `LazyLibrary`, which provides lazy evaluation for generating # Define a `Library`-backed `DeviceLibrary`, which provides lazy evaluation
# patterns and lazy-loading of GDS contents. # for device generation code and lazy-loading of GDS contents.
lib = LazyLibrary() device_lib = LibDeviceLibrary()
# #
# Load some devices from a GDS file # Load some devices from a GDS file
# #
# Scan circuit.gds and prepare to lazy-load its contents # Scan circuit.gds and prepare to lazy-load its contents
gds_lib, _properties = load_libraryfile('circuit.gds', postprocess=data_to_ports) pattern_lib, _properties = load_libraryfile('circuit.gds', tag='mycirc01')
# Add it into the device library by providing a way to read port info # Add it into the device library by providing a way to read port info
# This maintains the lazy evaluation from above, so no patterns # This maintains the lazy evaluation from above, so no patterns
# are actually read yet. # are actually read yet.
lib.add(gds_lib) device_lib.add_library(pattern_lib, pat2dev=pat2dev)
print('Devices loaded from GDS into library:\n' + pformat(list(device_lib.keys())))
print('Patterns loaded from GDS into library:\n' + pformat(list(lib.keys())))
# #
# Add some new devices to the library, this time from python code rather than GDS # Add some new devices to the library, this time from python code rather than GDS
# #
lib['triangle'] = lambda: basic_shapes.triangle(devices.RADIUS) a = devices.LATTICE_CONSTANT
opts: dict[str, Any] = dict( tri = basic_shapes.triangle(devices.RADIUS)
lattice_constant = devices.LATTICE_CONSTANT,
hole = 'triangle', # Convenience function for adding devices
) # This is roughly equivalent to
# `device_lib[name] = lambda: dev2pat(fn())`
# but it also guarantees that the resulting pattern is named `name`.
def add(name: str, fn: Callable[[], Device]) -> None:
device_lib.add_device(name=name, fn=fn, dev2pat=dev2pat)
# Triangle-based variants. These are defined here, but they won't run until they're # Triangle-based variants. These are defined here, but they won't run until they're
# retrieved from the library. # retrieved from the library.
lib['tri_wg10'] = lambda: devices.waveguide(length=10, mirror_periods=5, **opts) add('tri_wg10', lambda: devices.waveguide(lattice_constant=a, hole=tri, length=10, mirror_periods=5))
lib['tri_wg05'] = lambda: devices.waveguide(length=5, mirror_periods=5, **opts) add('tri_wg05', lambda: devices.waveguide(lattice_constant=a, hole=tri, length=5, mirror_periods=5))
lib['tri_wg28'] = lambda: devices.waveguide(length=28, mirror_periods=5, **opts) add('tri_wg28', lambda: devices.waveguide(lattice_constant=a, hole=tri, length=28, mirror_periods=5))
lib['tri_bend0'] = lambda: devices.bend(mirror_periods=5, **opts) add('tri_bend0', lambda: devices.bend(lattice_constant=a, hole=tri, mirror_periods=5))
lib['tri_ysplit'] = lambda: devices.y_splitter(mirror_periods=5, **opts) add('tri_ysplit', lambda: devices.y_splitter(lattice_constant=a, hole=tri, mirror_periods=5))
lib['tri_l3cav'] = lambda: devices.perturbed_l3(xy_size=(4, 10), **opts, hole_lib=lib) add('tri_l3cav', lambda: devices.perturbed_l3(lattice_constant=a, hole=tri, xy_size=(4, 10)))
# #
# Build a mixed waveguide with an L3 cavity in the middle # Build a mixed waveguide with an L3 cavity in the middle
# #
# Immediately start building from an instance of the L3 cavity # Immediately start building from an instance of the L3 cavity
circ2 = Builder(library=lib, ports='tri_l3cav') circ2 = device_lib['tri_l3cav'].build('mixed_wg_cav')
# First way to get abstracts is `lib.abstract(name)` print(device_lib['wg10'].ports)
# We can use this syntax directly with `Pattern.plug()` and `Pattern.place()` as well as through `Builder`. circ2.plug(device_lib['wg10'], {'input': 'right'})
circ2.plug(lib.abstract('wg10'), {'input': 'right'}) circ2.plug(device_lib['wg10'], {'output': 'left'})
circ2.plug(device_lib['tri_wg10'], {'input': 'right'})
# Second way to get abstracts is to use an AbstractView circ2.plug(device_lib['tri_wg10'], {'output': 'left'})
# This also works directly with `Pattern.plug()` / `Pattern.place()`.
abstracts = lib.abstract_view()
circ2.plug(abstracts['wg10'], {'output': 'left'})
# Third way to specify an abstract works by automatically getting
# it from the library already within the Builder object.
# This wouldn't work if we only had a `Pattern` (not a `Builder`).
# Just pass the pattern name!
circ2.plug('tri_wg10', {'input': 'right'})
circ2.plug('tri_wg10', {'output': 'left'})
# Add the circuit to the device library. # Add the circuit to the device library.
lib['mixed_wg_cav'] = circ2.pattern # It has already been generated, so we can use `set_const` as a shorthand for
# `device_lib['mixed_wg_cav'] = lambda: circ2`
device_lib.set_const(circ2)
# #
@@ -85,26 +83,29 @@ def main() -> None:
# #
# We'll be designing against an existing device's interface... # We'll be designing against an existing device's interface...
circ3 = Builder.interface(source=circ2) circ3 = circ2.as_interface('loop_segment')
# ... that lets us continue from where we left off. # ... that lets us continue from where we left off.
circ3.plug('tri_bend0', {'input': 'right'}) circ3.plug(device_lib['tri_bend0'], {'input': 'right'})
circ3.plug('tri_bend0', {'input': 'left'}, mirrored=True) # mirror since no tri y-symmetry circ3.plug(device_lib['tri_bend0'], {'input': 'left'}, mirrored=(True, False)) # mirror since no tri y-symmetry
circ3.plug('tri_bend0', {'input': 'right'}) circ3.plug(device_lib['tri_bend0'], {'input': 'right'})
circ3.plug('bend0', {'output': 'left'}) circ3.plug(device_lib['bend0'], {'output': 'left'})
circ3.plug('bend0', {'output': 'left'}) circ3.plug(device_lib['bend0'], {'output': 'left'})
circ3.plug('bend0', {'output': 'left'}) circ3.plug(device_lib['bend0'], {'output': 'left'})
circ3.plug('tri_wg10', {'input': 'right'}) circ3.plug(device_lib['tri_wg10'], {'input': 'right'})
circ3.plug('tri_wg28', {'input': 'right'}) circ3.plug(device_lib['tri_wg28'], {'input': 'right'})
circ3.plug('tri_wg10', {'input': 'right', 'output': 'left'}) circ3.plug(device_lib['tri_wg10'], {'input': 'right', 'output': 'left'})
lib['loop_segment'] = circ3.pattern device_lib.set_const(circ3)
# #
# Write all devices into a GDS file # Write all devices into a GDS file
# #
print('Writing library to file...')
writefile(lib, 'library.gds', **GDS_OPTS) # This line could be slow, since it generates or loads many of the devices
# since they were not all accessed above.
all_device_pats = [dev.pattern for dev in device_lib.values()]
writefile(all_device_pats, 'library.gds', **GDS_OPTS)
if __name__ == '__main__': if __name__ == '__main__':
@@ -115,21 +116,22 @@ if __name__ == '__main__':
#class prout: #class prout:
# def place( # def place(
# self, # self,
# other: Pattern, # other: Device,
# label_layer: layer_t = 'WATLAYER', # label_layer: layer_t = 'WATLAYER',
# *, # *,
# port_map: Dict[str, str | None] | None = None, # port_map: Optional[Dict[str, Optional[str]]] = None,
# **kwargs, # **kwargs,
# ) -> 'prout': # ) -> 'prout':
# #
# Pattern.place(self, other, port_map=port_map, **kwargs) # Device.place(self, other, port_map=port_map, **kwargs)
# name: str | None # name: Optional[str]
# for name in other.ports: # for name in other.ports:
# if port_map: # if port_map:
# assert(name is not None) # assert(name is not None)
# name = port_map.get(name, name) # name = port_map.get(name, name)
# if name is None: # if name is None:
# continue # continue
# self.pattern.label(string=name, offset=self.ports[name].offset, layer=label_layer) # self.pattern.labels += [
# Label(string=name, offset=self.ports[name].offset, layer=layer)]
# return self # return self
# #

View File

@ -1,277 +0,0 @@
"""
Manual wire routing tutorial: Pather and BasicTool
"""
from collections.abc import Callable
from numpy import pi
from masque import Pather, RenderPather, Library, Pattern, Port, layer_t, map_layers
from masque.builder.tools import BasicTool, PathTool
from masque.file.gdsii import writefile
from basic_shapes import GDS_OPTS
#
# Define some basic wire widths, in nanometers
# M2 is the top metal; M1 is below it and connected with vias on V1
#
M1_WIDTH = 1000
V1_WIDTH = 500
M2_WIDTH = 4000
#
# First, we can define some functions for generating our wire geometry
#
def make_pad() -> Pattern:
"""
Create a pattern with a single rectangle of M2, with a single port on the bottom
Every pad will be an instance of the same pattern, so we will only call this function once.
"""
pat = Pattern()
pat.rect(layer='M2', xctr=0, yctr=0, lx=3 * M2_WIDTH, ly=4 * M2_WIDTH)
pat.ports['wire_port'] = Port((0, -2 * M2_WIDTH), rotation=pi / 2, ptype='m2wire')
return pat
def make_via(
layer_top: layer_t,
layer_via: layer_t,
layer_bot: layer_t,
width_top: float,
width_via: float,
width_bot: float,
ptype_top: str,
ptype_bot: str,
) -> Pattern:
"""
Generate three concentric squares, on the provided layers
(`layer_top`, `layer_via`, `layer_bot`) and with the provided widths
(`width_top`, `width_via`, `width_bot`).
Two ports are added, with the provided ptypes (`ptype_top`, `ptype_bot`).
They are placed at the left edge of the top layer and right edge of the
bottom layer, respectively.
We only have one via type, so we will only call this function once.
"""
pat = Pattern()
pat.rect(layer=layer_via, xctr=0, yctr=0, lx=width_via, ly=width_via)
pat.rect(layer=layer_bot, xctr=0, yctr=0, lx=width_bot, ly=width_bot)
pat.rect(layer=layer_top, xctr=0, yctr=0, lx=width_top, ly=width_top)
pat.ports = {
'top': Port(offset=(-width_top / 2, 0), rotation=0, ptype=ptype_top),
'bottom': Port(offset=(width_bot / 2, 0), rotation=pi, ptype=ptype_bot),
}
return pat
def make_bend(layer: layer_t, width: float, ptype: str) -> Pattern:
"""
Generate a triangular wire, with ports at the left (input) and bottom (output) edges.
This is effectively a clockwise wire bend.
Every bend will be the same, so we only need to call this twice (once each for M1 and M2).
We could call it additional times for different wire widths or bend types (e.g. squares).
"""
pat = Pattern()
pat.polygon(layer=layer, vertices=[(0, -width / 2), (0, width / 2), (width, -width / 2)])
pat.ports = {
'input': Port(offset=(0, 0), rotation=0, ptype=ptype),
'output': Port(offset=(width / 2, -width / 2), rotation=pi / 2, ptype=ptype),
}
return pat
def make_straight_wire(layer: layer_t, width: float, ptype: str, length: float) -> Pattern:
"""
Generate a straight wire with ports at either end (x=0 and x=length).
Every wire will be single-use, so we'll need to create lots of (mostly unique)
`Pattern`s, and this function will get called very often.
"""
pat = Pattern()
pat.rect(layer=layer, xmin=0, xmax=length, yctr=0, ly=width)
pat.ports = {
'input': Port(offset=(0, 0), rotation=0, ptype=ptype),
'output': Port(offset=(length, 0), rotation=pi, ptype=ptype),
}
return pat
def map_layer(layer: layer_t) -> layer_t:
"""
Map from string layer names to GDS layer numbers
"""
layer_mapping = {
'M1': (10, 0),
'M2': (20, 0),
'V1': (30, 0),
}
return layer_mapping.get(layer, layer)
#
# Now we can start building up our library (collection of static cells) and pathing tools.
#
# If any of the operations below are confusing, you can cross-reference against the `RenderPather`
# tutorial, which handles some things more explicitly (e.g. via placement) and simplifies others
# (e.g. geometry definition).
#
def main() -> None:
# Build some patterns (static cells) using the above functions and store them in a library
library = Library()
library['pad'] = make_pad()
library['m1_bend'] = make_bend(layer='M1', ptype='m1wire', width=M1_WIDTH)
library['m2_bend'] = make_bend(layer='M2', ptype='m2wire', width=M2_WIDTH)
library['v1_via'] = make_via(
layer_top='M2',
layer_via='V1',
layer_bot='M1',
width_top=M2_WIDTH,
width_via=V1_WIDTH,
width_bot=M1_WIDTH,
ptype_bot='m1wire',
ptype_top='m2wire',
)
#
# Now, define two tools.
# M1_tool will route on M1, using wires with M1_WIDTH
# M2_tool will route on M2, using wires with M2_WIDTH
# Both tools are able to automatically transition from the other wire type (with a via)
#
# Note that while we use BasicTool for this tutorial, you can define your own `Tool`
# with arbitrary logic inside -- e.g. with single-use bends, complex transition rules,
# transmission line geometry, or other features.
#
M1_tool = BasicTool(
straight = (
# First, we need a function which takes in a length and spits out an M1 wire
lambda length: make_straight_wire(layer='M1', ptype='m1wire', width=M1_WIDTH, length=length),
'input', # When we get a pattern from make_straight_wire, use the port named 'input' as the input
'output', # and use the port named 'output' as the output
),
bend = (
library.abstract('m1_bend'), # When we need a bend, we'll reference the pattern we generated earlier
'input', # To orient it clockwise, use the port named 'input' as the input
'output', # and 'output' as the output
),
transitions = { # We can automate transitions for different (normally incompatible) port types
'm2wire': ( # For example, when we're attaching to a port with type 'm2wire'
library.abstract('v1_via'), # we can place a V1 via
'top', # using the port named 'top' as the input (i.e. the M2 side of the via)
'bottom', # and using the port named 'bottom' as the output
),
},
default_out_ptype = 'm1wire', # Unless otherwise requested, we'll default to trying to stay on M1
)
M2_tool = BasicTool(
straight = (
# Again, we use make_straight_wire, but this time we set parameters for M2
lambda length: make_straight_wire(layer='M2', ptype='m2wire', width=M2_WIDTH, length=length),
'input',
'output',
),
bend = (
library.abstract('m2_bend'), # and we use an M2 bend
'input',
'output',
),
transitions = {
'm1wire': (
library.abstract('v1_via'), # We still use the same via,
'bottom', # but the input port is now 'bottom'
'top', # and the output port is now 'top'
),
},
default_out_ptype = 'm2wire', # We default to trying to stay on M2
)
#
# Create a new pather which writes to `library` and uses `M2_tool` as its default tool.
# Then, place some pads and start routing wires!
#
pather = Pather(library, tools=M2_tool)
# Place two pads, and define their ports as 'VCC' and 'GND'
pather.place('pad', offset=(18_000, 30_000), port_map={'wire_port': 'VCC'})
pather.place('pad', offset=(18_000, 60_000), port_map={'wire_port': 'GND'})
# Add some labels to make the pads easier to distinguish
pather.pattern.label(layer='M2', string='VCC', offset=(18e3, 30e3))
pather.pattern.label(layer='M2', string='GND', offset=(18e3, 60e3))
# Path VCC forward (in this case south) and turn clockwise 90 degrees (ccw=False)
# The total distance forward (including the bend's forward component) must be 6um
pather.path('VCC', ccw=False, length=6_000)
# Now path VCC to x=0. This time, don't include any bend (ccw=None).
# Note that if we tried y=0 here, we would get an error since the VCC port is facing in the x-direction.
pather.path_to('VCC', ccw=None, x=0)
# Path GND forward by 5um, turning clockwise 90 degrees.
# This time we use shorthand (bool(0) == False) and omit the parameter labels
# Note that although ccw=0 is equivalent to ccw=False, ccw=None is not!
pather.path('GND', 0, 5_000)
# This time, path GND until it matches the current x-coordinate of VCC. Don't place a bend.
pather.path_to('GND', None, x=pather['VCC'].offset[0])
# Now, start using M1_tool for GND.
# Since we have defined an M2-to-M1 transition in M1_tool, we don't need to place one ourselves.
# If we wanted to place our via manually, we could add `pather.plug('v1_via', {'GND': 'top'})` here
# and achieve the same result without having to define any transitions in M1_tool.
# Note that even though we have changed the tool used for GND, the via doesn't get placed until
# the next time we draw a path on GND (the pather.mpath() statement below).
pather.retool(M1_tool, keys=['GND'])
# Bundle together GND and VCC, and path the bundle forward and counterclockwise.
# Pick the distance so that the leading/outermost wire (in this case GND) ends up at x=-10_000.
# Other wires in the bundle (in this case VCC) should be spaced at 5_000 pitch (so VCC ends up at x=-5_000)
#
# Since we recently retooled GND, its path starts with a via down to M1 (included in the distance
# calculation), and its straight segment and bend will be drawn using M1 while VCC's are drawn with M2.
pather.mpath(['GND', 'VCC'], ccw=True, xmax=-10_000, spacing=5_000)
# Now use M1_tool as the default tool for all ports/signals.
# Since VCC does not have an explicitly assigned tool, it will now transition down to M1.
pather.retool(M1_tool)
# Path the GND + VCC bundle forward and counterclockwise by 90 degrees.
# The total extension (travel distance along the forward direction) for the longest segment (in
# this case the segment being added to GND) should be exactly 50um.
# After turning, the wire pitch should be reduced to only 1.2um.
pather.mpath(['GND', 'VCC'], ccw=True, emax=50_000, spacing=1_200)
# Make a U-turn with the bundle and expand back out to 4.5um wire pitch.
# Here, emin specifies the travel distance for the shortest segment. For the first mpath() call below
# that is VCC's segment, and for the second call it is GND's; the relative lengths of the
# segments depend on their starting positions and their ordering within the bundle.
pather.mpath(['GND', 'VCC'], ccw=False, emin=1_000, spacing=1_200)
pather.mpath(['GND', 'VCC'], ccw=False, emin=2_000, spacing=4_500)
# Now, set the default tool back to M2_tool. Note that GND remains on M1 since it has been
# explicitly assigned a tool. We could `del pather.tools['GND']` to force it to use the default.
pather.retool(M2_tool)
# Now path both ports to x=-28_000.
# When ccw is not None, xmin constrains the trailing/innermost port to stop at the target x coordinate.
# However, with ccw=None, all ports stop at the same coordinate, and so specifying xmin= or xmax= is
# equivalent.
pather.mpath(['GND', 'VCC'], None, xmin=-28_000)
# Further extend VCC out to x=-50_000, and specify that we would like to get an output on M1.
# This results in a via at the end of the wire (instead of having one at the start like we got
# when using pather.retool()).
pather.path_to('VCC', None, -50_000, out_ptype='m1wire')
# Save the pather's pattern into our library
library['Pather_and_BasicTool'] = pather.pattern
# Convert from text-based layers to numeric layers for GDS, and output the file
library.map_layers(map_layer)
writefile(library, 'pather.gds', **GDS_OPTS)
if __name__ == '__main__':
main()
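To recap the retooling behavior above, the relevant calls boil down to the following (a condensed sketch using the objects defined in `main()`; it adds no new steps):
```python
pather = Pather(library, tools=M2_tool)   # default tool for all signals: M2
pather.retool(M1_tool, keys=['GND'])      # GND switches to M1; its via appears on its next path segment
pather.retool(M1_tool)                    # change the default; VCC will also transition down to M1
pather.retool(M2_tool)                    # GND stays on M1, since it was explicitly assigned a tool...
del pather.tools['GND']                   # ...until its entry is deleted, making it follow the default again
```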

View File

@ -2,7 +2,7 @@
Routines for creating normalized 2D lattices and common photonic crystal Routines for creating normalized 2D lattices and common photonic crystal
cavity designs. cavity designs.
""" """
from collections.abc import Sequence from typing import Sequence, Tuple
import numpy import numpy
from numpy.typing import ArrayLike, NDArray from numpy.typing import ArrayLike, NDArray
@ -29,11 +29,8 @@ def triangular_lattice(
Returns: Returns:
`[[x0, y0], [x1, y1], ...]` denoting lattice sites. `[[x0, y0], [x1, y1], ...]` denoting lattice sites.
""" """
sx, sy = numpy.meshgrid( sx, sy = numpy.meshgrid(numpy.arange(dims[0], dtype=float),
numpy.arange(dims[0], dtype=float), numpy.arange(dims[1], dtype=float), indexing='ij')
numpy.arange(dims[1], dtype=float),
indexing='ij',
)
sx[sy % 2 == 1] += 0.5 sx[sy % 2 == 1] += 0.5
sy *= numpy.sqrt(3) / 2 sy *= numpy.sqrt(3) / 2
@ -233,8 +230,8 @@ def ln_shift_defect(
# Shift holes # Shift holes
# Expand shifts as necessary # Expand shifts as necessary
tmp_a = numpy.asarray(shifts_a) tmp_a = numpy.array(shifts_a)
tmp_r = numpy.asarray(shifts_r) tmp_r = numpy.array(shifts_r)
n_shifted = max(tmp_a.size, tmp_r.size) n_shifted = max(tmp_a.size, tmp_r.size)
shifts_a = numpy.ones(n_shifted) shifts_a = numpy.ones(n_shifted)

View File

@ -1,96 +0,0 @@
"""
Manual wire routing tutorial: RenderPather and PathTool
"""
from collections.abc import Callable
from masque import RenderPather, Library, Pattern, Port, layer_t, map_layers
from masque.builder.tools import PathTool
from masque.file.gdsii import writefile
from basic_shapes import GDS_OPTS
from pather import M1_WIDTH, V1_WIDTH, M2_WIDTH, map_layer, make_pad, make_via
def main() -> None:
#
# To illustrate the advantages of using `RenderPather`, we use `PathTool` instead
# of `BasicTool`. `PathTool` lacks some sophistication (e.g. no automatic transitions)
# but when used with `RenderPather`, it can consolidate multiple routing steps into
# a single `Path` shape.
#
# We'll try to nearly replicate the layout from the `Pather` tutorial; see `pather.py`
# for more detailed descriptions of the individual pathing steps.
#
# First, we make a library and generate some of the same patterns as in the pather tutorial
library = Library()
library['pad'] = make_pad()
library['v1_via'] = make_via(
layer_top='M2',
layer_via='V1',
layer_bot='M1',
width_top=M2_WIDTH,
width_via=V1_WIDTH,
width_bot=M1_WIDTH,
ptype_bot='m1wire',
ptype_top='m2wire',
)
# `PathTool` is more limited than `BasicTool`. It only generates one type of shape
# (`Path`), so it only needs to know what layer to draw on, what width to draw with,
# and what port type to present.
M1_ptool = PathTool(layer='M1', width=M1_WIDTH, ptype='m1wire')
M2_ptool = PathTool(layer='M2', width=M2_WIDTH, ptype='m2wire')
rpather = RenderPather(tools=M2_ptool, library=library)
# As in the pather tutorial, we make some pads and labels...
rpather.place('pad', offset=(18_000, 30_000), port_map={'wire_port': 'VCC'})
rpather.place('pad', offset=(18_000, 60_000), port_map={'wire_port': 'GND'})
rpather.pattern.label(layer='M2', string='VCC', offset=(18e3, 30e3))
rpather.pattern.label(layer='M2', string='GND', offset=(18e3, 60e3))
# ...and start routing the signals.
rpather.path('VCC', ccw=False, length=6_000)
rpather.path_to('VCC', ccw=None, x=0)
rpather.path('GND', 0, 5_000)
rpather.path_to('GND', None, x=rpather['VCC'].offset[0])
# `PathTool` doesn't know how to transition between metal layers, so we have to
# `plug` the via into the GND wire ourselves.
rpather.plug('v1_via', {'GND': 'top'})
rpather.retool(M1_ptool, keys=['GND'])
rpather.mpath(['GND', 'VCC'], ccw=True, xmax=-10_000, spacing=5_000)
# Same thing on the VCC wire when it goes down to M1.
rpather.plug('v1_via', {'VCC': 'top'})
rpather.retool(M1_ptool)
rpather.mpath(['GND', 'VCC'], ccw=True, emax=50_000, spacing=1_200)
rpather.mpath(['GND', 'VCC'], ccw=False, emin=1_000, spacing=1_200)
rpather.mpath(['GND', 'VCC'], ccw=False, emin=2_000, spacing=4_500)
# And again when VCC goes back up to M2.
rpather.plug('v1_via', {'VCC': 'bottom'})
rpather.retool(M2_ptool)
rpather.mpath(['GND', 'VCC'], None, xmin=-28_000)
# Finally, since PathTool has no conception of transitions, we can't
# just ask it to transition to an 'm1wire' port at the end of the final VCC segment.
# Instead, we have to calculate the via size ourselves, and adjust the final position
# to account for it.
via_size = abs(
library['v1_via'].ports['top'].offset[0]
- library['v1_via'].ports['bottom'].offset[0]
)
rpather.path_to('VCC', None, -50_000 + via_size)
rpather.plug('v1_via', {'VCC': 'top'})
rpather.render()
library['RenderPather_and_PathTool'] = rpather.pattern
# Convert from text-based layers to numeric layers for GDS, and output the file
library.map_layers(map_layer)
writefile(library, 'render_pather.gds', **GDS_OPTS)
if __name__ == '__main__':
main()
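The key difference from `Pather` is that no geometry is generated until `render()` is called. A condensed sketch of that flow (assuming `library` and `M2_ptool` as defined in `main()`; the 'SIG' and 'render_demo' names are only for illustration):
```python
rpather = RenderPather(tools=M2_ptool, library=library)
rpather.place('pad', offset=(0, 0), port_map={'wire_port': 'SIG'})
rpather.path('SIG', ccw=False, length=10_000)   # recorded, but not drawn yet
rpather.render()                                # geometry is generated here, as consolidated Path shapes
library['render_demo'] = rpather.pattern
```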

View File

@ -1,16 +1,16 @@
""" """
masque 2D CAD library masque 2D CAD library
masque is an attempt to make a relatively compact library for designing lithography masque is an attempt to make a relatively small library for designing lithography
masks. The general idea is to implement something resembling the GDSII and OASIS file-formats, masks. The general idea is to implement something resembling the GDSII and OASIS file-formats,
but with some additional vectorized element types (eg. ellipses, not just polygons), and the but with some additional vectorized element types (eg. ellipses, not just polygons), better
ability to interface with multiple file formats. support for E-beam doses, and the ability to interface with multiple file formats.
`Pattern` is a basic object containing a 2D lithography mask, composed of a list of `Shape` `Pattern` is a basic object containing a 2D lithography mask, composed of a list of `Shape`
objects, a list of `Label` objects, and a list of references to other `Patterns` (using objects, a list of `Label` objects, and a list of references to other `Patterns` (using
`Ref`). `SubPattern`).
`Ref` provides basic support for nesting `Pattern` objects within each other, by adding `SubPattern` provides basic support for nesting `Pattern` objects within each other, by adding
offset, rotation, scaling, repetition, and other such properties to a Pattern reference. offset, rotation, scaling, repetition, and other such properties to a Pattern reference.
Note that the methods for these classes try to avoid copying wherever possible, so unless Note that the methods for these classes try to avoid copying wherever possible, so unless
@ -20,73 +20,24 @@
NOTES ON INTERNALS NOTES ON INTERNALS
========================== ==========================
- Many of `masque`'s classes make use of `__slots__` to make them faster / smaller. - Many of `masque`'s classes make use of `__slots__` to make them faster / smaller.
Since `__slots__` doesn't play well with multiple inheritance, often they are left Since `__slots__` doesn't play well with multiple inheritance, the `masque.utils.AutoSlots`
empty for superclasses and it is the subclass's responsibility to set them correctly. metaclass is used to auto-generate slots based on superclass type annotations.
- File I/O submodules are not imported by `masque.file` to avoid creating hard dependencies - File I/O submodules are imported by `masque.file` to avoid creating hard dependencies on
on external file-format reader/writers external file-format reader/writers
- Try to accept the broadest-possible inputs: e.g., don't demand an `ILibraryView` if you - Pattern locking/unlocking is quite slow for large hierarchies.
can accept a `Mapping[str, Pattern]` and wrap it in a `LibraryView` internally.
""" """
from .utils import ( from .error import PatternError, PatternLockedError
layer_t as layer_t, from .shapes import Shape
annotations_t as annotations_t, from .label import Label
SupportsBool as SupportsBool, from .subpattern import SubPattern
) from .pattern import Pattern
from .error import ( from .utils import layer_t, annotations_t
MasqueError as MasqueError, from .library import Library, DeviceLibrary
PatternError as PatternError,
LibraryError as LibraryError,
BuildError as BuildError,
)
from .shapes import (
Shape as Shape,
Polygon as Polygon,
Path as Path,
Circle as Circle,
Arc as Arc,
Ellipse as Ellipse,
)
from .label import Label as Label
from .ref import Ref as Ref
from .pattern import (
Pattern as Pattern,
map_layers as map_layers,
map_targets as map_targets,
chain_elements as chain_elements,
)
from .library import (
ILibraryView as ILibraryView,
ILibrary as ILibrary,
LibraryView as LibraryView,
Library as Library,
LazyLibrary as LazyLibrary,
AbstractView as AbstractView,
TreeView as TreeView,
Tree as Tree,
)
from .ports import (
Port as Port,
PortList as PortList,
)
from .abstract import Abstract as Abstract
from .builder import (
Builder as Builder,
Tool as Tool,
Pather as Pather,
RenderPather as RenderPather,
RenderStep as RenderStep,
BasicTool as BasicTool,
PathTool as PathTool,
)
from .utils import (
ports2data as ports2data,
oneshot as oneshot,
)
__author__ = 'Jan Petykiewicz' __author__ = 'Jan Petykiewicz'
__version__ = '3.2' __version__ = '2.7'
version = __version__ # legacy version = __version__ # legacy
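For reference, code written against the new top-level API (as in the tutorials above) imports these names directly, e.g.:
```python
from masque import Library, Pattern, Port, Builder, Pather, RenderPather
from masque.builder.tools import BasicTool, PathTool
from masque.file.gdsii import writefile
```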

View File

@ -1,217 +0,0 @@
from typing import Self
import copy
import logging
import numpy
from numpy.typing import ArrayLike
from .ref import Ref
from .ports import PortList, Port
from .utils import rotation_matrix_2d
#if TYPE_CHECKING:
# from .builder import Builder, Tool
# from .library import ILibrary
logger = logging.getLogger(__name__)
class Abstract(PortList):
"""
An `Abstract` is a container for a name and associated ports.
When snapping a sub-component into an existing pattern, only the name (which is not
stored in the `Pattern` object itself) and the port info are needed, not the geometry.
"""
__slots__ = ('name', '_ports')
name: str
""" Name of the pattern this device references """
_ports: dict[str, Port]
""" Uniquely-named ports which can be used to instances together"""
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, value: dict[str, Port]) -> None:
self._ports = value
def __init__(
self,
name: str,
ports: dict[str, Port],
) -> None:
self.name = name
self.ports = copy.deepcopy(ports)
# TODO do we want to store a Ref instead of just a name? then we can translate/rotate/mirror...
def __repr__(self) -> str:
s = f'<Abstract {self.name} ['
for name, port in self.ports.items():
s += f'\n\t{name}: {port}'
s += ']>'
return s
def translate_ports(self, offset: ArrayLike) -> Self:
"""
Translates all ports by the given offset.
Args:
offset: (x, y) to translate by
Returns:
self
"""
for port in self.ports.values():
port.translate(offset)
return self
def scale_by(self, c: float) -> Self:
"""
Scale this Abstract by the given value
(all port offsets are scaled)
Args:
c: factor to scale by
Returns:
self
"""
for port in self.ports.values():
port.offset *= c
return self
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self:
"""
Rotate the Abstract around a given location.
Args:
pivot: (x, y) location to rotate around
rotation: Angle to rotate by (counter-clockwise, radians)
Returns:
self
"""
pivot = numpy.asarray(pivot, dtype=float)
self.translate_ports(-pivot)
self.rotate_ports(rotation)
self.rotate_port_offsets(rotation)
self.translate_ports(+pivot)
return self
def rotate_port_offsets(self, rotation: float) -> Self:
"""
Rotate the offsets of all ports around (0, 0)
Args:
rotation: Angle to rotate by (counter-clockwise, radians)
Returns:
self
"""
for port in self.ports.values():
port.offset = rotation_matrix_2d(rotation) @ port.offset
return self
def rotate_ports(self, rotation: float) -> Self:
"""
Rotate each port around its offset (i.e. in place)
Args:
rotation: Angle to rotate by (counter-clockwise, radians)
Returns:
self
"""
for port in self.ports.values():
port.rotate(rotation)
return self
def mirror_port_offsets(self, across_axis: int = 0) -> Self:
"""
Mirror the offsets of all ports across an axis
Args:
across_axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
Returns:
self
"""
for port in self.ports.values():
port.offset[across_axis - 1] *= -1
return self
def mirror_ports(self, across_axis: int = 0) -> Self:
"""
Mirror each port's rotation across an axis, relative to its
offset
Args:
across_axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
Returns:
self
"""
for port in self.ports.values():
port.mirror(across_axis)
return self
def mirror(self, across_axis: int = 0) -> Self:
"""
Mirror the Abstract (its ports and their offsets) across an axis
Args:
across_axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
Returns:
self
"""
self.mirror_ports(across_axis)
self.mirror_port_offsets(across_axis)
return self
def apply_ref_transform(self, ref: Ref) -> Self:
"""
Apply the transform from a `Ref` to the ports of this `Abstract`.
This changes the port locations to where they would be in the Ref's parent pattern.
Args:
ref: The ref whose transform should be applied.
Returns:
self
"""
if ref.mirrored:
self.mirror()
self.rotate_ports(ref.rotation)
self.rotate_port_offsets(ref.rotation)
self.translate_ports(ref.offset)
return self
def undo_ref_transform(self, ref: Ref) -> Self:
"""
Apply the inverse transform from a `Ref` to the ports of this `Abstract`.
This changes the port locations to where they would be in the Ref's target (from the parent).
Args:
ref: The ref whose (inverse) transform should be applied.
Returns:
self
# TODO test undo_ref_transform
"""
self.translate_ports(-ref.offset)
self.rotate_port_offsets(-ref.rotation)
self.rotate_ports(-ref.rotation)
if ref.mirrored:
self.mirror(0)
return self
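In practice an `Abstract` is usually obtained from a library rather than constructed directly. A minimal sketch (assuming `lib` is a `Library` containing a pattern named 'wg10'):
```python
from numpy import pi

abs_wg = lib.abstract('wg10')    # name plus a deep copy of the pattern's ports; no geometry
print(abs_wg.name, abs_wg.ports)
abs_wg.rotate_around((0, 0), pi / 2).translate_ports((100, 0))   # transforms only the port info
```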

View File

@ -1,10 +1,3 @@
from .builder import Builder as Builder from .devices import Port, Device
from .pather import Pather as Pather from .utils import ell
from .renderpather import RenderPather as RenderPather from .tools import Tool
from .utils import ell as ell
from .tools import (
Tool as Tool,
RenderStep as RenderStep,
BasicTool as BasicTool,
PathTool as PathTool,
)

View File

@ -1,436 +0,0 @@
"""
Simplified Pattern assembly (`Builder`)
"""
from typing import Self
from collections.abc import Sequence, Mapping
import copy
import logging
from functools import wraps
from numpy.typing import ArrayLike
from ..pattern import Pattern
from ..library import ILibrary, TreeView
from ..error import BuildError
from ..ports import PortList, Port
from ..abstract import Abstract
logger = logging.getLogger(__name__)
class Builder(PortList):
"""
A `Builder` is a helper object used for snapping together multiple
lower-level patterns at their `Port`s.
The `Builder` mostly just holds context, in the form of a `Library`,
in addition to its underlying pattern. This simplifies some calls
to `plug` and `place`, by making the library implicit.
`Builder` can also be `set_dead()`, at which point further calls to `plug()`
and `place()` are ignored (intended for debugging).
Examples: Creating a Builder
===========================
- `Builder(library, ports={'A': port_a, 'C': port_c}, name='mypat')` makes
an empty pattern, adds the given ports, and places it into `library`
under the name `'mypat'`.
- `Builder(library)` makes an empty pattern with no ports. The pattern
is not added into `library` and must later be added with e.g.
`library['mypat'] = builder.pattern`
- `Builder(library, pattern=pattern, name='mypat')` uses an existing
pattern (including its ports) and sets `library['mypat'] = pattern`.
- `Builder.interface(other_pat, port_map=['A', 'B'], library=library)`
makes a new (empty) pattern, copies over ports 'A' and 'B' from
`other_pat`, and creates additional ports 'in_A' and 'in_B' facing
in the opposite directions. This can be used to build a device which
can plug into `other_pat` (using the 'in_*' ports) but which does not
itself include `other_pat` as a subcomponent.
- `Builder.interface(other_builder, ...)` does the same thing as
`Builder.interface(other_builder.pattern, ...)` but also uses
`other_builder.library` as its library by default.
Examples: Adding to a pattern
=============================
- `my_device.plug(subdevice, {'A': 'C', 'B': 'B'}, map_out={'D': 'myport'})`
instantiates `subdevice` into `my_device`, plugging ports 'A' and 'B'
of `my_device` into ports 'C' and 'B' of `subdevice`. The connected ports
are removed and any unconnected ports from `subdevice` are added to
`my_device`. Port 'D' of `subdevice` (unconnected) is renamed to 'myport'.
- `my_device.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport'
of `my_device`. If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out`
argument is provided, and the `inherit_name` argument is not explicitly
set to `False`, the unconnected port of `wire` is automatically renamed to
'myport'. This allows easy extension of existing ports without changing
their names or having to provide `map_out` each time `plug` is called.
- `my_device.place(pad, offset=(10, 10), rotation=pi / 2, port_map={'A': 'gnd'})`
instantiates `pad` at the specified (x, y) offset and with the specified
rotation, adding its ports to those of `my_device`. Port 'A' of `pad` is
renamed to 'gnd' so that further routing can use this signal or net name
rather than the port name on the original `pad` device.
"""
__slots__ = ('pattern', 'library', '_dead')
pattern: Pattern
""" Layout of this device """
library: ILibrary
"""
Library from which patterns should be referenced
"""
_dead: bool
""" If True, plug()/place() are skipped (for debugging)"""
@property
def ports(self) -> dict[str, Port]:
return self.pattern.ports
@ports.setter
def ports(self, value: dict[str, Port]) -> None:
self.pattern.ports = value
def __init__(
self,
library: ILibrary,
*,
pattern: Pattern | None = None,
ports: str | Mapping[str, Port] | None = None,
name: str | None = None,
) -> None:
"""
Args:
library: The library from which referenced patterns will be taken
pattern: The pattern which will be modified by subsequent operations.
If `None` (default), a new pattern is created.
ports: Allows specifying the initial set of ports, if `pattern` does
not already have any ports (or is not provided). May be a string,
in which case it is interpreted as a name in `library`.
Default `None` (no ports).
name: If specified, `library[name]` is set to `self.pattern`.
"""
self._dead = False
self.library = library
if pattern is not None:
self.pattern = pattern
else:
self.pattern = Pattern()
if ports is not None:
if self.pattern.ports:
raise BuildError('Ports supplied for pattern with pre-existing ports!')
if isinstance(ports, str):
ports = library.abstract(ports).ports
self.pattern.ports.update(copy.deepcopy(dict(ports)))
if name is not None:
library[name] = self.pattern
@classmethod
def interface(
cls: type['Builder'],
source: PortList | Mapping[str, Port] | str,
*,
library: ILibrary | None = None,
in_prefix: str = 'in_',
out_prefix: str = '',
port_map: dict[str, str] | Sequence[str] | None = None,
name: str | None = None,
) -> 'Builder':
"""
Wrapper for `Pattern.interface()`, which returns a Builder instead.
Args:
source: A collection of ports (e.g. Pattern, Builder, or dict)
from which to create the interface. May be a pattern name if
`library` is provided.
library: Library from which existing patterns should be referenced,
and to which the new one should be added (if named). If not provided,
`source.library` must exist and will be used.
in_prefix: Prepended to port names for newly-created ports with
reversed directions compared to the current device.
out_prefix: Prepended to port names for ports which are directly
copied from the current device.
port_map: Specification for ports to copy into the new device:
- If `None`, all ports are copied.
- If a sequence, only the listed ports are copied
- If a mapping, the listed ports (keys) are copied and
renamed (to the values).
Returns:
The new builder, with an empty pattern and 2x as many ports as
listed in port_map.
Raises:
`PortError` if `port_map` contains port names not present in the
current device.
`PortError` if applying the prefixes results in duplicate port
names.
"""
if library is None:
if hasattr(source, 'library') and isinstance(source.library, ILibrary):
library = source.library
else:
raise BuildError('No library was given, and `source` does not provide one via `source.library` either.')
if isinstance(source, str):
source = library.abstract(source).ports
pat = Pattern.interface(source, in_prefix=in_prefix, out_prefix=out_prefix, port_map=port_map)
new = Builder(library=library, pattern=pat, name=name)
return new
@wraps(Pattern.label)
def label(self, *args, **kwargs) -> Self:
self.pattern.label(*args, **kwargs)
return self
@wraps(Pattern.ref)
def ref(self, *args, **kwargs) -> Self:
self.pattern.ref(*args, **kwargs)
return self
@wraps(Pattern.polygon)
def polygon(self, *args, **kwargs) -> Self:
self.pattern.polygon(*args, **kwargs)
return self
@wraps(Pattern.rect)
def rect(self, *args, **kwargs) -> Self:
self.pattern.rect(*args, **kwargs)
return self
# Note: We're a superclass of `Pather`, where path() means something different...
#@wraps(Pattern.path)
#def path(self, *args, **kwargs) -> Self:
# self.pattern.path(*args, **kwargs)
# return self
def plug(
self,
other: Abstract | str | Pattern | TreeView,
map_in: dict[str, str],
map_out: dict[str, str | None] | None = None,
*,
mirrored: bool = False,
inherit_name: bool = True,
set_rotation: bool | None = None,
append: bool = False,
) -> Self:
"""
Wrapper around `Pattern.plug` which allows a string for `other`.
The `Builder`'s library is used to dereference the string (or `Abstract`, if
one is passed with `append=True`). If a `TreeView` is passed, it is first
added into `self.library`.
Args:
other: An `Abstract`, string, `Pattern`, or `TreeView` describing the
device to be instantiated. If it is a `TreeView`, it is first
added into `self.library`, after which the topcell is plugged;
an equivalent statement is `self.plug(self.library << other, ...)`.
map_in: dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in `other`.
mirrored: Enables mirroring `other` across the x axis prior to
connecting any ports.
inherit_name: If `True`, and `map_in` specifies only a single port,
and `map_out` is `None`, and `other` has only two ports total,
then automatically renames the output port of `other` to the
name of the port from `self` that appears in `map_in`. This
makes it easy to extend a device with simple 2-port devices
(e.g. wires) without providing `map_out` each time `plug` is
called. See "Examples" above for more info. Default `True`.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
Returns:
self
Raises:
`PortError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other`'s ports.
`PortError` if there are any duplicate names after `map_in` and `map_out`
are applied.
`PortError` if the specified port mapping is not achievable (the ports
do not line up)
"""
if self._dead:
logger.error('Skipping plug() since device is dead')
return self
if not isinstance(other, str | Abstract | Pattern):
# We got a Tree; add it into self.library and grab an Abstract for it
other = self.library << other
if isinstance(other, str):
other = self.library.abstract(other)
if append and isinstance(other, Abstract):
other = self.library[other.name]
self.pattern.plug(
other=other,
map_in=map_in,
map_out=map_out,
mirrored=mirrored,
inherit_name=inherit_name,
set_rotation=set_rotation,
append=append,
)
return self
def place(
self,
other: Abstract | str | Pattern | TreeView,
*,
offset: ArrayLike = (0, 0),
rotation: float = 0,
pivot: ArrayLike = (0, 0),
mirrored: bool = False,
port_map: dict[str, str | None] | None = None,
skip_port_check: bool = False,
append: bool = False,
) -> Self:
"""
Wrapper around `Pattern.place` which allows a string or `TreeView` for `other`.
The `Builder`'s library is used to dereference the string (or `Abstract`, if
one is passed with `append=True`). If a `TreeView` is passed, it is first
added into `self.library`.
Args:
other: An `Abstract`, string, `Pattern`, or `TreeView` describing the
device to be instantiated. If it is a `TreeView`, it is first
added into `self.library`, after which the topcell is placed;
an equivalent statement is `self.place(self.library << other, ...)`.
offset: Offset at which to place the instance. Default (0, 0).
rotation: Rotation applied to the instance before placement. Default 0.
pivot: Rotation is applied around this pivot point (default (0, 0)).
Rotation is applied prior to translation (`offset`).
mirrored: Whether the instance should be mirrored across the x axis.
Mirroring is applied before translation and rotation.
port_map: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in the instantiated device. New names can be
`None`, which will delete those ports.
skip_port_check: Can be used to skip the internal call to `check_ports`,
in case it has already been performed elsewhere.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
Returns:
self
Raises:
`PortError` if any ports specified in `port_map` do not exist in `other.ports`.
`PortError` if there are any duplicate names after `port_map` is applied.
"""
if self._dead:
logger.error('Skipping place() since device is dead')
return self
if not isinstance(other, str | Abstract | Pattern):
# We got a Tree; add it into self.library and grab an Abstract for it
other = self.library << other
if isinstance(other, str):
other = self.library.abstract(other)
if append and isinstance(other, Abstract):
other = self.library[other.name]
self.pattern.place(
other=other,
offset=offset,
rotation=rotation,
pivot=pivot,
mirrored=mirrored,
port_map=port_map,
skip_port_check=skip_port_check,
append=append,
)
return self
def translate(self, offset: ArrayLike) -> Self:
"""
Translate the pattern and all ports.
Args:
offset: (x, y) distance to translate by
Returns:
self
"""
self.pattern.translate_elements(offset)
return self
def rotate_around(self, pivot: ArrayLike, angle: float) -> Self:
"""
Rotate the pattern and all ports.
Args:
angle: angle (radians, counterclockwise) to rotate by
pivot: location to rotate around
Returns:
self
"""
self.pattern.rotate_around(pivot, angle)
for port in self.ports.values():
port.rotate_around(pivot, angle)
return self
def mirror(self, axis: int = 0) -> Self:
"""
Mirror the pattern and all ports across the specified axis.
Args:
axis: Axis to mirror across (x=0, y=1)
Returns:
self
"""
self.pattern.mirror(axis)
return self
def set_dead(self) -> Self:
"""
Disallows further changes through `plug()` or `place()`.
This is meant for debugging:
```
dev.plug(a, ...)
dev.set_dead() # added for debug purposes
dev.plug(b, ...) # usually raises an error, but now skipped
dev.plug(c, ...) # also skipped
dev.pattern.visualize() # shows the device as of the set_dead() call
```
Returns:
self
"""
self._dead = True
return self
def __repr__(self) -> str:
s = f'<Builder {self.pattern} L({len(self.library)})>'
return s
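A minimal end-to-end sketch of the `Builder` workflow described above (the 'wire' pattern and its ports are illustrative assumptions, not anything defined in this file):
```python
from numpy import pi
from masque import Builder, Library, Pattern, Port

lib = Library()
lib['wire'] = wire = Pattern()       # assume some geometry is added to this pattern
wire.ports = {'A': Port((0, 0), rotation=0), 'B': Port((10, 0), rotation=pi)}

top = Builder(lib, ports={'in': Port((0, 0), rotation=pi)}, name='top')   # also sets lib['top']
top.plug('wire', {'in': 'A'})        # 'wire' is resolved through top.library; since it has exactly
                                     #   two ports, its remaining port 'B' is renamed to 'in'
```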

892
masque/builder/devices.py Normal file
View File

@ -0,0 +1,892 @@
from typing import Dict, Iterable, List, Tuple, Union, TypeVar, Any, Iterator, Optional, Sequence
from typing import overload, KeysView, ValuesView
import copy
import warnings
import traceback
import logging
from collections import Counter
import numpy
from numpy import pi
from numpy.typing import ArrayLike, NDArray
from ..pattern import Pattern
from ..subpattern import SubPattern
from ..traits import PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable
from ..utils import AutoSlots, rotation_matrix_2d
from ..error import DeviceError
from .tools import Tool
from .utils import ell
logger = logging.getLogger(__name__)
P = TypeVar('P', bound='Port')
D = TypeVar('D', bound='Device')
O = TypeVar('O', bound='Device')
class Port(PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable, metaclass=AutoSlots):
"""
A point at which a `Device` can be snapped to another `Device`.
Each port has an `offset` ((x, y) position) and may also have a
`rotation` (orientation) and a `ptype` (port type).
The `rotation` is an angle, in radians, measured counterclockwise
from the +x axis, pointing inwards into the device which owns the port.
The rotation may be set to `None`, indicating that any orientation is
allowed (e.g. for a DC electrical port). It is stored modulo 2pi.
The `ptype` is an arbitrary string, default of `unk` (unknown).
"""
__slots__ = ('ptype', '_rotation')
_rotation: Optional[float]
""" radians counterclockwise from +x, pointing into device body.
Can be `None` to signify undirected port """
ptype: str
""" Port types must match to be plugged together if both are non-zero """
def __init__(
self,
offset: ArrayLike,
rotation: Optional[float],
ptype: str = 'unk',
) -> None:
self.offset = offset
self.rotation = rotation
self.ptype = ptype
@property
def rotation(self) -> Optional[float]:
""" Rotation, radians counterclockwise, pointing into device body. Can be None. """
return self._rotation
@rotation.setter
def rotation(self, val: float) -> None:
if val is None:
self._rotation = None
else:
if not numpy.size(val) == 1:
raise DeviceError('Rotation must be a scalar')
self._rotation = val % (2 * pi)
def get_bounds(self):
return numpy.vstack((self.offset, self.offset))
def set_ptype(self: P, ptype: str) -> P:
""" Chainable setter for `ptype` """
self.ptype = ptype
return self
def mirror(self: P, axis: int) -> P:
self.offset[1 - axis] *= -1
if self.rotation is not None:
self.rotation *= -1
self.rotation += axis * pi
return self
def rotate(self: P, rotation: float) -> P:
if self.rotation is not None:
self.rotation += rotation
return self
def set_rotation(self: P, rotation: Optional[float]) -> P:
self.rotation = rotation
return self
def __repr__(self) -> str:
if self.rotation is None:
rot = 'any'
else:
rot = str(numpy.rad2deg(self.rotation))
return f'<{self.offset}, {rot}, [{self.ptype}]>'
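(Aside, not part of this file: a minimal sketch of the `Port` fields and chainable transforms described above.)
```python
from numpy import pi
from masque.builder import Port

p = Port((5, 0), rotation=pi, ptype='m1wire')   # points along -x, into the owning device
p.rotate(pi / 2)                                # rotation is stored modulo 2*pi
p.translate((0, 10))                            # offset becomes (5, 10)
print(p)                                        # e.g. <[ 5. 10.], 270.0, [m1wire]>
```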
class Device(Copyable, Mirrorable):
"""
A `Device` is a combination of a `Pattern` with a set of named `Port`s
which can be used to "snap" devices together to make complex layouts.
`Device`s can be as simple as one or two ports (e.g. an electrical pad
or wire), but can also be used to build and represent a large routed
layout (e.g. a logical block with multiple I/O connections or even a
full chip).
For convenience, ports can be read out using square brackets:
- `device['A'] == Port((0, 0), 0)`
- `device[['A', 'B']] == {'A': Port((0, 0), 0), 'B': Port((0, 0), pi)}`
Examples: Creating a Device
===========================
- `Device(pattern, ports={'A': port_a, 'C': port_c})` uses an existing
pattern and defines some ports.
- `Device(name='my_dev_name', ports=None)` makes a new empty pattern with
default ports ('A' and 'B', in opposite directions, at (0, 0)).
- `my_device.build('my_layout')` makes a new pattern and instantiates
`my_device` in it with offset (0, 0) as a base for further building.
- `my_device.as_interface('my_component', port_map=['A', 'B'])` makes a new
(empty) pattern, copies over ports 'A' and 'B' from `my_device`, and
creates additional ports 'in_A' and 'in_B' facing in the opposite
directions. This can be used to build a device which can plug into
`my_device` (using the 'in_*' ports) but which does not itself include
`my_device` as a subcomponent.
Examples: Adding to a Device
============================
- `my_device.plug(subdevice, {'A': 'C', 'B': 'B'}, map_out={'D': 'myport'})`
instantiates `subdevice` into `my_device`, plugging ports 'A' and 'B'
of `my_device` into ports 'C' and 'B' of `subdevice`. The connected ports
are removed and any unconnected ports from `subdevice` are added to
`my_device`. Port 'D' of `subdevice` (unconnected) is renamed to 'myport'.
- `my_device.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport'
of `my_device`. If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out`
argument is provided, and the `inherit_name` argument is not explicitly
set to `False`, the unconnected port of `wire` is automatically renamed to
'myport'. This allows easy extension of existing ports without changing
their names or having to provide `map_out` each time `plug` is called.
- `my_device.place(pad, offset=(10, 10), rotation=pi / 2, port_map={'A': 'gnd'})`
instantiates `pad` at the specified (x, y) offset and with the specified
rotation, adding its ports to those of `my_device`. Port 'A' of `pad` is
renamed to 'gnd' so that further routing can use this signal or net name
rather than the port name on the original `pad` device.
"""
__slots__ = ('pattern', 'ports', 'tools', '_dead')
pattern: Pattern
""" Layout of this device """
ports: Dict[str, Port]
""" Uniquely-named ports which can be used to snap to other Device instances"""
tools: Dict[Optional[str], Tool]
"""
Tool objects are used to dynamically generate new single-use Devices
(e.g. wires or waveguides) to be plugged into this device.
"""
_dead: bool
""" If True, plug()/place() are skipped (for debugging)"""
def __init__(
self,
pattern: Optional[Pattern] = None,
ports: Optional[Dict[str, Port]] = None,
*,
tools: Union[None, Tool, Dict[Optional[str], Tool]] = None,
name: Optional[str] = None,
) -> None:
"""
If `ports` is `None`, two default ports ('A' and 'B') are created.
Both are placed at (0, 0) and have default `ptype`, but 'A' has rotation 0
(attached devices will be placed to the left) and 'B' has rotation
pi (attached devices will be placed to the right).
"""
if pattern is not None:
if name is not None:
raise DeviceError('Only one of `pattern` and `name` may be specified')
self.pattern = pattern
else:
if name is None:
raise DeviceError('Must specify either `pattern` or `name`')
self.pattern = Pattern(name=name)
if ports is None:
self.ports = {
'A': Port([0, 0], rotation=0),
'B': Port([0, 0], rotation=pi),
}
else:
self.ports = copy.deepcopy(ports)
if tools is None:
self.tools = {}
elif isinstance(tools, Tool):
self.tools = {None: tools}
else:
self.tools = tools
self._dead = False
@overload
def __getitem__(self, key: str) -> Port:
pass
@overload
def __getitem__(self, key: Union[List[str], Tuple[str, ...], KeysView[str], ValuesView[str]]) -> Dict[str, Port]:
pass
def __getitem__(self, key: Union[str, Iterable[str]]) -> Union[Port, Dict[str, Port]]:
"""
For convenience, ports can be read out using square brackets:
- `device['A'] == Port((0, 0), 0)`
- `device[['A', 'B']] == {'A': Port((0, 0), 0),
'B': Port((0, 0), pi)}`
"""
if isinstance(key, str):
return self.ports[key]
else:
return {k: self.ports[k] for k in key}
def rename_ports(
self: D,
mapping: Dict[str, Optional[str]],
overwrite: bool = False,
) -> D:
"""
Renames ports as specified by `mapping`.
Ports can be explicitly deleted by mapping them to `None`.
Args:
mapping: Dict of `{'old_name': 'new_name'}` pairs. Names can be mapped
to `None` to perform an explicit deletion. `'new_name'` can also
overwrite an existing non-renamed port to implicitly delete it if
`overwrite` is set to `True`.
overwrite: Allows implicit deletion of ports if set to `True`; see `mapping`.
Returns:
self
"""
if not overwrite:
duplicates = (set(self.ports.keys()) - set(mapping.keys())) & set(mapping.values())
if duplicates:
raise DeviceError(f'Unrenamed ports would be overwritten: {duplicates}')
renamed = {mapping[k]: self.ports.pop(k) for k in mapping.keys()}
if None in renamed:
del renamed[None]
self.ports.update(renamed) # type: ignore
return self
def check_ports(
self: D,
other_names: Iterable[str],
map_in: Optional[Dict[str, str]] = None,
map_out: Optional[Dict[str, Optional[str]]] = None,
) -> D:
"""
Given the provided port mappings, check that:
- All of the ports specified in the mappings exist
- There are no duplicate port names after all the mappings are performed
Args:
other_names: List of port names being considered for inclusion into
`self.ports` (before mapping)
map_in: Dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: Dict of `{'old_name': 'new_name'}` mappings, specifying
new names for unconnected `other_names` ports.
Returns:
self
Raises:
`DeviceError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other_names`.
`DeviceError` if there are any duplicate names after `map_in` and `map_out`
are applied.
"""
if map_in is None:
map_in = {}
if map_out is None:
map_out = {}
other = set(other_names)
missing_inkeys = set(map_in.keys()) - set(self.ports.keys())
if missing_inkeys:
raise DeviceError(f'`map_in` keys not present in device: {missing_inkeys}')
missing_invals = set(map_in.values()) - other
if missing_invals:
raise DeviceError(f'`map_in` values not present in other device: {missing_invals}')
missing_outkeys = set(map_out.keys()) - other
if missing_outkeys:
raise DeviceError(f'`map_out` keys not present in other device: {missing_outkeys}')
orig_remaining = set(self.ports.keys()) - set(map_in.keys())
other_remaining = other - set(map_out.keys()) - set(map_in.values())
mapped_vals = set(map_out.values())
mapped_vals.discard(None)
conflicts_final = orig_remaining & (other_remaining | mapped_vals)
if conflicts_final:
raise DeviceError(f'Device ports conflict with existing ports: {conflicts_final}')
conflicts_partial = other_remaining & mapped_vals
if conflicts_partial:
raise DeviceError(f'`map_out` targets conflict with non-mapped outputs: {conflicts_partial}')
map_out_counts = Counter(map_out.values())
map_out_counts[None] = 0
conflicts_out = {k for k, v in map_out_counts.items() if v > 1}
if conflicts_out:
raise DeviceError(f'Duplicate targets in `map_out`: {conflicts_out}')
return self
def build(self, name: str) -> 'Device':
"""
Begin building a new device around an instance of the current device
(rather than modifying the current device).
Args:
name: A name for the new device
Returns:
The new `Device` object.
"""
pat = Pattern(name)
pat.addsp(self.pattern)
new = Device(pat, ports=self.ports, tools=self.tools)
return new
def as_interface(
self,
name: str,
in_prefix: str = 'in_',
out_prefix: str = '',
port_map: Optional[Union[Dict[str, str], Sequence[str]]] = None
) -> 'Device':
"""
Begin building a new device based on all or some of the ports in the
current device. Do not include the current device; instead use it
to define ports (the "interface") for the new device.
The ports specified by `port_map` (default: all ports) are copied to
the new device, and additional (input) ports are created facing in the
opposite directions. The specified `in_prefix` and `out_prefix` are
prepended to the port names to differentiate them.
By default, the flipped ports are given an 'in_' prefix and unflipped
ports keep their original names, enabling intuitive construction of
a device that will "plug into" the current device; the 'in_*' ports
are used for plugging the devices together while the original port
names are used for building the new device.
Another use-case could be to build the new device using the 'in_'
ports, creating a new device which could be used in place of the
current device.
Args:
name: Name for the new device
in_prefix: Prepended to port names for newly-created ports with
reversed directions compared to the current device.
out_prefix: Prepended to port names for ports which are directly
copied from the current device.
port_map: Specification for ports to copy into the new device:
- If `None`, all ports are copied.
- If a sequence, only the listed ports are copied
- If a mapping, the listed ports (keys) are copied and
renamed (to the values).
Returns:
The new device, with an empty pattern and 2x as many ports as
listed in port_map.
Raises:
`DeviceError` if `port_map` contains port names not present in the
current device.
`DeviceError` if applying the prefixes results in duplicate port
names.
"""
if port_map:
if isinstance(port_map, dict):
missing_inkeys = set(port_map.keys()) - set(self.ports.keys())
orig_ports = {port_map[k]: v for k, v in self.ports.items() if k in port_map}
else:
port_set = set(port_map)
missing_inkeys = port_set - set(self.ports.keys())
orig_ports = {k: v for k, v in self.ports.items() if k in port_set}
if missing_inkeys:
raise DeviceError(f'`port_map` keys not present in device: {missing_inkeys}')
else:
orig_ports = self.ports
ports_in = {f'{in_prefix}{name}': port.deepcopy().rotate(pi)
for name, port in orig_ports.items()}
ports_out = {f'{out_prefix}{name}': port.deepcopy()
for name, port in orig_ports.items()}
duplicates = set(ports_out.keys()) & set(ports_in.keys())
if duplicates:
raise DeviceError(f'Duplicate keys after prefixing, try a different prefix: {duplicates}')
new = Device(name=name, ports={**ports_in, **ports_out}, tools=self.tools)
return new
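(Aside, not part of this file: a short sketch contrasting `build()` and `as_interface()` for a simple two-port device.)
```python
from masque.builder import Device

wg = Device(name='wg')                 # empty pattern with default ports 'A' (rotation 0) and 'B' (rotation pi)
circ = wg.build('circuit')             # new Device containing one instance of wg, plus copies of its ports
iface = wg.as_interface('next_stage')  # new *empty* Device with flipped 'in_A'/'in_B' and copied 'A'/'B'
iface.plug(wg, {'in_A': 'A', 'in_B': 'B'})   # the 'in_*' ports mate with the original device
```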
def plug(
self: D,
other: O,
map_in: Dict[str, str],
map_out: Optional[Dict[str, Optional[str]]] = None,
*,
mirrored: Tuple[bool, bool] = (False, False),
inherit_name: bool = True,
set_rotation: Optional[bool] = None,
) -> D:
"""
Instantiate the device `other` into the current device, connecting
the ports specified by `map_in` and renaming the unconnected
ports specified by `map_out`.
Examples:
=========
- `my_device.plug(subdevice, {'A': 'C', 'B': 'B'}, map_out={'D': 'myport'})`
instantiates `subdevice` into `my_device`, plugging ports 'A' and 'B'
of `my_device` into ports 'C' and 'B' of `subdevice`. The connected ports
are removed and any unconnected ports from `subdevice` are added to
`my_device`. Port 'D' of `subdevice` (unconnected) is renamed to 'myport'.
- `my_device.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport'
of `my_device`. If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out`
argument is provided, and the `inherit_name` argument is not explicitly
set to `False`, the unconnected port of `wire` is automatically renamed to
'myport'. This allows easy extension of existing ports without changing
their names or having to provide `map_out` each time `plug` is called.
Args:
other: A device to instantiate into the current device.
map_in: Dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: Dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in `other`.
mirrored: Enables mirroring `other` across the x or y axes prior
to connecting any ports.
inherit_name: If `True`, and `map_in` specifies only a single port,
and `map_out` is `None`, and `other` has only two ports total,
then automatically renames the output port of `other` to the
name of the port from `self` that appears in `map_in`. This
makes it easy to extend a device with simple 2-port devices
(e.g. wires) without providing `map_out` each time `plug` is
called. See "Examples" above for more info. Default `True`.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
Returns:
self
Raises:
`DeviceError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other_names`.
`DeviceError` if there are any duplicate names after `map_in` and `map_out`
are applied.
`DeviceError` if the specified port mapping is not achievable (the ports
do not line up)
"""
if self._dead:
logger.error('Skipping plug() since device is dead')
return self
if (inherit_name
and not map_out
and len(map_in) == 1
and len(other.ports) == 2):
out_port_name = next(iter(set(other.ports.keys()) - set(map_in.values())))
map_out = {out_port_name: next(iter(map_in.keys()))}
if map_out is None:
map_out = {}
map_out = copy.deepcopy(map_out)
self.check_ports(other.ports.keys(), map_in, map_out)
translation, rotation, pivot = self.find_transform(other, map_in, mirrored=mirrored,
set_rotation=set_rotation)
# get rid of plugged ports
for ki, vi in map_in.items():
del self.ports[ki]
map_out[vi] = None
self.place(other, offset=translation, rotation=rotation, pivot=pivot,
mirrored=mirrored, port_map=map_out, skip_port_check=True)
return self
def place(
self: D,
other: O,
*,
offset: ArrayLike = (0, 0),
rotation: float = 0,
pivot: ArrayLike = (0, 0),
mirrored: Tuple[bool, bool] = (False, False),
port_map: Optional[Dict[str, Optional[str]]] = None,
skip_port_check: bool = False,
) -> D:
"""
Instantiate the device `other` into the current device, adding its
ports to those of the current device (but not connecting any ports).
Mirroring is applied before rotation; translation (`offset`) is applied last.
Examples:
=========
- `my_device.place(pad, offset=(10, 10), rotation=pi / 2, port_map={'A': 'gnd'})`
instantiates `pad` at the specified (x, y) offset and with the specified
rotation, adding its ports to those of `my_device`. Port 'A' of `pad` is
renamed to 'gnd' so that further routing can use this signal or net name
rather than the port name on the original `pad` device.
Args:
other: A device to instantiate into the current device.
offset: Offset at which to place `other`. Default (0, 0).
rotation: Rotation applied to `other` before placement. Default 0.
pivot: Rotation is applied around this pivot point (default (0, 0)).
Rotation is applied prior to translation (`offset`).
mirrored: Whether `other` should be mirrored across the x and y axes.
Mirroring is applied before translation and rotation.
port_map: Dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in `other`. New names can be `None`, which will
delete those ports.
skip_port_check: Can be used to skip the internal call to `check_ports`,
in case it has already been performed elsewhere.
Returns:
self
Raises:
`DeviceError` if any ports specified in `port_map` do not exist in
`other.ports`.
`DeviceError` if there are any duplicate names after `port_map`
is applied.
"""
if self._dead:
logger.error('Skipping place() since device is dead')
return self
if port_map is None:
port_map = {}
if not skip_port_check:
self.check_ports(other.ports.keys(), map_in=None, map_out=port_map)
ports = {}
for name, port in other.ports.items():
new_name = port_map.get(name, name)
if new_name is None:
continue
ports[new_name] = port
for name, port in ports.items():
p = port.deepcopy()
p.mirror2d(mirrored)
p.rotate_around(pivot, rotation)
p.translate(offset)
self.ports[name] = p
sp = SubPattern(other.pattern, mirrored=mirrored)
sp.rotate_around(pivot, rotation)
sp.translate(offset)
self.pattern.subpatterns.append(sp)
return self
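# A short sketch for `place()`; `pad` and the port names are hypothetical:
#
#   my_device.place(pad, offset=(100, 0), rotation=pi / 2, port_map={'A': 'gnd'})
#   # an unconnected instance of `pad`; its port 'A' appears on my_device as 'gnd'
#   my_device.place(pad, offset=(200, 0), port_map={'A': None})
#   # a second instance whose port 'A' is dropped rather than renamed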
def find_transform(
self: D,
other: O,
map_in: Dict[str, str],
*,
mirrored: Tuple[bool, bool] = (False, False),
set_rotation: Optional[bool] = None,
) -> Tuple[NDArray[numpy.float64], float, NDArray[numpy.float64]]:
"""
Given a device `other` and a mapping `map_in` specifying port connections,
find the transform which will correctly align the specified ports.
Args:
other: a device
map_in: Dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
mirrored: Mirrors `other` across the x or y axes prior to
connecting any ports.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
Returns:
- The (x, y) translation (performed last)
- The rotation (radians, counterclockwise)
- The (x, y) pivot point for the rotation
The rotation should be performed before the translation.
"""
s_ports = self[map_in.keys()]
o_ports = other[map_in.values()]
s_offsets = numpy.array([p.offset for p in s_ports.values()])
o_offsets = numpy.array([p.offset for p in o_ports.values()])
s_types = [p.ptype for p in s_ports.values()]
o_types = [p.ptype for p in o_ports.values()]
s_rotations = numpy.array([p.rotation if p.rotation is not None else 0 for p in s_ports.values()])
o_rotations = numpy.array([p.rotation if p.rotation is not None else 0 for p in o_ports.values()])
s_has_rot = numpy.array([p.rotation is not None for p in s_ports.values()], dtype=bool)
o_has_rot = numpy.array([p.rotation is not None for p in o_ports.values()], dtype=bool)
has_rot = s_has_rot & o_has_rot
if mirrored[0]:
o_offsets[:, 1] *= -1
o_rotations *= -1
if mirrored[1]:
o_offsets[:, 0] *= -1
o_rotations *= -1
o_rotations += pi
type_conflicts = numpy.array([st != ot and st != 'unk' and ot != 'unk'
for st, ot in zip(s_types, o_types)])
if type_conflicts.any():
ports = numpy.where(type_conflicts)
msg = 'Ports have conflicting types:\n'
for nn, (k, v) in enumerate(map_in.items()):
if type_conflicts[nn]:
msg += f'{k} | {s_types[nn]}:{o_types[nn]} | {v}\n'
msg = ''.join(traceback.format_stack()) + '\n' + msg
warnings.warn(msg, stacklevel=2)
rotations = numpy.mod(s_rotations - o_rotations - pi, 2 * pi)
if not has_rot.any():
if set_rotation is None:
raise DeviceError('Must provide set_rotation if rotation is indeterminate')
rotations[:] = set_rotation
else:
rotations[~has_rot] = rotations[has_rot][0]
if not numpy.allclose(rotations[:1], rotations):
rot_deg = numpy.rad2deg(rotations)
msg = f'Port orientations do not match:\n'
for nn, (k, v) in enumerate(map_in.items()):
msg += f'{k} | {rot_deg[nn]:g} | {v}\n'
raise DeviceError(msg)
pivot = o_offsets[0].copy()
rotate_offsets_around(o_offsets, pivot, rotations[0])
translations = s_offsets - o_offsets
if not numpy.allclose(translations[:1], translations):
msg = f'Port translations do not match:\n'
for nn, (k, v) in enumerate(map_in.items()):
msg += f'{k} | {translations[nn]} | {v}\n'
raise DeviceError(msg)
return translations[0], rotations[0], o_offsets[0]
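# Sketch of how the returned transform is consumed (this mirrors the internal
# use in `plug()` above; `other` and the port names are hypothetical):
#
#   translation, rotation, pivot = my_device.find_transform(other, {'A': 'in'})
#   my_device.place(other, offset=translation, rotation=rotation, pivot=pivot)
#   # i.e. rotate `other` about `pivot`, then translate -- the same ordering
#   # documented in the Returns section above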
def translate(self: D, offset: ArrayLike) -> D:
"""
Translate the pattern and all ports.
Args:
offset: (x, y) distance to translate by
Returns:
self
"""
self.pattern.translate_elements(offset)
for port in self.ports.values():
port.translate(offset)
return self
def rotate_around(self: D, pivot: ArrayLike, angle: float) -> D:
"""
Rotate the pattern and all ports around the given pivot point.
Args:
pivot: (x, y) location to rotate around
angle: Angle to rotate by (radians, counterclockwise)
Returns:
self
"""
self.pattern.rotate_around(pivot, angle)
for port in self.ports.values():
port.rotate_around(pivot, angle)
return self
def mirror(self: D, axis: int) -> D:
"""
Mirror the pattern and all ports across the specified axis.
Args:
axis: Axis to mirror across (x=0, y=1)
Returns:
self
"""
self.pattern.mirror(axis)
for p in self.ports.values():
p.mirror(axis)
return self
def set_dead(self: D) -> D:
"""
Disallows further changes through `plug()` or `place()`.
This is meant for debugging:
```
dev.plug(a, ...)
dev.set_dead() # added for debug purposes
dev.plug(b, ...) # usually raises an error, but now skipped
dev.plug(c, ...) # also skipped
dev.pattern.visualize() # shows the device as of the set_dead() call
```
Returns:
self
"""
self._dead = True
return self
def rename(self: D, name: str) -> D:
"""
Renames the pattern and returns the device
Args:
name: The new name
Returns:
self
"""
self.pattern.name = name
return self
def __repr__(self) -> str:
s = f'<Device {self.pattern} ['
for name, port in self.ports.items():
s += f'\n\t{name}: {port}'
s += ']>'
return s
def retool(
self: D,
tool: Tool,
keys: Union[Optional[str], Sequence[Optional[str]]] = None,
) -> D:
if keys is None or isinstance(keys, str):
self.tools[keys] = tool
else:
for key in keys:
self.tools[key] = tool
return self
def path(
self: D,
portspec: str,
ccw: Optional[bool],
length: float,
*,
tool_port_names: Sequence[str] = ('A', 'B'),
**kwargs,
) -> D:
if self._dead:
logger.error('Skipping path() since device is dead')
return self
tool = self.tools.get(portspec, self.tools[None])
in_ptype = self.ports[portspec].ptype
dev = tool.path(ccw, length, in_ptype=in_ptype, port_names=tool_port_names, **kwargs)
return self.plug(dev, {portspec: tool_port_names[0]})
def path_to(
self: D,
portspec: str,
ccw: Optional[bool],
position: float,
*,
tool_port_names: Sequence[str] = ('A', 'B'),
**kwargs,
) -> D:
if self._dead:
logger.error('Skipping path_to() since device is dead')
return self
port = self.ports[portspec]
x, y = port.offset
if port.rotation is None:
raise DeviceError(f'Port {portspec} has no rotation and cannot be used for path_to()')
if not numpy.isclose(port.rotation % (pi / 2), 0):
raise DeviceError('path_to was asked to route from non-manhattan port')
is_horizontal = numpy.isclose(port.rotation % pi, 0)
if is_horizontal:
if numpy.sign(numpy.cos(port.rotation)) == numpy.sign(position - x):
raise DeviceError(f'path_to routing to behind source port: x={x:g} to {position:g}')
length = numpy.abs(position - x)
else:
if numpy.sign(numpy.sin(port.rotation)) == numpy.sign(position - y):
raise DeviceError(f'path_to routing to behind source port: y={y:g} to {position:g}')
length = numpy.abs(position - y)
return self.path(portspec, ccw, length, tool_port_names=tool_port_names, **kwargs)
def busL(
self: D,
portspec: Union[str, Sequence[str]],
ccw: Optional[bool],
*,
spacing: Optional[Union[float, ArrayLike]] = None,
set_rotation: Optional[float] = None,
tool_port_names: Sequence[str] = ('A', 'B'),
container_name: str = '_busL',
force_container: bool = False,
**kwargs,
) -> D:
if self._dead:
logger.error('Skipping busL() since device is dead')
return self
bound_types = set()
if 'bound_type' in kwargs:
bound_types.add(kwargs['bound_type'])
bound = kwargs['bound']
for bt in ('emin', 'emax', 'pmin', 'pmax', 'min_past_furthest'):
if bt in kwargs:
bound_types.add(bt)
bound = kwargs[bt]
if not bound_types:
raise DeviceError('No bound type specified for busL')
elif len(bound_types) > 1:
raise DeviceError(f'Too many bound types specified for busL: {bound_types}')
bound_type = tuple(bound_types)[0]
if isinstance(portspec, str):
portspec = [portspec]
ports = self[tuple(portspec)]
extensions = ell(ports, ccw, spacing=spacing, bound=bound, bound_type=bound_type, set_rotation=set_rotation)
if len(ports) == 1 and not force_container:
# Not a bus, so having a container just adds noise to the layout
port_name = tuple(portspec)[0]
return self.path(port_name, ccw, extensions[port_name], tool_port_names=tool_port_names)
else:
dev = Device(name='', ports=ports, tools=self.tools).as_interface(container_name)
for name, length in extensions.items():
dev.path(name, ccw, length, tool_port_names=tool_port_names)
return self.plug(dev, {sp: 'in_' + sp for sp in ports.keys()}) # TODO safe to use 'in_'?
# TODO def path_join() and def bus_join()?
def rotate_offsets_around(
offsets: NDArray[numpy.float64],
pivot: NDArray[numpy.float64],
angle: float,
) -> NDArray[numpy.float64]:
offsets -= pivot
offsets[:] = (rotation_matrix_2d(angle) @ offsets.T).T
offsets += pivot
return offsets

View File

@ -1,694 +0,0 @@
"""
Manual wire/waveguide routing (`Pather`)
"""
from typing import Self
from collections.abc import Sequence, MutableMapping, Mapping
import copy
import logging
from pprint import pformat
import numpy
from numpy import pi
from numpy.typing import ArrayLike
from ..pattern import Pattern
from ..library import ILibrary, SINGLE_USE_PREFIX
from ..error import PortError, BuildError
from ..ports import PortList, Port
from ..abstract import Abstract
from ..utils import SupportsBool, rotation_matrix_2d
from .tools import Tool
from .utils import ell
from .builder import Builder
logger = logging.getLogger(__name__)
class Pather(Builder):
"""
An extension of `Builder` which provides functionality for routing and attaching
single-use patterns (e.g. wires or waveguides) and bundles / buses of such patterns.
`Pather` is mostly concerned with calculating how long each wire should be. It calls
out to `Tool.path` functions provided by subclasses of `Tool` to build the actual patterns.
`Tool`s are assigned on a per-port basis and stored in `.tools`; a key of `None` represents
a "default" `Tool` used for all ports which do not have a port-specific `Tool` assigned.
Examples: Creating a Pather
===========================
- `Pather(library, tools=my_tool)` makes an empty pattern with no ports. The pattern
is not added into `library` and must later be added with e.g.
`library['mypat'] = pather.pattern`.
The default wire/waveguide generating tool for all ports is set to `my_tool`.
- `Pather(library, ports={'in': Port(...), 'out': ...}, name='mypat', tools=my_tool)`
makes an empty pattern, adds the given ports, and places it into `library`
under the name `'mypat'`. The default wire/waveguide generating tool
for all ports is set to `my_tool`
- `Pather(..., tools={'in': top_metal_40um, 'out': bottom_metal_1um, None: my_tool})`
assigns specific tools to individual ports, and `my_tool` as a default for ports
which are not specified.
- `Pather.interface(other_pat, port_map=['A', 'B'], library=library, tools=my_tool)`
makes a new (empty) pattern, copies over ports 'A' and 'B' from
`other_pat`, and creates additional ports 'in_A' and 'in_B' facing
in the opposite directions. This can be used to build a device which
can plug into `other_pat` (using the 'in_*' ports) but which does not
itself include `other_pat` as a subcomponent.
- `Pather.interface(other_pather, ...)` does the same thing as
`Builder.interface(other_builder.pattern, ...)` but also uses
`other_builder.library` as its library by default.
Examples: Adding to a pattern
=============================
- `pather.path('my_port', ccw=True, length=distance)` creates a "wire" for which the output
port is `distance` units away along the axis of `'my_port'` and rotated 90 degrees
counterclockwise (since `ccw=True`) relative to `'my_port'`. The wire is `plug`ged
into the existing `'my_port'`, causing the port to move to the wire's output.
There is no formal guarantee about how far off-axis the output will be located;
there may be a significant width to the bend that is used to accomplish the 90 degree
turn. However, an error is raised if `distance` is too small to fit the bend.
- `pather.path('my_port', ccw=None, length=distance)` creates a straight wire with a length
of `distance` and `plug`s it into `'my_port'`.
- `pather.path_to('my_port', ccw=False, position=position)` creates a wire which starts at
`'my_port'` and has its output at the specified `position`, pointing 90 degrees
clockwise relative to the input. Again, the off-axis position or distance to the
output is not specified, so `position` takes the form of a single coordinate. To
ease debugging, position may be specified as `x=position` or `y=position` and an
error will be raised if the wrong coordinate is given.
- `pather.mpath(['A', 'B', 'C'], ..., spacing=spacing)` is a superset of `path`
and `path_to` which can act on multiple ports simultaneously. Each port's wire is
generated using its own `Tool` (or the default tool if left unspecified).
The output ports are spaced out by `spacing` along the input ports' axis, unless
`ccw=None` is specified (i.e. no bends) in which case they all end at the same
destination coordinate.
- `pather.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport'
of `pather.pattern`. If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out`
argument is provided, and the `inherit_name` argument is not explicitly
set to `False`, the unconnected port of `wire` is automatically renamed to
'myport'. This allows easy extension of existing ports without changing
their names or having to provide `map_out` each time `plug` is called.
- `pather.place(pad, offset=(10, 10), rotation=pi / 2, port_map={'A': 'gnd'})`
instantiates `pad` at the specified (x, y) offset and with the specified
rotation, adding its ports to those of `pather.pattern`. Port 'A' of `pad` is
renamed to 'gnd' so that further routing can use this signal or net name
rather than the port name on the original `pad` device.
- `pather.retool(tool)` or `pather.retool(tool, ['in', 'out', None])` can change
which tool is used for the given ports (or as the default tool). Useful
when placing vias or using multiple waveguide types along a route.
"""
__slots__ = ('tools',)
library: ILibrary
"""
Library from which existing patterns should be referenced, and to which
new ones should be added
"""
tools: dict[str | None, Tool]
"""
Tool objects are used to dynamically generate new single-use `Pattern`s
(e.g wires or waveguides) to be plugged into this device. A key of `None`
indicates the default `Tool`.
"""
def __init__(
self,
library: ILibrary,
*,
pattern: Pattern | None = None,
ports: str | Mapping[str, Port] | None = None,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
name: str | None = None,
) -> None:
"""
Args:
library: The library from which referenced patterns will be taken,
and where new patterns (e.g. generated by the `tools`) will be placed.
pattern: The pattern which will be modified by subsequent operations.
If `None` (default), a new pattern is created.
ports: Allows specifying the initial set of ports, if `pattern` does
not already have any ports (or is not provided). May be a string,
in which case it is interpreted as a name in `library`.
Default `None` (no ports).
tools: A mapping of {port: tool} which specifies what `Tool` should be used
to generate waveguide or wire segments when `path`/`path_to`/`mpath`
are called. Relies on `Tool.path` implementations.
name: If specified, `library[name]` is set to `self.pattern`.
"""
self._dead = False
self.library = library
if pattern is not None:
self.pattern = pattern
else:
self.pattern = Pattern()
if ports is not None:
if self.pattern.ports:
raise BuildError('Ports supplied for pattern with pre-existing ports!')
if isinstance(ports, str):
ports = library.abstract(ports).ports
self.pattern.ports.update(copy.deepcopy(dict(ports)))
if name is not None:
library[name] = self.pattern
if tools is None:
self.tools = {}
elif isinstance(tools, Tool):
self.tools = {None: tools}
else:
self.tools = dict(tools)
@classmethod
def from_builder(
cls: type['Pather'],
builder: Builder,
*,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
) -> 'Pather':
"""
Construct a `Pather` by adding tools to a `Builder`.
Args:
builder: Builder to turn into a Pather
tools: Tools for the `Pather`
Returns:
A new Pather object, using `builder.library` and `builder.pattern`.
"""
new = Pather(library=builder.library, tools=tools, pattern=builder.pattern)
return new
@classmethod
def interface(
cls: type['Pather'],
source: PortList | Mapping[str, Port] | str,
*,
library: ILibrary | None = None,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
in_prefix: str = 'in_',
out_prefix: str = '',
port_map: dict[str, str] | Sequence[str] | None = None,
name: str | None = None,
) -> 'Pather':
"""
Wrapper for `Pattern.interface()`, which returns a Pather instead.
Args:
source: A collection of ports (e.g. Pattern, Builder, or dict)
from which to create the interface. May be a pattern name if
`library` is provided.
library: Library from which existing patterns should be referenced,
and to which the new one should be added (if named). If not provided,
`source.library` must exist and will be used.
tools: `Tool`s which will be used by the pather for generating new wires
or waveguides (via `path`/`path_to`/`mpath`).
in_prefix: Prepended to port names for newly-created ports with
reversed directions compared to the current device.
out_prefix: Prepended to port names for ports which are directly
copied from the current device.
port_map: Specification for ports to copy into the new device:
- If `None`, all ports are copied.
- If a sequence, only the listed ports are copied
- If a mapping, the listed ports (keys) are copied and
renamed (to the values).
Returns:
The new pather, with an empty pattern and 2x as many ports as
listed in port_map.
Raises:
`PortError` if `port_map` contains port names not present in the
current device.
`PortError` if applying the prefixes results in duplicate port
names.
"""
if library is None:
if hasattr(source, 'library') and isinstance(source.library, ILibrary):
library = source.library
else:
raise BuildError('No library provided (and not present in `source.library`)')
if tools is None and hasattr(source, 'tools') and isinstance(source.tools, dict):
tools = source.tools
if isinstance(source, str):
source = library.abstract(source).ports
pat = Pattern.interface(source, in_prefix=in_prefix, out_prefix=out_prefix, port_map=port_map)
new = Pather(library=library, pattern=pat, name=name, tools=tools)
return new
def __repr__(self) -> str:
s = f'<Pather {self.pattern} L({len(self.library)}) {pformat(self.tools)}>'
return s
def retool(
self,
tool: Tool,
keys: str | Sequence[str | None] | None = None,
) -> Self:
"""
Update the `Tool` which will be used when generating `Pattern`s for the ports
given by `keys`.
Args:
tool: The new `Tool` to use for the given ports.
keys: Which ports the tool should apply to. `None` indicates the default tool,
used when there is no matching entry in `self.tools` for the port in question.
Returns:
self
"""
if keys is None or isinstance(keys, str):
self.tools[keys] = tool
else:
for key in keys:
self.tools[key] = tool
return self
def path(
self,
portspec: str,
ccw: SupportsBool | None,
length: float,
*,
tool_port_names: tuple[str, str] = ('A', 'B'),
plug_into: str | None = None,
**kwargs,
) -> Self:
"""
Create a "wire"/"waveguide" and `plug` it into the port `portspec`, with the aim
of traveling exactly `length` distance.
The wire will travel `length` distance along the port's axis, and an unspecified
(tool-dependent) distance in the perpendicular direction. The output port will
be rotated (or not) based on the `ccw` parameter.
Args:
portspec: The name of the port into which the wire will be plugged.
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
length: The total distance from input to output, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
tool_port_names: The names of the ports on the generated pattern. It is unlikely
that you will need to change these. The first port is the input (to be
connected to `portspec`).
plug_into: If not None, attempts to plug the wire's output port into the provided
port on `self`.
Returns:
self
Raises:
BuildError if `length` is too small to fit the bend (if a bend is present).
LibraryError if no valid name could be picked for the pattern.
"""
if self._dead:
logger.error('Skipping path() since device is dead')
return self
tool = self.tools.get(portspec, self.tools[None])
in_ptype = self.pattern[portspec].ptype
tree = tool.path(ccw, length, in_ptype=in_ptype, port_names=tool_port_names, **kwargs)
abstract = self.library << tree
if plug_into is not None:
output = {plug_into: tool_port_names[1]}
else:
output = {}
return self.plug(abstract, {portspec: tool_port_names[0], **output})
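# Example sketch (hypothetical port name 'sig'): a straight segment followed by a bend.
#
#   pather.path('sig', None, 200)   # extend 200 units; output stays on the same axis
#   pather.path('sig', False, 80)   # extend 80 units; output turned 90 degrees clockwise
#   # each call generates a fresh single-use pattern via the port's Tool and plugs
#   # it into 'sig', so 'sig' always tracks the end of the route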
def path_to(
self,
portspec: str,
ccw: SupportsBool | None,
position: float | None = None,
*,
x: float | None = None,
y: float | None = None,
tool_port_names: tuple[str, str] = ('A', 'B'),
plug_into: str | None = None,
**kwargs,
) -> Self:
"""
Create a "wire"/"waveguide" and `plug` it into the port `portspec`, with the aim
of ending exactly at a target position.
The wire will travel so that the output port will be placed at exactly the target
position along the input port's axis. There can be an unspecified (tool-dependent)
offset in the perpendicular direction. The output port will be rotated (or not)
based on the `ccw` parameter.
Args:
portspec: The name of the port into which the wire will be plugged.
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
position: The final port position, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
Only one of `position`, `x`, and `y` may be specified.
x: The final port position along the x axis.
`portspec` must refer to a horizontal port if `x` is passed, otherwise a
BuildError will be raised.
y: The final port position along the y axis.
`portspec` must refer to a vertical port if `y` is passed, otherwise a
BuildError will be raised.
tool_port_names: The names of the ports on the generated pattern. It is unlikely
that you will need to change these. The first port is the input (to be
connected to `portspec`).
plug_into: If not None, attempts to plug the wire's output port into the provided
port on `self`.
Returns:
self
Raises:
BuildError if `position`, `x`, or `y` is too close to fit the bend (if a bend
is present).
BuildError if `x` or `y` is specified but does not match the axis of `portspec`.
BuildError if more than one of `x`, `y`, and `position` is specified.
"""
if self._dead:
logger.error('Skipping path_to() since device is dead')
return self
pos_count = sum(vv is not None for vv in (position, x, y))
if pos_count > 1:
raise BuildError('Only one of `position`, `x`, and `y` may be specified at once')
if pos_count < 1:
raise BuildError('One of `position`, `x`, and `y` must be specified')
port = self.pattern[portspec]
if port.rotation is None:
raise PortError(f'Port {portspec} has no rotation and cannot be used for path_to()')
if not numpy.isclose(port.rotation % (pi / 2), 0):
raise BuildError('path_to was asked to route from non-manhattan port')
is_horizontal = numpy.isclose(port.rotation % pi, 0)
if is_horizontal:
if y is not None:
raise BuildError('Asked to path to y-coordinate, but port is horizontal')
if position is None:
position = x
else:
if x is not None:
raise BuildError('Asked to path to x-coordinate, but port is vertical')
if position is None:
position = y
x0, y0 = port.offset
if is_horizontal:
if numpy.sign(numpy.cos(port.rotation)) == numpy.sign(position - x0):
raise BuildError(f'path_to routing to behind source port: x0={x0:g} to {position:g}')
length = numpy.abs(position - x0)
else:
if numpy.sign(numpy.sin(port.rotation)) == numpy.sign(position - y0):
raise BuildError(f'path_to routing to behind source port: y0={y0:g} to {position:g}')
length = numpy.abs(position - y0)
return self.path(
portspec,
ccw,
length,
tool_port_names=tool_port_names,
plug_into=plug_into,
**kwargs,
)
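# Example sketch (hypothetical horizontal port 'out'):
#
#   pather.path_to('out', None, x=1_000)   # straight run; the port ends up at x == 1000
#   # passing `y=` here would raise BuildError since 'out' is horizontal; a bare
#   # `position` argument skips that axis check entirely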
def path_into(
self,
portspec_src: str,
portspec_dst: str,
*,
tool_port_names: tuple[str, str] = ('A', 'B'),
out_ptype: str | None = None,
plug_destination: bool = True,
**kwargs,
) -> Self:
"""
Create a "wire"/"waveguide" and traveling between the ports `portspec_src` and
`portspec_dst`, and `plug` it into both (or just the source port).
Only unambiguous scenarios are allowed:
- Straight connector between facing ports
- Single 90 degree bend
- Jog between facing ports
(jog is done as late as possible, i.e. only 2 L-shaped segments are used)
By default, the destination's `ptype` will be used as the `out_ptype` for the
wire, and the `portspec_dst` will be plugged (i.e. removed).
Args:
portspec_src: The name of the starting port into which the wire will be plugged.
portspec_dst: The name of the destination port.
tool_port_names: The names of the ports on the generated pattern. It is unlikely
that you will need to change these. The first port is the input (to be
connected to `portspec`).
out_ptype: Passed to the pathing tool in order to specify the desired port type
to be generated at the destination end. If `None` (default), the destination
port's `ptype` will be used.
Returns:
self
Raises:
PortError if either port does not have a specified rotation.
BuildError if an invalid port configuration is encountered:
- Non-manhattan ports
- U-bend
- Destination too close to (or behind) source
"""
if self._dead:
logger.error('Skipping path_into() since device is dead')
return self
port_src = self.pattern[portspec_src]
port_dst = self.pattern[portspec_dst]
if out_ptype is None:
out_ptype = port_dst.ptype
if port_src.rotation is None:
raise PortError(f'Port {portspec_src} has no rotation and cannot be used for path_into()')
if port_dst.rotation is None:
raise PortError(f'Port {portspec_dst} has no rotation and cannot be used for path_into()')
if not numpy.isclose(port_src.rotation % (pi / 2), 0):
raise BuildError('path_into was asked to route from non-manhattan port')
if not numpy.isclose(port_dst.rotation % (pi / 2), 0):
raise BuildError('path_into was asked to route to non-manhattan port')
src_is_horizontal = numpy.isclose(port_src.rotation % pi, 0)
dst_is_horizontal = numpy.isclose(port_dst.rotation % pi, 0)
xs, ys = port_src.offset
xd, yd = port_dst.offset
angle = (port_dst.rotation - port_src.rotation) % (2 * pi)
src_ne = port_src.rotation % (2 * pi) > (3 * pi / 4) # path from src will go north or east
def get_jog(ccw: SupportsBool, length: float) -> float:
tool = self.tools.get(portspec_src, self.tools[None])
in_ptype = 'unk' # Could use port_src.ptype, but we're assuming this is after one bend already...
tree2 = tool.path(ccw, length, in_ptype=in_ptype, port_names=('A', 'B'), out_ptype=out_ptype, **kwargs)
top2 = tree2.top_pattern()
jog = rotation_matrix_2d(top2['A'].rotation) @ (top2['B'].offset - top2['A'].offset)
return jog[1]
dst_extra_args = {'out_ptype': out_ptype}
if plug_destination:
dst_extra_args['plug_into'] = portspec_dst
src_args = {**kwargs, 'tool_port_names': tool_port_names}
dst_args = {**src_args, **dst_extra_args}
if src_is_horizontal and not dst_is_horizontal:
# single bend should suffice
self.path_to(portspec_src, angle > pi, x=xd, **src_args)
self.path_to(portspec_src, None, y=yd, **dst_args)
elif dst_is_horizontal and not src_is_horizontal:
# single bend should suffice
self.path_to(portspec_src, angle > pi, y=yd, **src_args)
self.path_to(portspec_src, None, x=xd, **dst_args)
elif numpy.isclose(angle, pi):
if src_is_horizontal and ys == yd:
# straight connector
self.path_to(portspec_src, None, x=xd, **dst_args)
elif not src_is_horizontal and xs == xd:
# straight connector
self.path_to(portspec_src, None, y=yd, **dst_args)
elif src_is_horizontal:
# figure out how much x our y-segment (2nd) takes up, then path based on that
y_len = numpy.abs(yd - ys)
ccw2 = src_ne != (yd > ys)
jog = get_jog(ccw2, y_len) * numpy.sign(xd - xs)
self.path_to(portspec_src, not ccw2, x=xd - jog, **src_args)
self.path_to(portspec_src, ccw2, y=yd, **dst_args)
else:
# figure out how much y our x-segment (2nd) takes up, then path based on that
x_len = numpy.abs(xd - xs)
ccw2 = src_ne != (xd < xs)
jog = get_jog(ccw2, x_len) * numpy.sign(yd - ys)
self.path_to(portspec_src, not ccw2, y=yd - jog, **src_args)
self.path_to(portspec_src, ccw2, x=xd, **dst_args)
elif numpy.isclose(angle, 0):
raise BuildError('Don\'t know how to route a U-bend at this time!')
else:
raise BuildError(f'Don\'t know how to route ports with relative angle {angle}')
return self
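# Example sketch (hypothetical port names):
#
#   pather.path_into('bus_out', 'amp_in')   # both ports are consumed by the route
#   pather.path_into('bus_out2', 'pad_in', plug_destination=False)
#   # the wire still lands on 'pad_in', but that port is left in place for later use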
def mpath(
self,
portspec: str | Sequence[str],
ccw: SupportsBool | None,
*,
spacing: float | ArrayLike | None = None,
set_rotation: float | None = None,
tool_port_names: tuple[str, str] = ('A', 'B'),
force_container: bool = False,
base_name: str = SINGLE_USE_PREFIX + 'mpath',
**kwargs,
) -> Self:
"""
`mpath` is a superset of `path` and `path_to` which can act on bundles or buses
of "wires or "waveguides".
The wires will travel so that the output ports will be placed at well-defined
locations along the axis of their input ports, but may have arbitrary (tool-
dependent) offsets in the perpendicular direction.
If `ccw` is not `None`, the wire bundle will turn 90 degrees in either the
clockwise (`ccw=False`) or counter-clockwise (`ccw=True`) direction. Within the
bundle, the center-to-center wire spacings after the turn are set by `spacing`,
which is required when `ccw` is not `None`. The final position of the bundle as a
whole can be set in a number of ways:
=A>---------------------------V turn direction: `ccw=False`
=B>-------------V |
=C>-----------------------V |
=D=>----------------V |
|
x---x---x---x `spacing` (can be scalar or array)
<--------------> `emin=`
<------> `bound_type='min_past_furthest', bound=`
<--------------------------------> `emax=`
x `pmin=`
x `pmax=`
- `emin=`, equivalent to `bound_type='min_extension', bound=`
The total extension value for the furthest-out port (B in the diagram).
- `emax=`, equivalent to `bound_type='max_extension', bound=`:
The total extension value for the closest-in port (C in the diagram).
- `pmin=`, equivalent to `xmin=`, `ymin=`, or `bound_type='min_position', bound=`:
The coordinate of the innermost bend (D's bend).
The x/y versions throw an error if they do not match the port axis (for debug)
- `pmax=`, `xmax=`, `ymax=`, or `bound_type='max_position', bound=`:
The coordinate of the outermost bend (A's bend).
The x/y versions throw an error if they do not match the port axis (for debug)
- `bound_type='min_past_furthest', bound=`:
The distance between the furthest-out port (B) and the innermost bend (D's bend).
If `ccw=None`, final output positions (along the input axis) of all wires will be
identical (i.e. wires will all be cut off evenly). In this case, `spacing=None` is
required; `emin=` and `emax=` are equivalent to each other, and
`pmin=`, `pmax=`, `xmin=`, etc. are also equivalent to each other.
Args:
portspec: The names of the ports which are to be routed.
ccw: If `None`, the outputs should be along the same axis as the inputs.
Otherwise, cast to bool and turn 90 degrees counterclockwise if `True`
and clockwise otherwise.
spacing: Center-to-center distance between output ports along the input port's axis.
Must be provided if (and only if) `ccw` is not `None`.
set_rotation: If the provided ports have `rotation=None`, this can be used
to set a rotation for them.
tool_port_names: The names of the ports on the generated pattern. It is unlikely
that you will need to change these. The first port is the input (to be
connected to `portspec`).
force_container: If `False` (default), and only a single port is provided, the
generated wire for that port will be referenced directly, rather than being
wrapped in an additional `Pattern`.
base_name: Name to use for the generated `Pattern`. This will be passed through
`self.library.get_name()` to get a unique name for each new `Pattern`.
Returns:
self
Raises:
BuildError if the implied length for any wire is too close to fit the bend
(if a bend is requested).
BuildError if `xmin`/`xmax` or `ymin`/`ymax` is specified but does not
match the axis of `portspec`.
BuildError if an incorrect bound type or spacing is specified.
"""
if self._dead:
logger.error('Skipping mpath() since device is dead')
return self
bound_types = set()
if 'bound_type' in kwargs:
bound_types.add(kwargs['bound_type'])
bound = kwargs['bound']
del kwargs['bound_type']
del kwargs['bound']
for bt in ('emin', 'emax', 'pmin', 'pmax', 'xmin', 'xmax', 'ymin', 'ymax', 'min_past_furthest'):
if bt in kwargs:
bound_types.add(bt)
bound = kwargs[bt]
del kwargs[bt]
if not bound_types:
raise BuildError('No bound type specified for mpath')
if len(bound_types) > 1:
raise BuildError(f'Too many bound types specified for mpath: {bound_types}')
bound_type = tuple(bound_types)[0]
if isinstance(portspec, str):
portspec = [portspec]
ports = self.pattern[tuple(portspec)]
extensions = ell(ports, ccw, spacing=spacing, bound=bound, bound_type=bound_type, set_rotation=set_rotation)
if len(ports) == 1 and not force_container:
# Not a bus, so having a container just adds noise to the layout
port_name = tuple(portspec)[0]
return self.path(port_name, ccw, extensions[port_name], tool_port_names=tool_port_names, **kwargs)
bld = Pather.interface(source=ports, library=self.library, tools=self.tools)
for port_name, length in extensions.items():
bld.path(port_name, ccw, length, tool_port_names=tool_port_names, **kwargs)
name = self.library.get_name(base_name)
self.library[name] = bld.pattern
return self.plug(Abstract(name, bld.pattern.ports), {sp: 'in_' + sp for sp in ports}) # TODO safe to use 'in_'?
# TODO def bus_join()?
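# Example sketch (hypothetical bus ports 'a0'-'a2'): turn a 3-wire bus clockwise with
# 20-unit output spacing and the innermost bend at x = 500, then extend it straight.
#
#   pather.mpath(['a0', 'a1', 'a2'], ccw=False, spacing=20, xmin=500)
#   pather.mpath(['a0', 'a1', 'a2'], ccw=None, emin=100)   # no bend; outputs cut off evenly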
def flatten(self) -> Self:
"""
Flatten the contained pattern, using the contained library to resolve references.
Returns:
self
"""
self.pattern.flatten(self.library)
return self

View File

@ -0,0 +1,112 @@
"""
Functions for writing port data into a Pattern (`dev2pat`) and retrieving it (`pat2dev`).
These use the format 'name:ptype angle_deg' written into labels, which are placed at
the port locations. This particular approach is just a sensible default; feel free to
write equivalent functions for your own format or alternate storage methods.
"""
from typing import Sequence
import logging
import numpy
from ..pattern import Pattern
from ..label import Label
from ..utils import rotation_matrix_2d, layer_t
from .devices import Device, Port
logger = logging.getLogger(__name__)
def dev2pat(device: Device, layer: layer_t) -> Pattern:
"""
Place a text label at each port location, specifying the port data in the format
'name:ptype angle_deg'
This can be used to debug port locations or to automatically generate ports
when reading in a GDS file.
NOTE that `device` is modified by this function, and `device.pattern` is returned.
Args:
device: The device which is to have its ports labeled. MODIFIED in-place.
layer: The layer on which the labels will be placed.
Returns:
`device.pattern`
"""
for name, port in device.ports.items():
if port.rotation is None:
angle_deg = numpy.inf
else:
angle_deg = numpy.rad2deg(port.rotation)
device.pattern.labels += [
Label(string=f'{name}:{port.ptype} {angle_deg:g}', layer=layer, offset=port.offset)
]
return device.pattern
def pat2dev(
pattern: Pattern,
layers: Sequence[layer_t],
max_depth: int = 999_999,
skip_subcells: bool = True,
) -> Device:
"""
Examine `pattern` for labels specifying port info, and use that info
to build a `Device` object.
Labels are assumed to be placed at the port locations, and have the format
'name:ptype angle_deg'
Args:
pattern: Pattern object to scan for labels.
layers: Search for labels on all the given layers.
max_depth: Maximum hierarchy depth to search. Default 999_999.
Reduce this to 0 to avoid ever searching subcells.
skip_subcells: If port labels are found at a given hierarchy level,
do not continue searching at deeper levels. This allows subcells
to contain their own port info (and thus become their own Devices).
Default True.
Returns:
The constructed Device object. Port labels are not removed from the pattern.
"""
ports = {} # Note: could do a list here, if they're not unique
annotated_cells = set()
def find_ports_each(pat, hierarchy, transform, memo) -> Pattern:
if len(hierarchy) > max_depth - 1:
return pat
if skip_subcells and any(parent in annotated_cells for parent in hierarchy):
return pat
labels = [ll for ll in pat.labels if ll.layer in layers]
if len(labels) == 0:
return pat
if skip_subcells:
annotated_cells.add(pat)
mirr_factor = numpy.array((1, -1)) ** transform[3]
rot_matrix = rotation_matrix_2d(transform[2])
for label in labels:
name, property_string = label.string.split(':')
properties = property_string.split(' ')
ptype = properties[0]
angle_deg = float(properties[1]) if len(ptype) else 0
xy_global = transform[:2] + rot_matrix @ (label.offset * mirr_factor)
angle = numpy.deg2rad(angle_deg) * mirr_factor[0] * mirr_factor[1] + transform[2]
if name in ports:
logger.info(f'Duplicate port {name} in pattern {pattern.name}')
ports[name] = Port(offset=xy_global, rotation=angle, ptype=ptype)
return pat
pattern.dfs(visit_before=find_ports_each, transform=True)
return Device(pattern, ports)
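# Round-trip sketch, assuming `dev` is a Device and layer (3, 0) is reserved for
# port labels (names and layer choice are hypothetical):
#
#   labeled_pat = dev2pat(dev, layer=(3, 0))    # writes 'name:ptype angle_deg' labels
#   # ... write labeled_pat out to GDS, read it back in later as `pat` ...
#   recovered = pat2dev(pat, layers=[(3, 0)])   # rebuilds the Device from the labels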

View File

@ -1,703 +0,0 @@
"""
Pather with batched (multi-step) rendering
"""
from typing import Self
from collections.abc import Sequence, Mapping, MutableMapping
import copy
import logging
from collections import defaultdict
from pprint import pformat
import numpy
from numpy import pi
from numpy.typing import ArrayLike
from ..pattern import Pattern
from ..library import ILibrary
from ..error import PortError, BuildError
from ..ports import PortList, Port
from ..abstract import Abstract
from ..utils import SupportsBool
from .tools import Tool, RenderStep
from .utils import ell
logger = logging.getLogger(__name__)
class RenderPather(PortList):
"""
`RenderPather` is an alternative to `Pather` which uses the `path`/`path_to`/`mpath`
functions to plan out wire paths without incrementally generating the layout. Instead,
it waits until `render` is called, at which point it draws all the planned segments
simultaneously. This allows it to e.g. draw each wire using a single `Path` or
`Polygon` shape instead of multiple rectangles.
`RenderPather` calls out to `Tool.planL` and `Tool.render` to provide tool-specific
dimensions and build the final geometry for each wire. `Tool.planL` provides the
output port data (relative to the input) for each segment. The tool, input and output
ports are placed into a `RenderStep`, and a sequence of `RenderStep`s is stored for
each port. When `render` is called, it bundles `RenderStep`s into batches which use
the same `Tool`, and passes each batch to the relevant tool's `Tool.render` to build
the geometry.
See `Pather` for routing examples. After routing is complete, `render` must be called
to generate the final geometry.
"""
__slots__ = ('pattern', 'library', 'paths', 'tools', '_dead', )
pattern: Pattern
""" Layout of this device """
library: ILibrary
""" Library from which patterns should be referenced """
_dead: bool
""" If True, plug()/place() are skipped (for debugging) """
paths: defaultdict[str, list[RenderStep]]
""" Per-port list of operations, to be used by `render` """
tools: dict[str | None, Tool]
"""
Tool objects are used to dynamically generate new single-use Devices
(e.g wires or waveguides) to be plugged into this device.
"""
@property
def ports(self) -> dict[str, Port]:
return self.pattern.ports
@ports.setter
def ports(self, value: dict[str, Port]) -> None:
self.pattern.ports = value
def __init__(
self,
library: ILibrary,
*,
pattern: Pattern | None = None,
ports: str | Mapping[str, Port] | None = None,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
name: str | None = None,
) -> None:
"""
Args:
library: The library from which referenced patterns will be taken,
and where new patterns (e.g. generated by the `tools`) will be placed.
pattern: The pattern which will be modified by subsequent operations.
If `None` (default), a new pattern is created.
ports: Allows specifying the initial set of ports, if `pattern` does
not already have any ports (or is not provided). May be a string,
in which case it is interpreted as a name in `library`.
Default `None` (no ports).
tools: A mapping of {port: tool} which specifies what `Tool` should be used
to generate waveguide or wire segments when `path`/`path_to`/`mpath`
are called. Relies on `Tool.planL` and `Tool.render` implementations.
name: If specified, `library[name]` is set to `self.pattern`.
"""
self._dead = False
self.paths = defaultdict(list)
self.library = library
if pattern is not None:
self.pattern = pattern
else:
self.pattern = Pattern()
if ports is not None:
if self.pattern.ports:
raise BuildError('Ports supplied for pattern with pre-existing ports!')
if isinstance(ports, str):
if library is None:
raise BuildError('Ports given as a string, but `library` was `None`!')
ports = library.abstract(ports).ports
self.pattern.ports.update(copy.deepcopy(dict(ports)))
if name is not None:
if library is None:
raise BuildError('Name was supplied, but no library was given!')
library[name] = self.pattern
if tools is None:
self.tools = {}
elif isinstance(tools, Tool):
self.tools = {None: tools}
else:
self.tools = dict(tools)
@classmethod
def interface(
cls: type['RenderPather'],
source: PortList | Mapping[str, Port] | str,
*,
library: ILibrary | None = None,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
in_prefix: str = 'in_',
out_prefix: str = '',
port_map: dict[str, str] | Sequence[str] | None = None,
name: str | None = None,
) -> 'RenderPather':
"""
Wrapper for `Pattern.interface()`, which returns a RenderPather instead.
Args:
source: A collection of ports (e.g. Pattern, Builder, or dict)
from which to create the interface. May be a pattern name if
`library` is provided.
library: Library from which existing patterns should be referenced,
and to which the new one should be added (if named). If not provided,
`source.library` must exist and will be used.
tools: `Tool`s which will be used by the pather for generating new wires
or waveguides (via `path`/`path_to`/`mpath`).
in_prefix: Prepended to port names for newly-created ports with
reversed directions compared to the current device.
out_prefix: Prepended to port names for ports which are directly
copied from the current device.
port_map: Specification for ports to copy into the new device:
- If `None`, all ports are copied.
- If a sequence, only the listed ports are copied
- If a mapping, the listed ports (keys) are copied and
renamed (to the values).
Returns:
The new `RenderPather`, with an empty pattern and 2x as many ports as
listed in port_map.
Raises:
`PortError` if `port_map` contains port names not present in the
current device.
`PortError` if applying the prefixes results in duplicate port
names.
"""
if library is None:
if hasattr(source, 'library') and isinstance(source.library, ILibrary):
library = source.library
else:
raise BuildError('No library provided (and not present in `source.library`)')
if tools is None and hasattr(source, 'tools') and isinstance(source.tools, dict):
tools = source.tools
if isinstance(source, str):
source = library.abstract(source).ports
pat = Pattern.interface(source, in_prefix=in_prefix, out_prefix=out_prefix, port_map=port_map)
new = RenderPather(library=library, pattern=pat, name=name, tools=tools)
return new
def plug(
self,
other: Abstract | str,
map_in: dict[str, str],
map_out: dict[str, str | None] | None = None,
*,
mirrored: bool = False,
inherit_name: bool = True,
set_rotation: bool | None = None,
append: bool = False,
) -> Self:
"""
Wrapper for `Pattern.plug` which adds a `RenderStep` with opcode 'P'
for any affected ports. This separates any future `RenderStep`s on the
same port into a new batch, since the plugged device interferes with drawing.
Args:
other: An `Abstract`, string, or `Pattern` describing the device to be instantiated.
map_in: dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in `other`.
mirrored: Enables mirroring `other` across the x axis prior to
connecting any ports.
inherit_name: If `True`, and `map_in` specifies only a single port,
and `map_out` is `None`, and `other` has only two ports total,
then automatically renames the output port of `other` to the
name of the port from `self` that appears in `map_in`. This
makes it easy to extend a device with simple 2-port devices
(e.g. wires) without providing `map_out` each time `plug` is
called. See "Examples" above for more info. Default `True`.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
Returns:
self
Raises:
`PortError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other_names`.
`PortError` if there are any duplicate names after `map_in` and `map_out`
are applied.
`PortError` if the specified port mapping is not achievable (the ports
do not line up)
"""
if self._dead:
logger.error('Skipping plug() since device is dead')
return self
other_tgt: Pattern | Abstract
if isinstance(other, str):
other_tgt = self.library.abstract(other)
else:
other_tgt = other
if append and isinstance(other_tgt, Abstract):
other_tgt = self.library[other_tgt.name]
# get rid of plugged ports
for kk in map_in:
if kk in self.paths:
self.paths[kk].append(RenderStep('P', None, self.ports[kk].copy(), self.ports[kk].copy(), None))
plugged = map_in.values()
for name, port in other_tgt.ports.items():
if name in plugged:
continue
new_name = map_out.get(name, name) if map_out is not None else name
if new_name is not None and new_name in self.paths:
self.paths[new_name].append(RenderStep('P', None, port.copy(), port.copy(), None))
self.pattern.plug(
other=other_tgt,
map_in=map_in,
map_out=map_out,
mirrored=mirrored,
inherit_name=inherit_name,
set_rotation=set_rotation,
append=append,
)
return self
def place(
self,
other: Abstract | str,
*,
offset: ArrayLike = (0, 0),
rotation: float = 0,
pivot: ArrayLike = (0, 0),
mirrored: bool = False,
port_map: dict[str, str | None] | None = None,
skip_port_check: bool = False,
append: bool = False,
) -> Self:
"""
Wrapper for `Pattern.place` which adds a `RenderStep` with opcode 'P'
for any affected ports. This separates any future `RenderStep`s on the
same port into a new batch, since the placed device interferes with drawing.
Note that mirroring is applied before rotation; translation (`offset`) is applied last.
Args:
other: An `Abstract` or `Pattern` describing the device to be instatiated.
offset: Offset at which to place the instance. Default (0, 0).
rotation: Rotation applied to the instance before placement. Default 0.
pivot: Rotation is applied around this pivot point (default (0, 0)).
Rotation is applied prior to translation (`offset`).
mirrored: Whether the instance should be mirrored across the x axis.
Mirroring is applied before translation and rotation.
port_map: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in the instantiated pattern. New names can be
`None`, which will delete those ports.
skip_port_check: Can be used to skip the internal call to `check_ports`,
in case it has already been performed elsewhere.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
Returns:
self
Raises:
`PortError` if any ports specified in `port_map` do not exist in
`other.ports`.
`PortError` if there are any duplicate names after `port_map`
is applied.
"""
if self._dead:
logger.error('Skipping place() since device is dead')
return self
other_tgt: Pattern | Abstract
if isinstance(other, str):
other_tgt = self.library.abstract(other)
else:
other_tgt = other
if append and isinstance(other_tgt, Abstract):
other_tgt = self.library[other_tgt.name]
for name, port in other_tgt.ports.items():
new_name = port_map.get(name, name) if port_map is not None else name
if new_name is not None and new_name in self.paths:
self.paths[new_name].append(RenderStep('P', None, port.copy(), port.copy(), None))
self.pattern.place(
other=other_tgt,
offset=offset,
rotation=rotation,
pivot=pivot,
mirrored=mirrored,
port_map=port_map,
skip_port_check=skip_port_check,
append=append,
)
return self
def retool(
self,
tool: Tool,
keys: str | Sequence[str | None] | None = None,
) -> Self:
"""
Update the `Tool` which will be used when generating `Pattern`s for the ports
given by `keys`.
Args:
tool: The new `Tool` to use for the given ports.
keys: Which ports the tool should apply to. `None` indicates the default tool,
used when there is no matching entry in `self.tools` for the port in question.
Returns:
self
"""
if keys is None or isinstance(keys, str):
self.tools[keys] = tool
else:
for key in keys:
self.tools[key] = tool
return self
def path(
self,
portspec: str,
ccw: SupportsBool | None,
length: float,
**kwargs,
) -> Self:
"""
Plan a "wire"/"waveguide" extending from the port `portspec`, with the aim
of traveling exactly `length` distance.
The wire will travel `length` distance along the port's axis, and an unspecified
(tool-dependent) distance in the perpendicular direction. The output port will
be rotated (or not) based on the `ccw` parameter.
`RenderPather.render` must be called after all paths have been fully planned.
Args:
portspec: The name of the port into which the wire will be plugged.
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
length: The total distance from input to output, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
Returns:
self
Raises:
BuildError if `length` is too small to fit the bend (if a bend is present).
LibraryError if no valid name could be picked for the pattern.
"""
if self._dead:
logger.error('Skipping path() since device is dead')
return self
port = self.pattern[portspec]
in_ptype = port.ptype
port_rot = port.rotation
assert port_rot is not None # TODO allow manually setting rotation for RenderPather.path()?
tool = self.tools.get(portspec, self.tools[None])
# ask the tool for bend size (fill missing dx or dy), check feasibility, and get out_ptype
out_port, data = tool.planL(ccw, length, in_ptype=in_ptype, **kwargs)
# Update port
out_port.rotate_around((0, 0), pi + port_rot)
out_port.translate(port.offset)
step = RenderStep('L', tool, port.copy(), out_port.copy(), data)
self.paths[portspec].append(step)
self.pattern.ports[portspec] = out_port.copy()
return self
def path_to(
self,
portspec: str,
ccw: SupportsBool | None,
position: float | None = None,
*,
x: float | None = None,
y: float | None = None,
**kwargs,
) -> Self:
"""
Plan a "wire"/"waveguide" extending from the port `portspec`, with the aim
of ending exactly at a target position.
The wire will travel so that the output port will be placed at exactly the target
position along the input port's axis. There can be an unspecified (tool-dependent)
offset in the perpendicular direction. The output port will be rotated (or not)
based on the `ccw` parameter.
`RenderPather.render` must be called after all paths have been fully planned.
Args:
portspec: The name of the port into which the wire will be plugged.
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
position: The final port position, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
Only one of `position`, `x`, and `y` may be specified.
x: The final port position along the x axis.
`portspec` must refer to a horizontal port if `x` is passed, otherwise a
BuildError will be raised.
y: The final port position along the y axis.
`portspec` must refer to a vertical port if `y` is passed, otherwise a
BuildError will be raised.
Returns:
self
Raises:
BuildError if `position`, `x`, or `y` is too close to fit the bend (if a bend
is present).
BuildError if `x` or `y` is specified but does not match the axis of `portspec`.
BuildError if more than one of `x`, `y`, and `position` is specified.
"""
if self._dead:
logger.error('Skipping path_to() since device is dead')
return self
pos_count = sum(vv is not None for vv in (position, x, y))
if pos_count > 1:
raise BuildError('Only one of `position`, `x`, and `y` may be specified at once')
if pos_count < 1:
raise BuildError('One of `position`, `x`, and `y` must be specified')
port = self.pattern[portspec]
if port.rotation is None:
raise PortError(f'Port {portspec} has no rotation and cannot be used for path_to()')
if not numpy.isclose(port.rotation % (pi / 2), 0):
raise BuildError('path_to was asked to route from non-manhattan port')
is_horizontal = numpy.isclose(port.rotation % pi, 0)
if is_horizontal:
if y is not None:
raise BuildError('Asked to path to y-coordinate, but port is horizontal')
if position is None:
position = x
else:
if x is not None:
raise BuildError('Asked to path to x-coordinate, but port is vertical')
if position is None:
position = y
x0, y0 = port.offset
if is_horizontal:
if numpy.sign(numpy.cos(port.rotation)) == numpy.sign(position - x0):
raise BuildError(f'path_to routing to behind source port: x0={x0:g} to {position:g}')
length = numpy.abs(position - x0)
else:
if numpy.sign(numpy.sin(port.rotation)) == numpy.sign(position - y0):
raise BuildError(f'path_to routing to behind source port: y0={y0:g} to {position:g}')
length = numpy.abs(position - y0)
return self.path(portspec, ccw, length, **kwargs)
def mpath(
self,
portspec: str | Sequence[str],
ccw: SupportsBool | None,
*,
spacing: float | ArrayLike | None = None,
set_rotation: float | None = None,
**kwargs,
) -> Self:
"""
`mpath` is a superset of `path` and `path_to` which can act on bundles or buses
of "wires or "waveguides".
See `Pather.mpath` for details.
Args:
portspec: The names of the ports which are to be routed.
ccw: If `None`, the outputs should be along the same axis as the inputs.
Otherwise, cast to bool and turn 90 degrees counterclockwise if `True`
and clockwise otherwise.
spacing: Center-to-center distance between output ports along the input port's axis.
Must be provided if (and only if) `ccw` is not `None`.
set_rotation: If the provided ports have `rotation=None`, this can be used
to set a rotation for them.
Returns:
self
Raises:
BuildError if the implied length for any wire is too close to fit the bend
(if a bend is requested).
BuildError if `xmin`/`xmax` or `ymin`/`ymax` is specified but does not
match the axis of `portspec`.
BuildError if an incorrect bound type or spacing is specified.
"""
if self._dead:
logger.error('Skipping mpath() since device is dead')
return self
bound_types = set()
if 'bound_type' in kwargs:
bound_types.add(kwargs['bound_type'])
bound = kwargs['bound']
for bt in ('emin', 'emax', 'pmin', 'pmax', 'xmin', 'xmax', 'ymin', 'ymax', 'min_past_furthest'):
if bt in kwargs:
bound_types.add(bt)
bound = kwargs[bt]
if not bound_types:
raise BuildError('No bound type specified for mpath')
if len(bound_types) > 1:
raise BuildError(f'Too many bound types specified for mpath: {bound_types}')
bound_type = tuple(bound_types)[0]
if isinstance(portspec, str):
portspec = [portspec]
ports = self.pattern[tuple(portspec)]
extensions = ell(ports, ccw, spacing=spacing, bound=bound, bound_type=bound_type, set_rotation=set_rotation)
if len(ports) == 1:
# Not a bus, so having a container just adds noise to the layout
port_name = tuple(portspec)[0]
self.path(port_name, ccw, extensions[port_name])
else:
for port_name, length in extensions.items():
self.path(port_name, ccw, length)
return self
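For example (a hedged sketch; the port names and numbers are made up), a three-wire bus can be turned 90 degrees in a single call:
```python
# Hypothetical sketch: turn the bus {'d0', 'd1', 'd2'} counterclockwise with
# 2-unit channel spacing, placing the innermost bend at x=5000.
pather.mpath(['d0', 'd1', 'd2'], ccw=True, spacing=2, xmin=5_000)
```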
def render(
self,
append: bool = True,
) -> Self:
"""
Generate the geometry which has been planned out with `path`/`path_to`/etc.
Args:
append: If `True`, the rendered geometry will be directly appended to
`self.pattern`. Note that it will not be flattened, so only one
layer of hierarchy is eliminated.
Returns:
self
"""
lib = self.library
tool_port_names = ('A', 'B')
pat = Pattern()
def render_batch(portspec: str, batch: list[RenderStep], append: bool) -> None:
assert batch[0].tool is not None
name = lib << batch[0].tool.render(batch, port_names=tool_port_names)
pat.ports[portspec] = batch[0].start_port.copy()
if append:
pat.plug(lib[name], {portspec: tool_port_names[0]}, append=append)
del lib[name] # NOTE if the rendered pattern has refs, those are now in `pat` but not flattened
else:
pat.plug(lib.abstract(name), {portspec: tool_port_names[0]}, append=append)
for portspec, steps in self.paths.items():
batch: list[RenderStep] = []
for step in steps:
appendable_op = step.opcode in ('L', 'S', 'U')
same_tool = batch and step.tool == batch[0].tool
# If we can't continue a batch, render it
if batch and (not appendable_op or not same_tool):
render_batch(portspec, batch, append)
batch = []
# batch is emptied already if we couldn't continue it
if appendable_op:
batch.append(step)
# Opcodes which break the batch go below this line
if not appendable_op and portspec in pat.ports:
del pat.ports[portspec]
# Render the final batch, if it hasn't been rendered yet
if batch:
render_batch(portspec, batch, append)
self.paths.clear()
pat.ports.clear()
self.pattern.append(pat)
return self
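The intended flow (sketched below with hypothetical names) is to queue up routing operations and then call `render()` once at the end:
```python
# Hypothetical sketch of the plan-then-render workflow.
pather.path('out', ccw=None, length=500)      # queued as a RenderStep
pather.path_to('out', ccw=False, x=3_000)     # also queued
pather.render()                               # all queued steps become geometry
top_pattern = pather.pattern                  # rendered geometry lives here
```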
def translate(self, offset: ArrayLike) -> Self:
"""
Translate the pattern and all ports.
Args:
offset: (x, y) distance to translate by
Returns:
self
"""
self.pattern.translate_elements(offset)
return self
def rotate_around(self, pivot: ArrayLike, angle: float) -> Self:
"""
Rotate the pattern and all ports.
Args:
angle: angle (radians, counterclockwise) to rotate by
pivot: location to rotate around
Returns:
self
"""
self.pattern.rotate_around(pivot, angle)
return self
def mirror(self, axis: int) -> Self:
"""
Mirror the pattern and all ports across the specified axis.
Args:
axis: Axis to mirror across (x=0, y=1)
Returns:
self
"""
self.pattern.mirror(axis)
return self
def set_dead(self) -> Self:
"""
Disallows further changes through `plug()` or `place()`.
This is meant for debugging:
```
dev.plug(a, ...)
dev.set_dead() # added for debug purposes
dev.plug(b, ...) # usually raises an error, but now skipped
dev.plug(c, ...) # also skipped
dev.pattern.visualize() # shows the device as of the set_dead() call
```
Returns:
self
"""
self._dead = True
return self
def __repr__(self) -> str:
s = f'<RenderPather {self.pattern} L({len(self.library)}) {pformat(self.tools)}>'
return s

View File

@ -1,553 +1,22 @@
""" """
Tools are objects which dynamically generate simple single-use devices (e.g. wires or waveguides) Tools are objects which dynamically generate simple single-use devices (e.g. wires or waveguides)
# TODO document all tools
""" """
from typing import Literal, Any from typing import TYPE_CHECKING, Optional, Sequence
from collections.abc import Sequence, Callable
from abc import ABCMeta # , abstractmethod # TODO any way to make Tool ok with implementing only one method?
from dataclasses import dataclass
import numpy if TYPE_CHECKING:
from numpy.typing import NDArray from .devices import Device
from numpy import pi
from ..utils import SupportsBool, rotation_matrix_2d, layer_t
from ..ports import Port
from ..pattern import Pattern
from ..abstract import Abstract
from ..library import ILibrary, Library, SINGLE_USE_PREFIX
from ..error import BuildError
@dataclass(frozen=True, slots=True)
class RenderStep:
"""
Representation of a single saved operation, used by `RenderPather` and passed
to `Tool.render()` when `RenderPather.render()` is called.
"""
opcode: Literal['L', 'S', 'U', 'P']
""" What operation is being performed.
L: planL (straight, optionally with a single bend)
S: planS (s-bend)
U: planU (u-bend)
P: plug
"""
tool: 'Tool | None'
""" The current tool. May be `None` if `opcode='P'` """
start_port: Port
end_port: Port
data: Any
""" Arbitrary tool-specific data"""
def __post_init__(self) -> None:
if self.opcode != 'P' and self.tool is None:
raise BuildError('Got tool=None but the opcode is not "P"')
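A minimal sketch of the invariant enforced above (import paths are assumed and may differ):
```python
from numpy import pi
from masque.ports import Port                 # import path assumed
from masque.builder.tools import RenderStep   # import path assumed
from masque.error import BuildError           # import path assumed

a = Port((0, 0), rotation=0, ptype='unk')
b = Port((10, 0), rotation=pi, ptype='unk')

RenderStep('P', tool=None, start_port=a, end_port=b, data=None)    # ok: plug steps carry no tool
try:
    RenderStep('L', tool=None, start_port=a, end_port=b, data=None)
except BuildError:
    pass                                      # non-'P' opcodes must reference a tool
```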
class Tool: class Tool:
"""
Interface for path (e.g. wire or waveguide) generation.
Note that subclasses may implement only a subset of the methods and leave others
unimplemented (e.g. in cases where they don't make sense or the required components
are impractical or unavailable).
"""
def path( def path(
self, self,
ccw: SupportsBool | None, ccw: Optional[bool],
length: float, length: float,
*, *,
in_ptype: str | None = None, in_ptype: Optional[str] = None,
out_ptype: str | None = None, out_ptype: Optional[str] = None,
port_names: tuple[str, str] = ('A', 'B'), port_names: Sequence[str] = ('A', 'B'),
**kwargs, **kwargs,
) -> Library: ) -> 'Device':
"""
Create a wire or waveguide that travels exactly `length` distance along the axis
of its input port.
Used by `Pather`.
The output port must be exactly `length` away along the input port's axis, but
may be placed an additional (unspecified) distance away along the perpendicular
direction. The output port should be rotated (or not) based on the value of
`ccw`.
The input and output ports should be compatible with `in_ptype` and
`out_ptype`, respectively. They should also be named `port_names[0]` and
`port_names[1]`, respectively.
Args:
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
length: The total distance from input to output, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
in_ptype: The `ptype` of the port into which this wire's input will be `plug`ged.
out_ptype: The `ptype` of the port into which this wire's output will be `plug`ged.
port_names: The output pattern will have its input port named `port_names[0]` and
its output named `port_names[1]`.
kwargs: Custom tool-specific parameters.
Returns:
A pattern tree containing the requested L-shaped (or straight) wire or waveguide
Raises:
BuildError if an impossible or unsupported geometry is requested.
"""
raise NotImplementedError(f'path() not implemented for {type(self)}') raise NotImplementedError(f'path() not implemented for {type(self)}')
def planL(
self,
ccw: SupportsBool | None,
length: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
**kwargs,
) -> tuple[Port, Any]:
"""
Plan a wire or waveguide that travels exactly `length` distance along the axis
of its input port.
Used by `RenderPather`.
The output port must be exactly `length` away along the input port's axis, but
may be placed an additional (unspecified) distance away along the perpendicular
direction. The output port should be rotated (or not) based on the value of
`ccw`.
The input and output ports should be compatible with `in_ptype` and
`out_ptype`, respectively.
Args:
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
length: The total distance from input to output, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
in_ptype: The `ptype` of the port into which this wire's input will be `plug`ged.
out_ptype: The `ptype` of the port into which this wire's output will be `plug`ged.
kwargs: Custom tool-specific parameters.
Returns:
The calculated output `Port` for the wire.
Any tool-specific data, to be stored in `RenderStep.data`, for use during rendering.
Raises:
BuildError if an impossible or unsupported geometry is requested.
"""
raise NotImplementedError(f'planL() not implemented for {type(self)}')
def planS(
self,
length: float,
jog: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
**kwargs,
) -> tuple[Port, Any]:
"""
Plan a wire or waveguide that travels exactly `length` distance along the axis
of its input port and `jog` distance along the perpendicular axis (i.e. an S-bend).
Used by `RenderPather`.
The output port must have an orientation rotated by pi from the input port.
The input and output ports should be compatible with `in_ptype` and
`out_ptype`, respectively.
Args:
length: The total distance from input to output, along the input's axis only.
jog: The total offset from the input to output, along the perpendicular axis.
A positive number implies a rightwards shift (i.e. clockwise bend followed
by a counterclockwise bend)
in_ptype: The `ptype` of the port into which this wire's input will be `plug`ged.
out_ptype: The `ptype` of the port into which this wire's output will be `plug`ged.
kwargs: Custom tool-specific parameters.
Returns:
The calculated output `Port` for the wire.
Any tool-specific data, to be stored in `RenderStep.data`, for use during rendering.
Raises:
BuildError if an impossible or unsupported geometry is requested.
"""
raise NotImplementedError(f'planS() not implemented for {type(self)}')
def planU(
self,
jog: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
**kwargs,
) -> tuple[Port, Any]:
"""
# NOTE: TODO: U-bend is WIP; this interface may change in the future.
Plan a wire or waveguide that travels exactly `jog` distance along the axis
perpendicular to its input port (i.e. a U-bend).
Used by `RenderPather`.
The output port must have an orientation identical to the input port.
The input and output ports should be compatible with `in_ptype` and
`out_ptype`, respectively.
Args:
jog: The total offset from the input to output, along the perpendicular axis.
A positive number implies a rightwards shift (i.e. clockwise bend followed
by a counterclockwise bend)
in_ptype: The `ptype` of the port into which this wire's input will be `plug`ged.
out_ptype: The `ptype` of the port into which this wire's output will be `plug`ged.
kwargs: Custom tool-specific parameters.
Returns:
The calculated output `Port` for the wire.
Any tool-specific data, to be stored in `RenderStep.data`, for use during rendering.
Raises:
BuildError if an impossible or unsupported geometry is requested.
"""
raise NotImplementedError(f'planU() not implemented for {type(self)}')
def render(
self,
batch: Sequence[RenderStep],
*,
port_names: Sequence[str] = ('A', 'B'), # noqa: ARG002 (unused)
**kwargs, # noqa: ARG002 (unused)
) -> ILibrary:
"""
Render the provided `batch` of `RenderStep`s into geometry, returning a tree
(a Library with a single topcell).
Args:
batch: A sequence of `RenderStep` objects containing the ports and data
provided by this tool's `planL`/`planS`/`planU` functions.
port_names: The topcell's input and output ports should be named
`port_names[0]` and `port_names[1]` respectively.
kwargs: Custom tool-specific parameters.
"""
assert not batch or batch[0].tool == self
raise NotImplementedError(f'render() not implemented for {type(self)}')
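As noted in the class docstring, a subclass may implement only the methods it needs. A hedged sketch of a minimal tool that only plans straight segments (illustrative only; nothing below is part of masque, and import paths are assumed):
```python
from typing import Any
from numpy import pi
from masque.ports import Port                 # import paths assumed
from masque.builder.tools import Tool
from masque.error import BuildError

class StraightOnlyTool(Tool):
    """ Plans straight runs only; bends and transitions are unsupported. """
    def planL(self, ccw, length: float, *, in_ptype=None, out_ptype=None, **kwargs) -> tuple[Port, Any]:
        if ccw is not None:
            raise BuildError('StraightOnlyTool cannot make bends')
        # Output port sits `length` away along the input axis, facing back toward it.
        return Port((length, 0), rotation=pi, ptype='unk'), None
```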
abstract_tuple_t = tuple[Abstract, str, str]
@dataclass
class BasicTool(Tool, metaclass=ABCMeta):
"""
A simple tool which relies on a single pre-rendered `bend` pattern, a function
for generating straight paths, and a table of pre-rendered `transitions` for converting
from non-native ptypes.
"""
straight: tuple[Callable[[float], Pattern], str, str]
""" `create_straight(length: float), in_port_name, out_port_name` """
bend: abstract_tuple_t # Assumed to be clockwise
""" `clockwise_bend_abstract, in_port_name, out_port_name` """
transitions: dict[str, abstract_tuple_t]
""" `{ptype: (transition_abstract`, ptype_port_name, other_port_name), ...}` """
default_out_ptype: str
""" Default value for out_ptype """
@dataclass(frozen=True, slots=True)
class LData:
""" Data for planL """
straight_length: float
ccw: SupportsBool | None
in_transition: abstract_tuple_t | None
out_transition: abstract_tuple_t | None
def path(
self,
ccw: SupportsBool | None,
length: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
port_names: tuple[str, str] = ('A', 'B'),
**kwargs,
) -> Library:
_out_port, data = self.planL(
ccw,
length,
in_ptype=in_ptype,
out_ptype=out_ptype,
)
gen_straight, sport_in, sport_out = self.straight
tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path')
pat.add_port_pair(names=port_names)
if data.in_transition:
ipat, iport_theirs, _iport_ours = data.in_transition
pat.plug(ipat, {port_names[1]: iport_theirs})
if not numpy.isclose(data.straight_length, 0):
straight = tree <= {SINGLE_USE_PREFIX + 'straight': gen_straight(data.straight_length, **kwargs)}
pat.plug(straight, {port_names[1]: sport_in})
if data.ccw is not None:
bend, bport_in, bport_out = self.bend
pat.plug(bend, {port_names[1]: bport_in}, mirrored=bool(ccw))
if data.out_transition:
opat, oport_theirs, oport_ours = data.out_transition
pat.plug(opat, {port_names[1]: oport_ours})
return tree
def planL(
self,
ccw: SupportsBool | None,
length: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
**kwargs, # noqa: ARG002 (unused)
) -> tuple[Port, LData]:
# TODO check all the math for L-shaped bends
if ccw is not None:
bend, bport_in, bport_out = self.bend
angle_in = bend.ports[bport_in].rotation
angle_out = bend.ports[bport_out].rotation
assert angle_in is not None
assert angle_out is not None
bend_dxy = rotation_matrix_2d(-angle_in) @ (
bend.ports[bport_out].offset
- bend.ports[bport_in].offset
)
bend_angle = angle_out - angle_in
if bool(ccw):
bend_dxy[1] *= -1
bend_angle *= -1
else:
bend_dxy = numpy.zeros(2)
bend_angle = 0
in_transition = self.transitions.get('unk' if in_ptype is None else in_ptype, None)
if in_transition is not None:
ipat, iport_theirs, iport_ours = in_transition
irot = ipat.ports[iport_theirs].rotation
assert irot is not None
itrans_dxy = rotation_matrix_2d(-irot) @ (
ipat.ports[iport_ours].offset
- ipat.ports[iport_theirs].offset
)
else:
itrans_dxy = numpy.zeros(2)
out_transition = self.transitions.get('unk' if out_ptype is None else out_ptype, None)
if out_transition is not None:
opat, oport_theirs, oport_ours = out_transition
orot = opat.ports[oport_ours].rotation
assert orot is not None
otrans_dxy = rotation_matrix_2d(-orot + bend_angle) @ (
opat.ports[oport_theirs].offset
- opat.ports[oport_ours].offset
)
else:
otrans_dxy = numpy.zeros(2)
if out_transition is not None:
out_ptype_actual = opat.ports[oport_theirs].ptype
elif ccw is not None:
out_ptype_actual = bend.ports[bport_out].ptype
else:
out_ptype_actual = self.default_out_ptype
straight_length = length - bend_dxy[0] - itrans_dxy[0] - otrans_dxy[0]
bend_run = bend_dxy[1] + itrans_dxy[1] + otrans_dxy[1]
if straight_length < 0:
raise BuildError(
f'Asked to draw path with total length {length:,g}, shorter than required bends and transitions:\n'
f'bend: {bend_dxy[0]:,g} in_trans: {itrans_dxy[0]:,g} out_trans: {otrans_dxy[0]:,g}'
)
data = self.LData(straight_length, ccw, in_transition, out_transition)
out_port = Port((length, bend_run), rotation=bend_angle, ptype=out_ptype_actual)
return out_port, data
def render(
self,
batch: Sequence[RenderStep],
*,
port_names: Sequence[str] = ('A', 'B'),
append: bool = True,
**kwargs,
) -> ILibrary:
tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path')
pat.add_port_pair(names=(port_names[0], port_names[1]))
gen_straight, sport_in, _sport_out = self.straight
for step in batch:
straight_length, ccw, in_transition, out_transition = step.data
assert step.tool == self
if step.opcode == 'L':
if in_transition:
ipat, iport_theirs, _iport_ours = in_transition
pat.plug(ipat, {port_names[1]: iport_theirs})
if not numpy.isclose(straight_length, 0):
straight_pat = gen_straight(straight_length, **kwargs)
if append:
pat.plug(straight_pat, {port_names[1]: sport_in}, append=True)
else:
straight = tree <= {SINGLE_USE_PREFIX + 'straight': straight_pat}
pat.plug(straight, {port_names[1]: sport_in}, append=True)
if ccw is not None:
bend, bport_in, bport_out = self.bend
pat.plug(bend, {port_names[1]: bport_in}, mirrored=bool(ccw))
if out_transition:
opat, oport_theirs, oport_ours = out_transition
pat.plug(opat, {port_names[1]: oport_ours})
return tree
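A hedged sketch of how these pieces might be wired together; the `bend_abstract` and `taper_abstract` objects are placeholders for pre-rendered cells from a real design library and are not defined here, and import paths are assumed:
```python
from numpy import pi
from masque import Pattern                    # import paths assumed
from masque.ports import Port
from masque.builder.tools import BasicTool

def make_straight(length: float) -> Pattern:
    # Simple straight waveguide on layer (1, 0) with ports at both ends.
    pat = Pattern()
    pat.path(layer=(1, 0), width=0.5, vertices=[(0, 0), (length, 0)])
    pat.ports = {
        'in': Port((0, 0), rotation=0, ptype='wg'),
        'out': Port((length, 0), rotation=pi, ptype='wg'),
    }
    return pat

tool = BasicTool(
    straight=(make_straight, 'in', 'out'),
    bend=(bend_abstract, 'in', 'out'),            # placeholder: clockwise 90-degree bend Abstract
    transitions={'metal': (taper_abstract, 'metal_side', 'wg_side')},   # placeholder taper
    default_out_ptype='wg',
)
```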
@dataclass
class PathTool(Tool, metaclass=ABCMeta):
"""
A tool which draws `Path` geometry elements.
If `planL` / `render` are used, the `Path` elements can cover >2 vertices;
with `path` only individual rectangles will be drawn.
"""
layer: layer_t
""" Layer to draw on """
width: float
""" `Path` width """
ptype: str = 'unk'
""" ptype for any ports in patterns generated by this tool """
#@dataclass(frozen=True, slots=True)
#class LData:
# dxy: NDArray[numpy.float64]
#def __init__(self, layer: layer_t, width: float, ptype: str = 'unk') -> None:
# Tool.__init__(self)
# self.layer = layer
# self.width = width
# self.ptype: str
def path(
self,
ccw: SupportsBool | None,
length: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
port_names: tuple[str, str] = ('A', 'B'),
**kwargs, # noqa: ARG002 (unused)
) -> Library:
out_port, dxy = self.planL(
ccw,
length,
in_ptype=in_ptype,
out_ptype=out_ptype,
)
tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path')
pat.path(layer=self.layer, width=self.width, vertices=[(0, 0), (length, 0)])
if ccw is None:
out_rot = pi
elif bool(ccw):
out_rot = -pi / 2
else:
out_rot = pi / 2
pat.ports = {
port_names[0]: Port((0, 0), rotation=0, ptype=self.ptype),
port_names[1]: Port(dxy, rotation=out_rot, ptype=self.ptype),
}
return tree
def planL(
self,
ccw: SupportsBool | None,
length: float,
*,
in_ptype: str | None = None, # noqa: ARG002 (unused)
out_ptype: str | None = None,
**kwargs, # noqa: ARG002 (unused)
) -> tuple[Port, NDArray[numpy.float64]]:
# TODO check all the math for L-shaped bends
if out_ptype and out_ptype != self.ptype:
raise BuildError(f'Requested {out_ptype=} does not match path ptype {self.ptype}')
if ccw is not None:
bend_dxy = numpy.array([1, -1]) * self.width / 2
bend_angle = pi / 2
if bool(ccw):
bend_dxy[1] *= -1
bend_angle *= -1
else:
bend_dxy = numpy.zeros(2)
bend_angle = pi
straight_length = length - bend_dxy[0]
bend_run = bend_dxy[1]
if straight_length < 0:
raise BuildError(
f'Asked to draw path with total length {length:,g}, shorter than required bend: {bend_dxy[0]:,g}'
)
data = numpy.array((length, bend_run))
out_port = Port(data, rotation=bend_angle, ptype=self.ptype)
return out_port, data
def render(
self,
batch: Sequence[RenderStep],
*,
port_names: Sequence[str] = ('A', 'B'),
**kwargs, # noqa: ARG002 (unused)
) -> ILibrary:
path_vertices = [batch[0].start_port.offset]
for step in batch:
assert step.tool == self
port_rot = step.start_port.rotation
assert port_rot is not None
if step.opcode == 'L':
length, bend_run = step.data
dxy = rotation_matrix_2d(port_rot + pi) @ (length, 0)
#path_vertices.append(step.start_port.offset)
path_vertices.append(step.start_port.offset + dxy)
else:
raise BuildError(f'Unrecognized opcode "{step.opcode}"')
if (path_vertices[-1] != batch[-1].end_port.offset).any():
# If the path ends in a bend, we need to add the final vertex
path_vertices.append(batch[-1].end_port.offset)
tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path')
pat.path(layer=self.layer, width=self.width, vertices=path_vertices)
pat.ports = {
port_names[0]: batch[0].start_port.copy().rotate(pi),
port_names[1]: batch[-1].end_port.copy().rotate(pi),
}
return tree
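A hedged usage sketch for `PathTool` (import path assumed):
```python
from masque.builder.tools import PathTool     # import path assumed

tool = PathTool(layer=(2, 0), width=0.5, ptype='m1')
tree = tool.path(ccw=None, length=100)        # single-use tree holding one straight Path
out_port, data = tool.planL(ccw=True, length=100)   # plan an L-bend; `data` is the (length, run) offset
```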

View File

@ -1,27 +1,26 @@
from typing import SupportsFloat, cast, TYPE_CHECKING from typing import Dict, Tuple, List, Optional, Union, Any, cast, Sequence, TYPE_CHECKING
from collections.abc import Mapping, Sequence
from pprint import pformat from pprint import pformat
import numpy import numpy
from numpy import pi from numpy import pi
from numpy.typing import ArrayLike, NDArray from numpy.typing import ArrayLike
from ..utils import rotation_matrix_2d, SupportsBool from ..utils import rotation_matrix_2d
from ..error import BuildError from ..error import BuildError
if TYPE_CHECKING: if TYPE_CHECKING:
from ..ports import Port from .devices import Port
def ell( def ell(
ports: Mapping[str, 'Port'], ports: Dict[str, 'Port'],
ccw: SupportsBool | None, ccw: Optional[bool],
bound_type: str, bound_type: str,
bound: float | ArrayLike, bound: Union[float, ArrayLike],
*, *,
spacing: float | ArrayLike | None = None, spacing: Optional[Union[float, ArrayLike]] = None,
set_rotation: float | None = None, set_rotation: Optional[float] = None,
) -> dict[str, float]: ) -> Dict[str, float]:
""" """
Calculate extension for each port in order to build a 90-degree bend with the provided Calculate extension for each port in order to build a 90-degree bend with the provided
channel spacing: channel spacing:
@ -54,9 +53,9 @@ def ell(
The distance between furthest out-port (B) and the innermost bend (D's bend). The distance between furthest out-port (B) and the innermost bend (D's bend).
- 'max_extension' or 'emax': - 'max_extension' or 'emax':
The total extension value for the closest-in port (C in the diagram). The total extension value for the closest-in port (C in the diagram).
- 'min_position', 'pmin', 'xmin', 'ymin': - 'min_position' or 'pmin':
The coordinate of the innermost bend (D's bend). The coordinate of the innermost bend (D's bend).
- 'max_position', 'pmax', 'xmax', 'ymax': - 'max_position' or 'pmax':
The coordinate of the outermost bend (A's bend). The coordinate of the outermost bend (A's bend).
`bound` can also be a vector. If specifying an extension (e.g. 'min_extension', `bound` can also be a vector. If specifying an extension (e.g. 'min_extension',
@ -110,12 +109,6 @@ def ell(
raise BuildError('set_rotation must be specified if no ports have rotations!') raise BuildError('set_rotation must be specified if no ports have rotations!')
rotations = numpy.full_like(has_rotation, set_rotation, dtype=float) rotations = numpy.full_like(has_rotation, set_rotation, dtype=float)
is_horizontal = numpy.isclose(rotations[0] % pi, 0)
if bound_type in ('ymin', 'ymax') and is_horizontal:
raise BuildError(f'Asked for {bound_type} position but ports are pointing along the x-axis!')
if bound_type in ('xmin', 'xmax') and not is_horizontal:
raise BuildError(f'Asked for {bound_type} position but ports are pointing along the y-axis!')
direction = rotations[0] + pi # direction we want to travel in (+pi relative to port) direction = rotations[0] + pi # direction we want to travel in (+pi relative to port)
rot_matrix = rotation_matrix_2d(-direction) rot_matrix = rotation_matrix_2d(-direction)
@ -123,8 +116,6 @@ def ell(
orig_offsets = numpy.array([p.offset for p in ports.values()]) orig_offsets = numpy.array([p.offset for p in ports.values()])
rot_offsets = (rot_matrix @ orig_offsets.T).T rot_offsets = (rot_matrix @ orig_offsets.T).T
# ordering_base = rot_offsets.T * [[1], [-1 if ccw else 1]] # could work, but this is actually a more complex routing problem
# y_order = numpy.lexsort(ordering_base) # (need to make sure we don't collide with the next input port @ same y)
y_order = ((-1 if ccw else 1) * rot_offsets[:, 1]).argsort(kind='stable') y_order = ((-1 if ccw else 1) * rot_offsets[:, 1]).argsort(kind='stable')
y_ind = numpy.empty_like(y_order, dtype=int) y_ind = numpy.empty_like(y_order, dtype=int)
y_ind[y_order] = numpy.arange(y_ind.shape[0]) y_ind[y_order] = numpy.arange(y_ind.shape[0])
@ -144,7 +135,6 @@ def ell(
# D-----------| `d_to_align[3]` # D-----------| `d_to_align[3]`
# #
d_to_align = x_start.max() - x_start # distance to travel to align all d_to_align = x_start.max() - x_start # distance to travel to align all
offsets: NDArray[numpy.float64]
if bound_type == 'min_past_furthest': if bound_type == 'min_past_furthest':
# A------------------V `d_to_exit[0]` # A------------------V `d_to_exit[0]`
# B-----V `d_to_exit[1]` # B-----V `d_to_exit[1]`
@ -164,7 +154,6 @@ def ell(
travel = d_to_align - (ch_offsets.max() - ch_offsets) travel = d_to_align - (ch_offsets.max() - ch_offsets)
offsets = travel - travel.min().clip(max=0) offsets = travel - travel.min().clip(max=0)
rot_bound: SupportsFloat
if bound_type in ('emin', 'min_extension', if bound_type in ('emin', 'min_extension',
'emax', 'max_extension', 'emax', 'max_extension',
'min_past_furthest',): 'min_past_furthest',):
@ -193,16 +182,15 @@ def ell(
rot_bound = -bound if neg else bound rot_bound = -bound if neg else bound
min_possible = x_start + offsets min_possible = x_start + offsets
if bound_type in ('pmax', 'max_position', 'xmax', 'ymax'): if bound_type in ('pmax', 'max_position'):
extension = rot_bound - min_possible.max() extension = rot_bound - min_possible.max()
elif bound_type in ('pmin', 'min_position', 'xmin', 'ymin'): elif bound_type in ('pmin', 'min_position'):
extension = rot_bound - min_possible.min() extension = rot_bound - min_possible.min()
offsets += extension offsets += extension
if extension < 0: if extension < 0:
ext_floor = -numpy.floor(extension) raise BuildError(f'Position is too close by at least {-numpy.floor(extension)}. Total extensions would be'
raise BuildError(f'Position is too close by at least {ext_floor}. Total extensions would be\n\t' + '\n\t'.join(f'{key}: {off}' for key, off in zip(ports.keys(), offsets)))
+ '\n\t'.join(f'{key}: {off}' for key, off in zip(ports.keys(), offsets, strict=True)))
result = dict(zip(ports.keys(), offsets, strict=True)) result = dict(zip(ports.keys(), offsets))
return result return result
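A hedged sketch of calling `ell()` directly (it is normally driven by `mpath()`); import paths are assumed:
```python
from masque.ports import Port                 # import paths assumed
from masque.builder.utils import ell

ports = {
    'a': Port((0, 0), rotation=0, ptype='unk'),
    'b': Port((0, 10), rotation=0, ptype='unk'),
}
# Extensions letting both wires bend counterclockwise with 2-unit channel spacing,
# ending at least 5 units past the furthest-out port.
extensions = ell(ports, ccw=True, spacing=2, bound_type='min_past_furthest', bound=5)
```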

View File

@ -11,6 +11,13 @@ class PatternError(MasqueError):
""" """
pass pass
class PatternLockedError(PatternError):
"""
Exception raised when trying to modify a locked pattern
"""
def __init__(self):
PatternError.__init__(self, 'Tried to modify a locked Pattern, subpattern, or shape')
class LibraryError(MasqueError): class LibraryError(MasqueError):
""" """
@ -19,21 +26,22 @@ class LibraryError(MasqueError):
pass pass
class DeviceLibraryError(MasqueError):
"""
Exception raised by DeviceLibrary classes
"""
pass
class DeviceError(MasqueError):
"""
Exception raised by Device and Port objects
"""
pass
class BuildError(MasqueError): class BuildError(MasqueError):
""" """
Exception raised by builder-related functions Exception raised by builder-related functions
""" """
pass pass
class PortError(MasqueError):
"""
Exception raised by port-related functions
"""
pass
class OneShotError(MasqueError):
"""
Exception raised when a function decorated with `@oneshot` is called more than once
"""
def __init__(self, func_name: str) -> None:
Exception.__init__(self, f'Function "{func_name}" with @oneshot was called more than once')

View File

@ -1,51 +1,45 @@
""" """
DXF file format readers and writers DXF file format readers and writers
Notes:
* Gzip modification time is set to 0 (start of current epoch, usually 1970-01-01)
* ezdxf sets creation time, write time, $VERSIONGUID, and $FINGERPRINTGUID
to unique values, so byte-for-byte reproducibility is not achievable for now
""" """
from typing import Any, cast, TextIO, IO from typing import List, Any, Dict, Tuple, Callable, Union, Sequence, Iterable
from collections.abc import Mapping, Callable import re
import io import io
import base64
import struct
import logging import logging
import pathlib import pathlib
import gzip import gzip
import numpy import numpy # type: ignore
import ezdxf import ezdxf # type: ignore
from ezdxf.enums import TextEntityAlignment
from ezdxf.entities import LWPolyline, Polyline, Text, Insert
from .utils import is_gzipped, tmpfile from .. import Pattern, SubPattern, PatternError, Label, Shape
from .. import Pattern, Ref, PatternError, Label from ..shapes import Polygon, Path
from ..library import ILibraryView, LibraryView, Library
from ..shapes import Shape, Polygon, Path
from ..repetition import Grid from ..repetition import Grid
from ..utils import rotation_matrix_2d, layer_t, normalize_mirror from ..utils import rotation_matrix_2d, layer_t
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
logger.warning('DXF support is experimental!') logger.warning('DXF support is experimental and only slightly tested!')
DEFAULT_LAYER = 'DEFAULT' DEFAULT_LAYER = 'DEFAULT'
def write( def write(
library: Mapping[str, Pattern], # TODO could allow library=None for flat DXF pattern: Pattern,
top_name: str, stream: io.TextIOBase,
stream: TextIO,
*, *,
dxf_version: str = 'AC1024', modify_originals: bool = False,
dxf_version='AC1024',
disambiguate_func: Callable[[Iterable[Pattern]], None] = None,
) -> None: ) -> None:
""" """
Write a `Pattern` to a DXF file, by first calling `.polygonize()` to change the shapes Write a `Pattern` to a DXF file, by first calling `.polygonize()` to change the shapes
into polygons, and then writing patterns as DXF `Block`s, polygons as `LWPolyline`s, into polygons, and then writing patterns as DXF `Block`s, polygons as `LWPolyline`s,
and refs as `Insert`s. and subpatterns as `Insert`s.
The top level pattern's name is not written to the DXF file. Nested patterns keep their The top level pattern's name is not written to the DXF file. Nested patterns keep their
names. names.
@ -55,61 +49,60 @@ def write(
tuple: (1, 2) -> '1.2' tuple: (1, 2) -> '1.2'
str: '1.2' -> '1.2' (no change) str: '1.2' -> '1.2' (no change)
DXF does not support shape repetition (only block repetition). Please call It is often a good idea to run `pattern.subpatternize()` prior to calling this function,
library.wrap_repeated_shapes() before writing to file. especially if calling `.polygonize()` will result in very many vertices.
Other functions you may want to call: If you want pattern polygonized with non-default arguments, just call `pattern.polygonize()`
- `masque.file.oasis.check_valid_names(library.keys())` to check for invalid names prior to calling this function.
- `library.dangling_refs()` to check for references to missing patterns
- `pattern.polygonize()` for any patterns with shapes other
than `masque.shapes.Polygon` or `masque.shapes.Path`
Only `Grid` repetition objects with manhattan basis vectors are preserved as arrays. Since DXF Only `Grid` repetition objects with manhattan basis vectors are preserved as arrays. Since DXF
rotations apply to basis vectors while `masque`'s rotations do not, the basis vectors of an rotations apply to basis vectors while `masque`'s rotations do not, the basis vectors of an
array with rotated instances must be manhattan _after_ having a compensating rotation applied. array with rotated instances must be manhattan _after_ having a compensating rotation applied.
Args: Args:
library: A {name: Pattern} mapping of patterns. Only `top_name` and patterns referenced patterns: A Pattern or list of patterns to write to the stream.
by it are written.
top_name: Name of the top-level pattern to write.
stream: Stream object to write to. stream: Stream object to write to.
modify_original: If `True`, the original pattern is modified as part of the writing
process. Otherwise, a copy is made and `deepunlock()`-ed.
Default `False`.
disambiguate_func: Function which takes a list of patterns and alters them
to make their names valid and unique. Default is `disambiguate_pattern_names`.
WARNING: No additional error checking is performed on the results.
""" """
#TODO consider supporting DXF arcs? #TODO consider supporting DXF arcs?
if not isinstance(library, ILibraryView): if disambiguate_func is None:
if isinstance(library, dict): disambiguate_func = lambda pats: disambiguate_pattern_names(pats)
library = LibraryView(library) assert(disambiguate_func is not None)
else:
library = LibraryView(dict(library))
pattern = library[top_name] if not modify_originals:
subtree = library.subtree(top_name) pattern = pattern.deepcopy().deepunlock()
# Get a dict of id(pattern) -> pattern
patterns_by_id = pattern.referenced_patterns_by_id()
disambiguate_func(patterns_by_id.values())
# Create library # Create library
lib = ezdxf.new(dxf_version, setup=True) lib = ezdxf.new(dxf_version, setup=True)
msp = lib.modelspace() msp = lib.modelspace()
_shapes_to_elements(msp, pattern.shapes) _shapes_to_elements(msp, pattern.shapes)
_labels_to_texts(msp, pattern.labels) _labels_to_texts(msp, pattern.labels)
_mrefs_to_drefs(msp, pattern.refs) _subpatterns_to_refs(msp, pattern.subpatterns)
# Now create a block for each referenced pattern, and add in any shapes # Now create a block for each referenced pattern, and add in any shapes
for name, pat in subtree.items(): for pat in patterns_by_id.values():
assert pat is not None assert(pat is not None)
if name == top_name: block = lib.blocks.new(name=pat.name)
continue
block = lib.blocks.new(name=name)
_shapes_to_elements(block, pat.shapes) _shapes_to_elements(block, pat.shapes)
_labels_to_texts(block, pat.labels) _labels_to_texts(block, pat.labels)
_mrefs_to_drefs(block, pat.refs) _subpatterns_to_refs(block, pat.subpatterns)
lib.write(stream) lib.write(stream)
def writefile( def writefile(
library: Mapping[str, Pattern], pattern: Pattern,
top_name: str, filename: Union[str, pathlib.Path],
filename: str | pathlib.Path,
*args, *args,
**kwargs, **kwargs,
) -> None: ) -> None:
@ -119,42 +112,30 @@ def writefile(
Will automatically compress the file if it has a .gz suffix. Will automatically compress the file if it has a .gz suffix.
Args: Args:
library: A {name: Pattern} mapping of patterns. Only `top_name` and patterns referenced pattern: `Pattern` to save
by it are written.
top_name: Name of the top-level pattern to write.
filename: Filename to save to. filename: Filename to save to.
*args: passed to `dxf.write` *args: passed to `dxf.write`
**kwargs: passed to `dxf.write` **kwargs: passed to `dxf.write`
""" """
path = pathlib.Path(filename) path = pathlib.Path(filename)
gz_stream: IO[bytes]
with tmpfile(path) as base_stream:
streams: tuple[Any, ...] = (base_stream,)
if path.suffix == '.gz': if path.suffix == '.gz':
gz_stream = cast(IO[bytes], gzip.GzipFile(filename='', mtime=0, fileobj=base_stream, mode='wb')) open_func: Callable = gzip.open
streams = (gz_stream,) + streams
else: else:
gz_stream = base_stream open_func = open
stream = io.TextIOWrapper(gz_stream) # type: ignore
streams = (stream,) + streams
try: with open_func(path, mode='wt') as stream:
write(library, top_name, stream, *args, **kwargs) write(pattern, stream, *args, **kwargs)
finally:
for ss in streams:
ss.close()
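A hedged sketch of the intended call pattern (the `mylib` library is assumed to already exist, and the import path may differ):
```python
from masque.file import dxf                   # import path assumed

dxf.writefile(mylib, 'top', 'layout.dxf.gz')  # writes 'top' and everything it references
lib, info = dxf.readfile('layout.dxf.gz')     # returns (Library, {'layers': [...]})
```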
def readfile( def readfile(
filename: str | pathlib.Path, filename: Union[str, pathlib.Path],
*args, *args,
**kwargs, **kwargs,
) -> tuple[Library, dict[str, Any]]: ) -> Tuple[Pattern, Dict[str, Any]]:
""" """
Wrapper for `dxf.read()` that takes a filename or path instead of a stream. Wrapper for `dxf.read()` that takes a filename or path instead of a stream.
Will automatically decompress gzipped files. Will automatically decompress files with a .gz suffix.
Args: Args:
filename: Filename to save to. filename: Filename to save to.
@ -162,7 +143,7 @@ def readfile(
**kwargs: passed to `dxf.read` **kwargs: passed to `dxf.read`
""" """
path = pathlib.Path(filename) path = pathlib.Path(filename)
if is_gzipped(path): if path.suffix == '.gz':
open_func: Callable = gzip.open open_func: Callable = gzip.open
else: else:
open_func = open open_func = open
@ -173,17 +154,21 @@ def readfile(
def read( def read(
stream: TextIO, stream: io.TextIOBase,
) -> tuple[Library, dict[str, Any]]: clean_vertices: bool = True,
) -> Tuple[Pattern, Dict[str, Any]]:
""" """
Read a dxf file and translate it into a dict of `Pattern` objects. DXF `Block`s are Read a dxf file and translate it into a dict of `Pattern` objects. DXF `Block`s are
translated into `Pattern` objects; `LWPolyline`s are translated into polygons, and `Insert`s translated into `Pattern` objects; `LWPolyline`s are translated into polygons, and `Insert`s
are translated into `Ref` objects. are translated into `SubPattern` objects.
If an object has no layer it is set to this module's `DEFAULT_LAYER` ("DEFAULT"). If an object has no layer it is set to this module's `DEFAULT_LAYER` ("DEFAULT").
Args: Args:
stream: Stream to read from. stream: Stream to read from.
clean_vertices: If `True`, remove any redundant vertices when loading polygons.
The cleaning process removes any polygons with zero area or <3 vertices.
Default `True`.
Returns: Returns:
- Top level pattern - Top level pattern
@ -191,165 +176,163 @@ def read(
lib = ezdxf.read(stream) lib = ezdxf.read(stream)
msp = lib.modelspace() msp = lib.modelspace()
top_name, top_pat = _read_block(msp) pat = _read_block(msp, clean_vertices)
mlib = Library({top_name: top_pat}) patterns = [pat] + [_read_block(bb, clean_vertices) for bb in lib.blocks if bb.name != '*Model_Space']
for bb in lib.blocks:
if bb.name == '*Model_Space':
continue
name, pat = _read_block(bb)
mlib[name] = pat
library_info = dict( # Create a dict of {pattern.name: pattern, ...}, then fix up all subpattern.pattern entries
layers=[ll.dxfattribs() for ll in lib.layers], # according to the subpattern.identifier (which is deleted after use).
) patterns_dict = dict(((p.name, p) for p in patterns))
for p in patterns_dict.values():
for sp in p.subpatterns:
sp.pattern = patterns_dict[sp.identifier[0]]
del sp.identifier
return mlib, library_info library_info = {
'layers': [ll.dxfattribs() for ll in lib.layers]
}
return pat, library_info
def _read_block(block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace) -> tuple[str, Pattern]: def _read_block(block, clean_vertices: bool) -> Pattern:
name = block.name pat = Pattern(block.name)
pat = Pattern()
for element in block: for element in block:
if isinstance(element, LWPolyline | Polyline): eltype = element.dxftype()
if isinstance(element, LWPolyline): if eltype in ('POLYLINE', 'LWPOLYLINE'):
points = numpy.asarray(element.get_points()) if eltype == 'LWPOLYLINE':
elif isinstance(element, Polyline): points = numpy.array(tuple(element.lwpoints))
points = numpy.asarray(element.points())[:, :2] else:
points = numpy.array(tuple(element.points()))
attr = element.dxfattribs() attr = element.dxfattribs()
layer = attr.get('layer', DEFAULT_LAYER) layer = attr.get('layer', DEFAULT_LAYER)
if points.shape[1] == 2: if points.shape[1] == 2:
raise PatternError('Invalid or unimplemented polygon?') raise PatternError('Invalid or unimplemented polygon?')
#shape = Polygon(layer=layer)
if points.shape[1] > 2: elif points.shape[1] > 2:
if (points[0, 2] != points[:, 2]).any(): if (points[0, 2] != points[:, 2]).any():
raise PatternError('PolyLine has non-constant width (not yet representable in masque!)') raise PatternError('PolyLine has non-constant width (not yet representable in masque!)')
if points.shape[1] == 4 and (points[:, 3] != 0).any(): elif points.shape[1] == 4 and (points[:, 3] != 0).any():
raise PatternError('LWPolyLine has bulge (not yet representable in masque!)') raise PatternError('LWPolyLine has bulge (not yet representable in masque!)')
width = points[0, 2] width = points[0, 2]
if width == 0: if width == 0:
width = attr.get('const_width', 0) width = attr.get('const_width', 0)
shape: Path | Polygon shape: Union[Path, Polygon]
if width == 0 and len(points) > 2 and numpy.array_equal(points[0], points[-1]): if width == 0 and len(points) > 2 and numpy.array_equal(points[0], points[-1]):
shape = Polygon(vertices=points[:-1, :2]) shape = Polygon(layer=layer, vertices=points[:-1, :2])
else: else:
shape = Path(width=width, vertices=points[:, :2]) shape = Path(layer=layer, width=width, vertices=points[:, :2])
pat.shapes[layer].append(shape) if clean_vertices:
try:
shape.clean_vertices()
except PatternError:
continue
elif isinstance(element, Text): pat.shapes.append(shape)
args = dict(
offset=numpy.asarray(element.get_placement()[1])[:2], elif eltype in ('TEXT',):
layer=element.dxfattribs().get('layer', DEFAULT_LAYER), args = {'offset': numpy.array(element.get_pos()[1])[:2],
) 'layer': element.dxfattribs().get('layer', DEFAULT_LAYER),
}
string = element.dxfattribs().get('text', '') string = element.dxfattribs().get('text', '')
# height = element.dxfattribs().get('height', 0) # height = element.dxfattribs().get('height', 0)
# if height != 0: # if height != 0:
# logger.warning('Interpreting DXF TEXT as a label despite nonzero height. ' # logger.warning('Interpreting DXF TEXT as a label despite nonzero height. '
# 'This could be changed in the future by setting a font path in the masque DXF code.') # 'This could be changed in the future by setting a font path in the masque DXF code.')
pat.label(string=string, **args) pat.labels.append(Label(string=string, **args))
# else: # else:
# pat.shapes[args['layer']].append(Text(string=string, height=height, font_path=????)) # pat.shapes.append(Text(string=string, height=height, font_path=????))
elif isinstance(element, Insert): elif eltype in ('INSERT',):
attr = element.dxfattribs() attr = element.dxfattribs()
xscale = attr.get('xscale', 1) xscale = attr.get('xscale', 1)
yscale = attr.get('yscale', 1) yscale = attr.get('yscale', 1)
if abs(xscale) != abs(yscale): if abs(xscale) != abs(yscale):
logger.warning('Masque does not support per-axis scaling; using x-scaling only!') logger.warning('Masque does not support per-axis scaling; using x-scaling only!')
scale = abs(xscale) scale = abs(xscale)
mirrored, extra_angle = normalize_mirror((yscale < 0, xscale < 0)) mirrored = (yscale < 0, xscale < 0)
rotation = numpy.deg2rad(attr.get('rotation', 0)) + extra_angle rotation = numpy.deg2rad(attr.get('rotation', 0))
offset = numpy.asarray(attr.get('insert', (0, 0, 0)))[:2] offset = numpy.array(attr.get('insert', (0, 0, 0)))[:2]
args = dict( args = {
target=attr.get('name', None), 'offset': offset,
offset=offset, 'scale': scale,
scale=scale, 'mirrored': mirrored,
mirrored=mirrored, 'rotation': rotation,
rotation=rotation, 'pattern': None,
) 'identifier': (attr.get('name', None),),
}
if 'column_count' in attr: if 'column_count' in attr:
args['repetition'] = Grid( args['repetition'] = Grid(a_vector=(attr['column_spacing'], 0),
a_vector=(attr['column_spacing'], 0),
b_vector=(0, attr['row_spacing']), b_vector=(0, attr['row_spacing']),
a_count=attr['column_count'], a_count=attr['column_count'],
b_count=attr['row_count'], b_count=attr['row_count'])
) pat.subpatterns.append(SubPattern(**args))
pat.ref(**args)
else: else:
logger.warning(f'Ignoring DXF element {element.dxftype()} (not implemented).') logger.warning(f'Ignoring DXF element {element.dxftype()} (not implemented).')
return name, pat return pat
def _mrefs_to_drefs( def _subpatterns_to_refs(
block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace, block: Union[ezdxf.layouts.BlockLayout, ezdxf.layouts.Modelspace],
refs: dict[str | None, list[Ref]], subpatterns: List[SubPattern],
) -> None: ) -> None:
def mk_blockref(encoded_name: str, ref: Ref) -> None: for subpat in subpatterns:
rotation = numpy.rad2deg(ref.rotation) % 360 if subpat.pattern is None:
attribs = dict( continue
xscale=ref.scale, encoded_name = subpat.pattern.name
yscale=ref.scale * (-1 if ref.mirrored else 1),
rotation=rotation,
)
rep = ref.repetition rotation = (subpat.rotation * 180 / numpy.pi) % 360
attribs = {
'xscale': subpat.scale * (-1 if subpat.mirrored[1] else 1),
'yscale': subpat.scale * (-1 if subpat.mirrored[0] else 1),
'rotation': rotation,
}
rep = subpat.repetition
if rep is None: if rep is None:
block.add_blockref(encoded_name, ref.offset, dxfattribs=attribs) block.add_blockref(encoded_name, subpat.offset, dxfattribs=attribs)
elif isinstance(rep, Grid): elif isinstance(rep, Grid):
a = rep.a_vector a = rep.a_vector
b = rep.b_vector if rep.b_vector is not None else numpy.zeros(2) b = rep.b_vector if rep.b_vector is not None else numpy.zeros(2)
rotated_a = rotation_matrix_2d(-ref.rotation) @ a rotated_a = rotation_matrix_2d(-subpat.rotation) @ a
rotated_b = rotation_matrix_2d(-ref.rotation) @ b rotated_b = rotation_matrix_2d(-subpat.rotation) @ b
if rotated_a[1] == 0 and rotated_b[0] == 0: if rotated_a[1] == 0 and rotated_b[0] == 0:
attribs['column_count'] = rep.a_count attribs['column_count'] = rep.a_count
attribs['row_count'] = rep.b_count attribs['row_count'] = rep.b_count
attribs['column_spacing'] = rotated_a[0] attribs['column_spacing'] = rotated_a[0]
attribs['row_spacing'] = rotated_b[1] attribs['row_spacing'] = rotated_b[1]
block.add_blockref(encoded_name, ref.offset, dxfattribs=attribs) block.add_blockref(encoded_name, subpat.offset, dxfattribs=attribs)
elif rotated_a[0] == 0 and rotated_b[1] == 0: elif rotated_a[0] == 0 and rotated_b[1] == 0:
attribs['column_count'] = rep.b_count attribs['column_count'] = rep.b_count
attribs['row_count'] = rep.a_count attribs['row_count'] = rep.a_count
attribs['column_spacing'] = rotated_b[0] attribs['column_spacing'] = rotated_b[0]
attribs['row_spacing'] = rotated_a[1] attribs['row_spacing'] = rotated_a[1]
block.add_blockref(encoded_name, ref.offset, dxfattribs=attribs) block.add_blockref(encoded_name, subpat.offset, dxfattribs=attribs)
else: else:
#NOTE: We could still do non-manhattan (but still orthogonal) grids by getting #NOTE: We could still do non-manhattan (but still orthogonal) grids by getting
# creative with counter-rotated nested patterns, but probably not worth it. # creative with counter-rotated nested patterns, but probably not worth it.
# Instead, just break apart the grid into individual elements: # Instead, just break apart the grid into individual elements:
for dd in rep.displacements: for dd in rep.displacements:
block.add_blockref(encoded_name, ref.offset + dd, dxfattribs=attribs) block.add_blockref(encoded_name, subpat.offset + dd, dxfattribs=attribs)
else: else:
for dd in rep.displacements: for dd in rep.displacements:
block.add_blockref(encoded_name, ref.offset + dd, dxfattribs=attribs) block.add_blockref(encoded_name, subpat.offset + dd, dxfattribs=attribs)
for target, rseq in refs.items():
if target is None:
continue
for ref in rseq:
mk_blockref(target, ref)
def _shapes_to_elements( def _shapes_to_elements(
block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace, block: Union[ezdxf.layouts.BlockLayout, ezdxf.layouts.Modelspace],
shapes: dict[layer_t, list[Shape]], shapes: List[Shape],
polygonize_paths: bool = False,
) -> None: ) -> None:
# Add `LWPolyline`s for each shape. # Add `LWPolyline`s for each shape.
# Could instead do paths with a width setting, but need to consider endcaps. # Could instead do paths with a width setting, but need to consider endcaps.
# TODO: can DXF do paths? for shape in shapes:
for layer, sseq in shapes.items(): attribs = {'layer': _mlayer2dxf(shape.layer)}
attribs = dict(layer=_mlayer2dxf(layer))
for shape in sseq:
if shape.repetition is not None:
raise PatternError(
'Shape repetitions are not supported by DXF.'
' Please call library.wrap_repeated_shapes() before writing to file.'
)
for polygon in shape.to_polygons(): for polygon in shape.to_polygons():
xy_open = polygon.vertices + polygon.offset xy_open = polygon.vertices + polygon.offset
xy_closed = numpy.vstack((xy_open, xy_open[0, :])) xy_closed = numpy.vstack((xy_open, xy_open[0, :]))
@ -357,17 +340,13 @@ def _shapes_to_elements(
def _labels_to_texts( def _labels_to_texts(
block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace, block: Union[ezdxf.layouts.BlockLayout, ezdxf.layouts.Modelspace],
labels: dict[layer_t, list[Label]], labels: List[Label],
) -> None: ) -> None:
for layer, lseq in labels.items(): for label in labels:
attribs = dict(layer=_mlayer2dxf(layer)) attribs = {'layer': _mlayer2dxf(label.layer)}
for label in lseq:
xy = label.offset xy = label.offset
block.add_text( block.add_text(label.string, dxfattribs=attribs).set_pos(xy, align='BOTTOM_LEFT')
label.string,
dxfattribs=attribs
).set_placement(xy, align=TextEntityAlignment.BOTTOM_LEFT)
def _mlayer2dxf(layer: layer_t) -> str: def _mlayer2dxf(layer: layer_t) -> str:
@ -378,3 +357,40 @@ def _mlayer2dxf(layer: layer_t) -> str:
if isinstance(layer, tuple): if isinstance(layer, tuple):
return f'{layer[0]}.{layer[1]}' return f'{layer[0]}.{layer[1]}'
raise PatternError(f'Unknown layer type: {layer} ({type(layer)})') raise PatternError(f'Unknown layer type: {layer} ({type(layer)})')
def disambiguate_pattern_names(
patterns: Iterable[Pattern],
max_name_length: int = 32,
suffix_length: int = 6,
dup_warn_filter: Callable[[str], bool] = None, # If returns False, don't warn about this name
) -> None:
used_names = []
for pat in patterns:
sanitized_name = re.compile(r'[^A-Za-z0-9_\?\$]').sub('_', pat.name)
i = 0
suffixed_name = sanitized_name
while suffixed_name in used_names or suffixed_name == '':
suffix = base64.b64encode(struct.pack('>Q', i), b'$?').decode('ASCII')
suffixed_name = sanitized_name + '$' + suffix[:-1].lstrip('A')
i += 1
if sanitized_name == '':
logger.warning(f'Empty pattern name saved as "{suffixed_name}"')
elif suffixed_name != sanitized_name:
if dup_warn_filter is None or dup_warn_filter(pat.name):
logger.warning(f'Pattern name "{pat.name}" ({sanitized_name}) appears multiple times;\n'
+ f' renaming to "{suffixed_name}"')
if len(suffixed_name) == 0:
# Should never happen since zero-length names are replaced
raise PatternError(f'Zero-length name after sanitize,\n originally "{pat.name}"')
if len(suffixed_name) > max_name_length:
raise PatternError(f'Pattern name "{suffixed_name!r}" length > {max_name_length} after encode,\n'
+ f' originally "{pat.name}"')
pat.name = suffixed_name
used_names.append(suffixed_name)

View File

@ -16,31 +16,31 @@ Notes:
* PLEX is not supported * PLEX is not supported
* ELFLAGS are not supported * ELFLAGS are not supported
* GDS does not support library- or structure-level annotations * GDS does not support library- or structure-level annotations
* GDS creation/modification/access times are set to 1900-01-01 for reproducibility. * Creation/modification/access times are set to 1900-01-01 for reproducibility.
* Gzip modification time is set to 0 (start of current epoch, usually 1970-01-01)
""" """
from typing import IO, cast, Any from typing import List, Any, Dict, Tuple, Callable, Union, Iterable, Optional
from collections.abc import Iterable, Mapping, Callable from typing import Sequence, BinaryIO
import re
import io import io
import mmap import mmap
import copy
import base64
import struct
import logging import logging
import pathlib import pathlib
import gzip import gzip
import string
from pprint import pformat
import numpy import numpy
from numpy.typing import ArrayLike, NDArray from numpy.typing import NDArray, ArrayLike
import klamath import klamath
from klamath import records from klamath import records
from .utils import is_gzipped, tmpfile from .utils import is_gzipped
from .. import Pattern, Ref, PatternError, LibraryError, Label, Shape from .. import Pattern, SubPattern, PatternError, Label, Shape
from ..shapes import Polygon, Path from ..shapes import Polygon, Path
from ..repetition import Grid from ..repetition import Grid
from ..utils import layer_t, annotations_t from ..utils import layer_t, normalize_mirror, annotations_t
from ..library import LazyLibrary, Library, ILibrary, ILibraryView from ..library import Library
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -53,21 +53,20 @@ path_cap_map = {
} }
def rint_cast(val: ArrayLike) -> NDArray[numpy.int32]:
return numpy.rint(val).astype(numpy.int32)
def write( def write(
library: Mapping[str, Pattern], patterns: Union[Pattern, Sequence[Pattern]],
stream: IO[bytes], stream: BinaryIO,
meters_per_unit: float, meters_per_unit: float,
logical_units_per_unit: float = 1, logical_units_per_unit: float = 1,
library_name: str = 'masque-klamath', library_name: str = 'masque-klamath',
*,
modify_originals: bool = False,
disambiguate_func: Callable[[Iterable[Pattern]], None] = None,
) -> None: ) -> None:
""" """
Convert a library to a GDSII stream, mapping data as follows: Convert a `Pattern` or list of patterns to a GDSII stream, and then mapping data as follows:
Pattern -> GDSII structure Pattern -> GDSII structure
Ref -> GDSII SREF or AREF SubPattern -> GDSII SREF or AREF
Path -> GSDII path Path -> GSDII path
Shape (other than path) -> GDSII boundary/ies Shape (other than path) -> GDSII boundary/ies
Label -> GDSII text Label -> GDSII text
@ -79,17 +78,14 @@ def write(
datatype is chosen to be `shape.layer[1]` if available, datatype is chosen to be `shape.layer[1]` if available,
otherwise `0` otherwise `0`
GDS does not support shape repetition (only cell repetition). Please call It is often a good idea to run `pattern.subpatternize()` prior to calling this function,
`library.wrap_repeated_shapes()` before writing to file. especially if calling `.polygonize()` will result in very many vertices.
Other functions you may want to call: If you want pattern polygonized with non-default arguments, just call `pattern.polygonize()`
- `masque.file.gdsii.check_valid_names(library.keys())` to check for invalid names prior to calling this function.
- `library.dangling_refs()` to check for references to missing patterns
- `pattern.polygonize()` for any patterns with shapes other
than `masque.shapes.Polygon` or `masque.shapes.Path`
Args: Args:
library: A {name: Pattern} mapping of patterns to write. patterns: A Pattern or list of patterns to convert.
meters_per_unit: Written into the GDSII file, meters per (database) length unit. meters_per_unit: Written into the GDSII file, meters per (database) length unit.
All distances are assumed to be an integer multiple of this unit, and are stored as such. All distances are assumed to be an integer multiple of this unit, and are stored as such.
logical_units_per_unit: Written into the GDSII file. Allows the GDSII to specify a logical_units_per_unit: Written into the GDSII file. Allows the GDSII to specify a
@ -97,35 +93,54 @@ def write(
Default `1`. Default `1`.
library_name: Library name written into the GDSII file. library_name: Library name written into the GDSII file.
Default 'masque-klamath'. Default 'masque-klamath'.
modify_originals: If `True`, the original pattern is modified as part of the writing
process. Otherwise, a copy is made and `deepunlock()`-ed.
Default `False`.
disambiguate_func: Function which takes a list of patterns and alters them
to make their names valid and unique. Default is `disambiguate_pattern_names`, which
attempts to adhere to the GDSII standard as well as possible.
WARNING: No additional error checking is performed on the results.
""" """
if not isinstance(library, ILibrary): if isinstance(patterns, Pattern):
if isinstance(library, dict): patterns = [patterns]
library = Library(library)
else: if disambiguate_func is None:
library = Library(dict(library)) disambiguate_func = disambiguate_pattern_names # type: ignore
assert(disambiguate_func is not None) # placate mypy
if not modify_originals:
patterns = [p.deepunlock() for p in copy.deepcopy(patterns)]
patterns = [p.wrap_repeated_shapes() for p in patterns]
# Create library # Create library
header = klamath.library.FileHeader( header = klamath.library.FileHeader(name=library_name.encode('ASCII'),
name=library_name.encode('ASCII'),
user_units_per_db_unit=logical_units_per_unit, user_units_per_db_unit=logical_units_per_unit,
meters_per_db_unit=meters_per_unit, meters_per_db_unit=meters_per_unit)
)
header.write(stream) header.write(stream)
# Get a dict of id(pattern) -> pattern
patterns_by_id = {id(pattern): pattern for pattern in patterns}
for pattern in patterns:
for i, p in pattern.referenced_patterns_by_id().items():
patterns_by_id[i] = p
disambiguate_func(patterns_by_id.values())
# Now create a structure for each pattern, and add in any Boundary and SREF elements # Now create a structure for each pattern, and add in any Boundary and SREF elements
for name, pat in library.items(): for pat in patterns_by_id.values():
elements: list[klamath.elements.Element] = [] elements: List[klamath.elements.Element] = []
elements += _shapes_to_elements(pat.shapes) elements += _shapes_to_elements(pat.shapes)
elements += _labels_to_texts(pat.labels) elements += _labels_to_texts(pat.labels)
elements += _mrefs_to_grefs(pat.refs) elements += _subpatterns_to_refs(pat.subpatterns)
klamath.library.write_struct(stream, name=name.encode('ASCII'), elements=elements) klamath.library.write_struct(stream, name=pat.name.encode('ASCII'), elements=elements)
records.ENDLIB.write(stream, None) records.ENDLIB.write(stream, None)
def writefile( def writefile(
library: Mapping[str, Pattern], patterns: Union[Sequence[Pattern], Pattern],
filename: str | pathlib.Path, filename: Union[str, pathlib.Path],
*args, *args,
**kwargs, **kwargs,
) -> None: ) -> None:
@ -135,33 +150,26 @@ def writefile(
Will automatically compress the file if it has a .gz suffix. Will automatically compress the file if it has a .gz suffix.
Args: Args:
library: {name: Pattern} pairs to save. patterns: `Pattern` or list of patterns to save
filename: Filename to save to. filename: Filename to save to.
*args: passed to `write()` *args: passed to `write()`
**kwargs: passed to `write()` **kwargs: passed to `write()`
""" """
path = pathlib.Path(filename) path = pathlib.Path(filename)
with tmpfile(path) as base_stream:
streams: tuple[Any, ...] = (base_stream,)
if path.suffix == '.gz': if path.suffix == '.gz':
stream = cast(IO[bytes], gzip.GzipFile(filename='', mtime=0, fileobj=base_stream, mode='wb', compresslevel=6)) open_func: Callable = gzip.open
streams = (stream,) + streams
else: else:
stream = base_stream open_func = open
try: with io.BufferedWriter(open_func(path, mode='wb')) as stream:
write(library, stream, *args, **kwargs) write(patterns, stream, *args, **kwargs)
finally:
for ss in streams:
ss.close()
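A one-line follow-up on the gzip handling above: `writefile()` chooses compression purely from the filename suffix, so (assuming the same `library` dict as in the earlier sketch) both calls below produce the same GDSII content.

```python
# Sketch: the '.gz' suffix alone triggers gzip compression in writefile().
from masque.file import gdsii

gdsii.writefile(library, 'layout.gds', meters_per_unit=1e-9)     # plain GDSII
gdsii.writefile(library, 'layout.gds.gz', meters_per_unit=1e-9)  # gzip-compressed
```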
def readfile( def readfile(
filename: str | pathlib.Path, filename: Union[str, pathlib.Path],
*args, *args,
**kwargs, **kwargs,
) -> tuple[Library, dict[str, Any]]: ) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
""" """
Wrapper for `read()` that takes a filename or path instead of a stream. Wrapper for `read()` that takes a filename or path instead of a stream.
@ -178,20 +186,19 @@ def readfile(
else: else:
open_func = open open_func = open
with open_func(path, mode='rb') as stream: with io.BufferedReader(open_func(path, mode='rb')) as stream:
results = read(stream, *args, **kwargs) results = read(stream, *args, **kwargs)
return results return results
def read( def read(
stream: IO[bytes], stream: BinaryIO,
raw_mode: bool = True, raw_mode: bool = True,
) -> tuple[Library, dict[str, Any]]: ) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
""" """
# TODO check GDSII file for cycles!
Read a gdsii file and translate it into a dict of Pattern objects. GDSII structures are Read a gdsii file and translate it into a dict of Pattern objects. GDSII structures are
translated into Pattern objects; boundaries are translated into polygons, and srefs and arefs translated into Pattern objects; boundaries are translated into polygons, and srefs and arefs
are translated into Ref objects. are translated into SubPattern objects.
Additional library info is returned in a dict, containing: Additional library info is returned in a dict, containing:
'name': name of the library 'name': name of the library
@ -204,23 +211,31 @@ def read(
raw_mode: If True, constructs shapes in raw mode, bypassing most data validation, Default True. raw_mode: If True, constructs shapes in raw mode, bypassing most data validation, Default True.
Returns: Returns:
- dict of pattern_name:Patterns generated from GDSII structures - Dict of pattern_name:Patterns generated from GDSII structures
- dict of GDSII library info - Dict of GDSII library info
""" """
library_info = _read_header(stream) library_info = _read_header(stream)
mlib = Library() patterns = []
found_struct = records.BGNSTR.skip_past(stream) found_struct = records.BGNSTR.skip_past(stream)
while found_struct: while found_struct:
name = records.STRNAME.skip_and_read(stream) name = records.STRNAME.skip_and_read(stream)
pat = read_elements(stream, raw_mode=raw_mode) pat = read_elements(stream, name=name.decode('ASCII'), raw_mode=raw_mode)
mlib[name.decode('ASCII')] = pat patterns.append(pat)
found_struct = records.BGNSTR.skip_past(stream) found_struct = records.BGNSTR.skip_past(stream)
return mlib, library_info # Create a dict of {pattern.name: pattern, ...}, then fix up all subpattern.pattern entries
# according to the subpattern.identifier (which is deleted after use).
patterns_dict = dict(((p.name, p) for p in patterns))
for p in patterns_dict.values():
for sp in p.subpatterns:
sp.pattern = patterns_dict[sp.identifier[0]]
del sp.identifier
return patterns_dict, library_info
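For the read path, a short sketch of round-tripping the file back in. Both versions shown above return a name-to-`Pattern` mapping (a `Library` on one side, a plain dict on the other) plus a library-info dict, so iteration looks the same either way; the filename is a placeholder.

```python
# Sketch: reading a GDSII file; `patterns` maps cell name -> Pattern in both versions.
from masque.file import gdsii

patterns, library_info = gdsii.readfile('layout.gds')
print(library_info['name'])          # library name, per the docstring above
for name, pat in patterns.items():
    print(name)
```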
def _read_header(stream: IO[bytes]) -> dict[str, Any]: def _read_header(stream: BinaryIO) -> Dict[str, Any]:
""" """
Read the file header and create the library_info dict. Read the file header and create the library_info dict.
""" """
@ -234,7 +249,8 @@ def _read_header(stream: IO[bytes]) -> dict[str, Any]:
def read_elements( def read_elements(
stream: IO[bytes], stream: BinaryIO,
name: str,
raw_mode: bool = True, raw_mode: bool = True,
) -> Pattern: ) -> Pattern:
""" """
@ -249,30 +265,28 @@ def read_elements(
Returns: Returns:
A pattern containing the elements that were read. A pattern containing the elements that were read.
""" """
pat = Pattern() pat = Pattern(name)
elements = klamath.library.read_elements(stream) elements = klamath.library.read_elements(stream)
for element in elements: for element in elements:
if isinstance(element, klamath.elements.Boundary): if isinstance(element, klamath.elements.Boundary):
layer, poly = _boundary_to_polygon(element, raw_mode) poly = _boundary_to_polygon(element, raw_mode)
pat.shapes[layer].append(poly) pat.shapes.append(poly)
elif isinstance(element, klamath.elements.Path): elif isinstance(element, klamath.elements.Path):
layer, path = _gpath_to_mpath(element, raw_mode) path = _gpath_to_mpath(element, raw_mode)
pat.shapes[layer].append(path) pat.shapes.append(path)
elif isinstance(element, klamath.elements.Text): elif isinstance(element, klamath.elements.Text):
pat.label( label = Label(offset=element.xy.astype(float),
layer=element.layer, layer=element.layer,
offset=element.xy.astype(float),
string=element.string.decode('ASCII'), string=element.string.decode('ASCII'),
annotations=_properties_to_annotations(element.properties), annotations=_properties_to_annotations(element.properties))
) pat.labels.append(label)
elif isinstance(element, klamath.elements.Reference): elif isinstance(element, klamath.elements.Reference):
target, ref = _gref_to_mref(element) pat.subpatterns.append(_ref_to_subpat(element))
pat.refs[target].append(ref)
return pat return pat
def _mlayer2gds(mlayer: layer_t) -> tuple[int, int]: def _mlayer2gds(mlayer: layer_t) -> Tuple[int, int]:
""" Helper to turn a layer tuple-or-int into a layer and datatype""" """ Helper to turn a layer tuple-or-int into a layer and datatype"""
if isinstance(mlayer, int): if isinstance(mlayer, int):
layer = mlayer layer = mlayer
@ -288,9 +302,10 @@ def _mlayer2gds(mlayer: layer_t) -> tuple[int, int]:
return layer, data_type return layer, data_type
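The `_mlayer2gds` hunk is truncated here, so for reference this is an illustrative re-implementation of the contract its docstring describes (a masque layer is either a bare int or a `(layer, datatype)` tuple); the function name and the datatype-defaults-to-0 behaviour are assumptions, not copied from the source.

```python
# Illustrative stand-in for the layer-normalization helper described above (assumed behaviour).
def mlayer_to_gds(mlayer):
    if isinstance(mlayer, int):
        return mlayer, 0            # assume bare ints get datatype 0
    layer = mlayer[0]
    datatype = mlayer[1] if len(mlayer) > 1 else 0
    return layer, datatype

assert mlayer_to_gds(5) == (5, 0)
assert mlayer_to_gds((5, 2)) == (5, 2)
```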
def _gref_to_mref(ref: klamath.library.Reference) -> tuple[str, Ref]: def _ref_to_subpat(ref: klamath.library.Reference) -> SubPattern:
""" """
Helper function to create a Ref from an SREF or AREF. Sets ref.target to struct_name. Helper function to create a SubPattern from an SREF or AREF. Sets subpat.pattern to None
and sets the instance .identifier to (struct_name,).
""" """
xy = ref.xy.astype(float) xy = ref.xy.astype(float)
offset = xy[0] offset = xy[0]
@ -302,26 +317,25 @@ def _gref_to_mref(ref: klamath.library.Reference) -> tuple[str, Ref]:
repetition = Grid(a_vector=a_vector, b_vector=b_vector, repetition = Grid(a_vector=a_vector, b_vector=b_vector,
a_count=a_count, b_count=b_count) a_count=a_count, b_count=b_count)
target = ref.struct_name.decode('ASCII') subpat = SubPattern(pattern=None,
mref = Ref(
offset=offset, offset=offset,
rotation=numpy.deg2rad(ref.angle_deg), rotation=numpy.deg2rad(ref.angle_deg),
scale=ref.mag, scale=ref.mag,
mirrored=ref.invert_y, mirrored=(ref.invert_y, False),
annotations=_properties_to_annotations(ref.properties), annotations=_properties_to_annotations(ref.properties),
repetition=repetition, repetition=repetition)
) subpat.identifier = (ref.struct_name.decode('ASCII'),)
return target, mref return subpat
def _gpath_to_mpath(gpath: klamath.library.Path, raw_mode: bool) -> tuple[layer_t, Path]: def _gpath_to_mpath(gpath: klamath.library.Path, raw_mode: bool) -> Path:
if gpath.path_type in path_cap_map: if gpath.path_type in path_cap_map:
cap = path_cap_map[gpath.path_type] cap = path_cap_map[gpath.path_type]
else: else:
raise PatternError(f'Unrecognized path type: {gpath.path_type}') raise PatternError(f'Unrecognized path type: {gpath.path_type}')
mpath = Path( mpath = Path(vertices=gpath.xy.astype(float),
vertices=gpath.xy.astype(float), layer=gpath.layer,
width=gpath.width, width=gpath.width,
cap=cap, cap=cap,
offset=numpy.zeros(2), offset=numpy.zeros(2),
@ -330,87 +344,81 @@ def _gpath_to_mpath(gpath: klamath.library.Path, raw_mode: bool) -> tuple[layer_
) )
if cap == Path.Cap.SquareCustom: if cap == Path.Cap.SquareCustom:
mpath.cap_extensions = gpath.extension mpath.cap_extensions = gpath.extension
return gpath.layer, mpath return mpath
def _boundary_to_polygon(boundary: klamath.library.Boundary, raw_mode: bool) -> tuple[layer_t, Polygon]: def _boundary_to_polygon(boundary: klamath.library.Boundary, raw_mode: bool) -> Polygon:
return boundary.layer, Polygon( return Polygon(vertices=boundary.xy[:-1].astype(float),
vertices=boundary.xy[:-1].astype(float), layer=boundary.layer,
offset=numpy.zeros(2), offset=numpy.zeros(2),
annotations=_properties_to_annotations(boundary.properties), annotations=_properties_to_annotations(boundary.properties),
raw=raw_mode, raw=raw_mode,
) )
def _mrefs_to_grefs(refs: dict[str | None, list[Ref]]) -> list[klamath.library.Reference]: def _subpatterns_to_refs(subpatterns: List[SubPattern]) -> List[klamath.library.Reference]:
grefs = [] refs = []
for target, rseq in refs.items(): for subpat in subpatterns:
if target is None: if subpat.pattern is None:
continue continue
encoded_name = target.encode('ASCII') encoded_name = subpat.pattern.name.encode('ASCII')
for ref in rseq:
# Note: GDS also mirrors first and rotates second # Note: GDS mirrors first and rotates second
rep = ref.repetition mirror_across_x, extra_angle = normalize_mirror(subpat.mirrored)
angle_deg = numpy.rad2deg(ref.rotation) % 360 rep = subpat.repetition
properties = _annotations_to_properties(ref.annotations, 512) angle_deg = numpy.rad2deg(subpat.rotation + extra_angle) % 360
properties = _annotations_to_properties(subpat.annotations, 512)
if isinstance(rep, Grid): if isinstance(rep, Grid):
b_vector = rep.b_vector if rep.b_vector is not None else numpy.zeros(2) b_vector = rep.b_vector if rep.b_vector is not None else numpy.zeros(2)
b_count = rep.b_count if rep.b_count is not None else 1 b_count = rep.b_count if rep.b_count is not None else 1
xy = numpy.asarray(ref.offset) + numpy.array([ xy: NDArray[numpy.float64] = numpy.array(subpat.offset) + [
[0.0, 0.0], [0, 0],
rep.a_vector * rep.a_count, rep.a_vector * rep.a_count,
b_vector * b_count, b_vector * b_count,
]) ]
aref = klamath.library.Reference( aref = klamath.library.Reference(struct_name=encoded_name,
struct_name=encoded_name, xy=numpy.round(xy).astype(int),
xy=rint_cast(xy), colrow=(numpy.round(rep.a_count), numpy.round(rep.b_count)),
colrow=(numpy.rint(rep.a_count), numpy.rint(rep.b_count)),
angle_deg=angle_deg, angle_deg=angle_deg,
invert_y=ref.mirrored, invert_y=mirror_across_x,
mag=ref.scale, mag=subpat.scale,
properties=properties, properties=properties)
) refs.append(aref)
grefs.append(aref)
elif rep is None: elif rep is None:
sref = klamath.library.Reference( ref = klamath.library.Reference(struct_name=encoded_name,
struct_name=encoded_name, xy=numpy.round([subpat.offset]).astype(int),
xy=rint_cast([ref.offset]),
colrow=None, colrow=None,
angle_deg=angle_deg, angle_deg=angle_deg,
invert_y=ref.mirrored, invert_y=mirror_across_x,
mag=ref.scale, mag=subpat.scale,
properties=properties, properties=properties)
) refs.append(ref)
grefs.append(sref)
else: else:
new_srefs = [ new_srefs = [klamath.library.Reference(struct_name=encoded_name,
klamath.library.Reference( xy=numpy.round([subpat.offset + dd]).astype(int),
struct_name=encoded_name,
xy=rint_cast([ref.offset + dd]),
colrow=None, colrow=None,
angle_deg=angle_deg, angle_deg=angle_deg,
invert_y=ref.mirrored, invert_y=mirror_across_x,
mag=ref.scale, mag=subpat.scale,
properties=properties, properties=properties)
)
for dd in rep.displacements] for dd in rep.displacements]
grefs += new_srefs refs += new_srefs
return grefs return refs
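The AREF branch above encodes a `Grid` repetition as three corner points plus a column/row count. A small numeric sketch of that mapping (values are arbitrary):

```python
# Sketch: GDSII AREF corners for a masque Grid repetition, as computed in _mrefs_to_grefs().
import numpy

offset = numpy.array([100.0, 200.0])
a_vector, a_count = numpy.array([10.0, 0.0]), 5
b_vector, b_count = numpy.array([0.0, 20.0]), 3

xy = offset + numpy.array([[0.0, 0.0], a_vector * a_count, b_vector * b_count])
colrow = (a_count, b_count)
# xy -> (100, 200), (150, 200), (100, 260); colrow -> (5, 3)
```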
def _properties_to_annotations(properties: dict[int, bytes]) -> annotations_t: def _properties_to_annotations(properties: Dict[int, bytes]) -> annotations_t:
return {str(k): [v.decode()] for k, v in properties.items()} return {str(k): [v.decode()] for k, v in properties.items()}
def _annotations_to_properties(annotations: annotations_t, max_len: int = 126) -> dict[int, bytes]: def _annotations_to_properties(annotations: annotations_t, max_len: int = 126) -> Dict[int, bytes]:
cum_len = 0 cum_len = 0
props = {} props = {}
for key, vals in annotations.items(): for key, vals in annotations.items():
try: try:
i = int(key) i = int(key)
except ValueError as err: except ValueError:
raise PatternError(f'Annotation key {key} is not convertible to an integer') from err raise PatternError(f'Annotation key {key} is not convertible to an integer')
if not (0 < i < 126): if not (0 < i < 126):
raise PatternError(f'Annotation key {key} converts to {i} (must be in the range [1,125])') raise PatternError(f'Annotation key {key} converts to {i} (must be in the range [1,125])')
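To make the property/annotation mapping above concrete, a tiny round-trip sketch: reading turns GDSII property numbers into string keys with single-element value lists, and writing requires those keys to convert back to integers in [1, 125]. The property contents are made up.

```python
# Sketch of the GDSII property <-> masque annotation mapping used above (example values).
properties = {1: b'purpose:metal1', 2: b'owner:example'}
annotations = {str(k): [v.decode()] for k, v in properties.items()}   # read direction

for key in annotations:                                               # write-direction checks
    i = int(key)           # non-integer keys raise PatternError
    assert 0 < i < 126     # keys must be in [1, 125]
```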
@ -426,93 +434,138 @@ def _annotations_to_properties(annotations: annotations_t, max_len: int = 126) -
def _shapes_to_elements( def _shapes_to_elements(
shapes: dict[layer_t, list[Shape]], shapes: List[Shape],
polygonize_paths: bool = False, polygonize_paths: bool = False,
) -> list[klamath.elements.Element]: ) -> List[klamath.elements.Element]:
elements: list[klamath.elements.Element] = [] elements: List[klamath.elements.Element] = []
# Add a Boundary element for each shape, and Path elements if necessary # Add a Boundary element for each shape, and Path elements if necessary
for mlayer, sseq in shapes.items(): for shape in shapes:
layer, data_type = _mlayer2gds(mlayer) layer, data_type = _mlayer2gds(shape.layer)
for shape in sseq:
if shape.repetition is not None:
raise PatternError('Shape repetitions are not supported by GDS.'
' Please call library.wrap_repeated_shapes() before writing to file.')
properties = _annotations_to_properties(shape.annotations, 128) properties = _annotations_to_properties(shape.annotations, 128)
if isinstance(shape, Path) and not polygonize_paths: if isinstance(shape, Path) and not polygonize_paths:
xy = rint_cast(shape.vertices + shape.offset) xy = numpy.round(shape.vertices + shape.offset).astype(int)
width = rint_cast(shape.width) width = numpy.round(shape.width).astype(int)
path_type = next(k for k, v in path_cap_map.items() if v == shape.cap) # reverse lookup path_type = next(k for k, v in path_cap_map.items() if v == shape.cap) # reverse lookup
extension: tuple[int, int] extension: Tuple[int, int]
if shape.cap == Path.Cap.SquareCustom and shape.cap_extensions is not None: if shape.cap == Path.Cap.SquareCustom and shape.cap_extensions is not None:
extension = tuple(shape.cap_extensions) # type: ignore extension = tuple(shape.cap_extensions) # type: ignore
else: else:
extension = (0, 0) extension = (0, 0)
path = klamath.elements.Path( path = klamath.elements.Path(layer=(layer, data_type),
layer=(layer, data_type),
xy=xy, xy=xy,
path_type=path_type, path_type=path_type,
width=int(width), width=width,
extension=extension, extension=extension,
properties=properties, properties=properties)
)
elements.append(path) elements.append(path)
elif isinstance(shape, Polygon): elif isinstance(shape, Polygon):
polygon = shape polygon = shape
xy_closed = numpy.empty((polygon.vertices.shape[0] + 1, 2), dtype=numpy.int32) xy_closed = numpy.empty((polygon.vertices.shape[0] + 1, 2), dtype=numpy.int32)
numpy.rint(polygon.vertices + polygon.offset, out=xy_closed[:-1], casting='unsafe') numpy.rint(polygon.vertices + polygon.offset, out=xy_closed[:-1], casting='unsafe')
xy_closed[-1] = xy_closed[0] xy_closed[-1] = xy_closed[0]
boundary = klamath.elements.Boundary( boundary = klamath.elements.Boundary(layer=(layer, data_type),
layer=(layer, data_type),
xy=xy_closed, xy=xy_closed,
properties=properties, properties=properties)
)
elements.append(boundary) elements.append(boundary)
else: else:
for polygon in shape.to_polygons(): for polygon in shape.to_polygons():
xy_closed = numpy.empty((polygon.vertices.shape[0] + 1, 2), dtype=numpy.int32) xy_closed = numpy.empty((polygon.vertices.shape[0] + 1, 2), dtype=numpy.int32)
numpy.rint(polygon.vertices + polygon.offset, out=xy_closed[:-1], casting='unsafe') numpy.rint(polygon.vertices + polygon.offset, out=xy_closed[:-1], casting='unsafe')
xy_closed[-1] = xy_closed[0] xy_closed[-1] = xy_closed[0]
boundary = klamath.elements.Boundary( boundary = klamath.elements.Boundary(layer=(layer, data_type),
layer=(layer, data_type),
xy=xy_closed, xy=xy_closed,
properties=properties, properties=properties)
)
elements.append(boundary) elements.append(boundary)
return elements return elements
def _labels_to_texts(labels: dict[layer_t, list[Label]]) -> list[klamath.elements.Text]: def _labels_to_texts(labels: List[Label]) -> List[klamath.elements.Text]:
texts = [] texts = []
for mlayer, lseq in labels.items(): for label in labels:
layer, text_type = _mlayer2gds(mlayer)
for label in lseq:
properties = _annotations_to_properties(label.annotations, 128) properties = _annotations_to_properties(label.annotations, 128)
xy = rint_cast([label.offset]) layer, text_type = _mlayer2gds(label.layer)
text = klamath.elements.Text( xy = numpy.round([label.offset]).astype(int)
layer=(layer, text_type), text = klamath.elements.Text(layer=(layer, text_type),
xy=xy, xy=xy,
string=label.string.encode('ASCII'), string=label.string.encode('ASCII'),
properties=properties, properties=properties,
presentation=0, # font number & alignment -- unused by us presentation=0, # TODO maybe set some of these?
angle_deg=0, # rotation -- unused by us angle_deg=0,
invert_y=False, # inversion -- unused by us invert_y=False,
width=0, # stroke width -- unused by us width=0,
path_type=0, # text path endcaps, unused path_type=0,
mag=1, # size -- unused by us mag=1)
)
texts.append(text) texts.append(text)
return texts return texts
def disambiguate_pattern_names(
patterns: Sequence[Pattern],
max_name_length: int = 32,
suffix_length: int = 6,
dup_warn_filter: Optional[Callable[[str], bool]] = None,
) -> None:
"""
Args:
patterns: List of patterns to disambiguate
max_name_length: Names longer than this will be truncated
suffix_length: Names which get truncated are truncated by this many extra characters. This is to
leave room for a suffix if one is necessary.
dup_warn_filter: (optional) Function for suppressing warnings about cell names changing. Receives
the cell name and returns `False` if the warning should be suppressed and `True` if it should
be displayed. Default displays all warnings.
"""
used_names = []
for pat in set(patterns):
# Shorten names which already exceed max-length
if len(pat.name) > max_name_length:
shortened_name = pat.name[:max_name_length - suffix_length]
logger.warning(f'Pattern name "{pat.name}" is too long ({len(pat.name)}/{max_name_length} chars),\n'
+ f' shortening to "{shortened_name}" before generating suffix')
else:
shortened_name = pat.name
# Remove invalid characters
sanitized_name = re.compile(r'[^A-Za-z0-9_\?\$]').sub('_', shortened_name)
# Add a suffix that makes the name unique
i = 0
suffixed_name = sanitized_name
while suffixed_name in used_names or suffixed_name == '':
suffix = base64.b64encode(struct.pack('>Q', i), b'$?').decode('ASCII')
suffixed_name = sanitized_name + '$' + suffix[:-1].lstrip('A')
i += 1
if sanitized_name == '':
logger.warning(f'Empty pattern name saved as "{suffixed_name}"')
elif suffixed_name != sanitized_name:
if dup_warn_filter is None or dup_warn_filter(pat.name):
logger.warning(f'Pattern name "{pat.name}" ({sanitized_name}) appears multiple times;\n'
+ f' renaming to "{suffixed_name}"')
# Encode into a byte-string and perform some final checks
encoded_name = suffixed_name.encode('ASCII')
if len(encoded_name) == 0:
# Should never happen since zero-length names are replaced
raise PatternError(f'Zero-length name after sanitize+encode,\n originally "{pat.name}"')
if len(encoded_name) > max_name_length:
raise PatternError(f'Pattern name "{encoded_name!r}" length > {max_name_length} after encode,\n'
+ f' originally "{pat.name}"')
pat.name = suffixed_name
used_names.append(suffixed_name)
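The suffix generation inside `disambiguate_pattern_names()` is a little opaque, so here it is isolated as a standalone sketch (the helper name `suffix_for` is mine): the counter is packed big-endian, base64-encoded with `$?` as the alternative characters, and then the padding `=` and leading `A`s (zero bytes) are stripped.

```python
# Sketch: the unique-suffix scheme used by disambiguate_pattern_names() above.
import base64
import struct

def suffix_for(i: int) -> str:
    suffix = base64.b64encode(struct.pack('>Q', i), b'$?').decode('ASCII')
    return '$' + suffix[:-1].lstrip('A')

print(suffix_for(0))   # '$'
print(suffix_for(1))   # '$E'
print(suffix_for(2))   # '$I'
```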
def load_library( def load_library(
stream: IO[bytes], stream: BinaryIO,
tag: str,
is_secondary: Optional[Callable[[str], bool]] = None,
*, *,
full_load: bool = False, full_load: bool = False,
postprocess: Callable[[ILibraryView, str, Pattern], Pattern] | None = None ) -> Tuple[Library, Dict[str, Any]]:
) -> tuple[LazyLibrary, dict[str, Any]]:
""" """
Scan a GDSII stream to determine what structures are present, and create Scan a GDSII stream to determine what structures are present, and create
a library from them. This enables deferred reading of structures a library from them. This enables deferred reading of structures
@ -524,27 +577,33 @@ def load_library(
The caller should leave the stream open while the library The caller should leave the stream open while the library
is still in use, since the library will need to access it is still in use, since the library will need to access it
in order to read the structure contents. in order to read the structure contents.
tag: Unique identifier that will be used to identify this data source
is_secondary: Function which takes a structure name and returns
True if the structure should only be used as a subcell
and not appear in the main Library interface.
Default always returns False.
full_load: If True, force all structures to be read immediately rather full_load: If True, force all structures to be read immediately rather
than as-needed. Since data is read sequentially from the file, this than as-needed. Since data is read sequentially from the file,
will be faster than using the resulting library's `precache` method. this will be faster than using the resulting library's
postprocess: If given, this function is used to post-process each `precache` method.
pattern *upon first load only*.
Returns: Returns:
LazyLibrary object, allowing for deferred load of structures. Library object, allowing for deferred load of structures.
Additional library info (dict, same format as from `read`). Additional library info (dict, same format as from `read`).
""" """
if is_secondary is None:
def is_secondary(k: str) -> bool:
return False
assert(is_secondary is not None)
stream.seek(0) stream.seek(0)
lib = LazyLibrary() lib = Library()
if full_load: if full_load:
# Full load approach (immediately load everything) # Full load approach (immediately load everything)
patterns, library_info = read(stream) patterns, library_info = read(stream)
for name, pattern in patterns.items(): for name, pattern in patterns.items():
if postprocess is not None: lib.set_const(name, tag, pattern, secondary=is_secondary(name))
lib[name] = postprocess(lib, name, pattern)
else:
lib[name] = pattern
return lib, library_info return lib, library_info
# Normal approach (scan and defer load) # Normal approach (scan and defer load)
@ -556,23 +615,21 @@ def load_library(
def mkstruct(pos: int = pos, name: str = name) -> Pattern: def mkstruct(pos: int = pos, name: str = name) -> Pattern:
stream.seek(pos) stream.seek(pos)
pat = read_elements(stream, raw_mode=True) return read_elements(stream, name, raw_mode=True)
if postprocess is not None:
pat = postprocess(lib, name, pat)
return pat
lib[name] = mkstruct lib.set_value(name, tag, mkstruct, secondary=is_secondary(name))
return lib, library_info return lib, library_info
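A usage sketch for the deferred-load path (the `LazyLibrary` variant on the left; the right-hand variant additionally takes `tag` and `is_secondary`). The filename and cell name are placeholders, and the mapping-style access is assumed from the `lib[name] = mkstruct` assignments above.

```python
# Sketch: deferred structure loading; keep the stream open while the library is in use.
from masque.file import gdsii

stream = open('big_layout.gds', 'rb')      # placeholder filename
lib, library_info = gdsii.load_library(stream)
print(list(lib))                           # structure names are known after the scan
pat = lib['SOME_CELL']                     # placeholder name; contents parsed on first access
```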
def load_libraryfile( def load_libraryfile(
filename: str | pathlib.Path, filename: Union[str, pathlib.Path],
tag: str,
is_secondary: Optional[Callable[[str], bool]] = None,
*, *,
use_mmap: bool = True, use_mmap: bool = True,
full_load: bool = False, full_load: bool = False,
postprocess: Callable[[ILibraryView, str, Pattern], Pattern] | None = None ) -> Tuple[Library, Dict[str, Any]]:
) -> tuple[LazyLibrary, dict[str, Any]]:
""" """
Wrapper for `load_library()` that takes a filename or path instead of a stream. Wrapper for `load_library()` that takes a filename or path instead of a stream.
@ -583,65 +640,31 @@ def load_libraryfile(
Args: Args:
path: filename or path to read from path: filename or path to read from
tag: Unique identifier for library, see `load_library`
is_secondary: Function specifying subcells, see `load_library`
use_mmap: If `True`, will attempt to memory-map the file instead use_mmap: If `True`, will attempt to memory-map the file instead
of buffering. In the case of gzipped files, the file of buffering. In the case of gzipped files, the file
is decompressed into a python `bytes` object in memory is decompressed into a python `bytes` object in memory
and reopened as an `io.BytesIO` stream. and reopened as an `io.BytesIO` stream.
full_load: If `True`, immediately loads all data. See `load_library`. full_load: If `True`, immediately loads all data. See `load_library`.
postprocess: Passed to `load_library`
Returns: Returns:
LazyLibrary object, allowing for deferred load of structures. Library object, allowing for deferred load of structures.
Additional library info (dict, same format as from `read`). Additional library info (dict, same format as from `read`).
""" """
path = pathlib.Path(filename) path = pathlib.Path(filename)
stream: IO[bytes]
if is_gzipped(path): if is_gzipped(path):
if use_mmap: if mmap:
logger.info('Asked to mmap a gzipped file, reading into memory instead...') logger.info('Asked to mmap a gzipped file, reading into memory instead...')
gz_stream = gzip.open(path, mode='rb') # noqa: SIM115 base_stream = gzip.open(path, mode='rb')
stream = io.BytesIO(gz_stream.read()) # type: ignore stream = io.BytesIO(base_stream.read())
else: else:
gz_stream = gzip.open(path, mode='rb') # noqa: SIM115 base_stream = gzip.open(path, mode='rb')
stream = io.BufferedReader(gz_stream) # type: ignore stream = io.BufferedReader(base_stream)
else: # noqa: PLR5501
if use_mmap:
base_stream = path.open(mode='rb', buffering=0) # noqa: SIM115
stream = mmap.mmap(base_stream.fileno(), 0, access=mmap.ACCESS_READ) # type: ignore
else: else:
stream = path.open(mode='rb') # noqa: SIM115 base_stream = open(path, mode='rb')
return load_library(stream, full_load=full_load, postprocess=postprocess) if mmap:
stream = mmap.mmap(base_stream.fileno(), 0, access=mmap.ACCESS_READ)
else:
def check_valid_names( stream = io.BufferedReader(base_stream)
names: Iterable[str], return load_library(stream, tag, is_secondary)
max_length: int = 32,
) -> None:
"""
Check all provided names to see if they're valid GDSII cell names.
Args:
names: Collection of names to check
max_length: Max allowed length
"""
allowed_chars = set(string.ascii_letters + string.digits + '_?$')
bad_chars = [
name for name in names
if not set(name).issubset(allowed_chars)
]
bad_lengths = [
name for name in names
if len(name) > max_length
]
if bad_chars:
logger.error('Names contain invalid characters:\n' + pformat(bad_chars))
if bad_lengths:
logger.error(f'Names too long (>{max_length}):\n' + pformat(bad_lengths))
if bad_chars or bad_lengths:
raise LibraryError('Library contains invalid names, see log above')
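A quick example of what `check_valid_names()` flags, assuming it is imported from this module as defined above; the sample names are invented, and `LibraryError` is the exception raised at the end of the function.

```python
# Sketch: pre-checking cell names before writing (example names only).
from masque.file.gdsii import check_valid_names

try:
    check_valid_names(['TOP', 'pad_array', 'bad name!', 'x' * 40])
except Exception as err:   # LibraryError, per the code above
    print(err)             # 'bad name!' has invalid characters; 'xxx...' exceeds 32 chars
```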

masque/file/klamath.py Normal file

@ -0,0 +1,2 @@
# For backwards compatibility
from .gdsii import *


@ -10,36 +10,33 @@ Note that OASIS references follow the same convention as `masque`,
Scaling, rotation, and mirroring apply to individual instances, not grid Scaling, rotation, and mirroring apply to individual instances, not grid
vectors or offsets. vectors or offsets.
Notes:
* Gzip modification time is set to 0 (start of current epoch, usually 1970-01-01)
""" """
from typing import Any, IO, cast from typing import List, Any, Dict, Tuple, Callable, Union, Sequence, Iterable, Optional
from collections.abc import Sequence, Iterable, Mapping, Callable import re
import io
import copy
import base64
import struct
import logging import logging
import pathlib import pathlib
import gzip import gzip
import string
from pprint import pformat
import numpy import numpy
from numpy.typing import ArrayLike, NDArray
import fatamorgana import fatamorgana
import fatamorgana.records as fatrec import fatamorgana.records as fatrec
from fatamorgana.basic import PathExtensionScheme, AString, NString, PropStringReference from fatamorgana.basic import PathExtensionScheme, AString, NString, PropStringReference
from .utils import is_gzipped, tmpfile from .utils import clean_pattern_vertices, is_gzipped
from .. import Pattern, Ref, PatternError, LibraryError, Label, Shape from .. import Pattern, SubPattern, PatternError, Label, Shape
from ..library import Library, ILibrary from ..shapes import Polygon, Path, Circle
from ..shapes import Path, Circle
from ..repetition import Grid, Arbitrary, Repetition from ..repetition import Grid, Arbitrary, Repetition
from ..utils import layer_t, annotations_t from ..utils import layer_t, normalize_mirror, annotations_t
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
logger.warning('OASIS support is experimental!') logger.warning('OASIS support is experimental and mostly untested!')
path_cap_map = { path_cap_map = {
@ -48,23 +45,21 @@ path_cap_map = {
PathExtensionScheme.Arbitrary: Path.Cap.SquareCustom, PathExtensionScheme.Arbitrary: Path.Cap.SquareCustom,
} }
#TODO implement more shape types in OASIS? #TODO implement more shape types?
def rint_cast(val: ArrayLike) -> NDArray[numpy.int64]:
return numpy.rint(val).astype(numpy.int64)
def build( def build(
library: Mapping[str, Pattern], # NOTE: Pattern here should be treated as immutable! patterns: Union[Pattern, Sequence[Pattern]],
units_per_micron: int, units_per_micron: int,
layer_map: dict[str, int | tuple[int, int]] | None = None, layer_map: Optional[Dict[str, Union[int, Tuple[int, int]]]] = None,
*, *,
annotations: annotations_t | None = None, modify_originals: bool = False,
disambiguate_func: Optional[Callable[[Iterable[Pattern]], None]] = None,
annotations: Optional[annotations_t] = None,
) -> fatamorgana.OasisLayout: ) -> fatamorgana.OasisLayout:
""" """
Convert a collection of {name: Pattern} pairs to an OASIS stream, writing patterns Convert a `Pattern` or list of patterns to an OASIS stream, writing patterns
as OASIS cells, refs as Placement records, and mapping other shapes and labels as OASIS cells, subpatterns as Placement records, and other shapes and labels
to equivalent record types (Polygon, Path, Circle, Text). mapped to equivalent record types (Polygon, Path, Circle, Text).
Other shape types may be converted to polygons if no equivalent Other shape types may be converted to polygons if no equivalent
record type exists (or is not implemented here yet). record type exists (or is not implemented here yet).
@ -76,17 +71,14 @@ def build(
If a layer map is provided, layer strings will be converted If a layer map is provided, layer strings will be converted
automatically, and layer names will be written to the file. automatically, and layer names will be written to the file.
Other functions you may want to call: If you want the pattern polygonized with non-default arguments, just call `pattern.polygonize()`
- `masque.file.oasis.check_valid_names(library.keys())` to check for invalid names prior to calling this function.
- `library.dangling_refs()` to check for references to missing patterns
- `pattern.polygonize()` for any patterns with shapes other
than `masque.shapes.Polygon`, `masque.shapes.Path`, or `masque.shapes.Circle`
Args: Args:
library: A {name: Pattern} mapping of patterns to write. patterns: A Pattern or list of patterns to convert.
units_per_micron: Written into the OASIS file, number of grid steps per micrometer. units_per_micron: Written into the OASIS file, number of grid steps per micrometer.
All distances are assumed to be an integer multiple of the grid step, and are stored as such. All distances are assumed to be an integer multiple of the grid step, and are stored as such.
layer_map: dictionary which translates layer names into layer numbers. If this argument is layer_map: Dictionary which translates layer names into layer numbers. If this argument is
provided, input shapes and labels are allowed to have layer names instead of numbers. provided, input shapes and labels are allowed to have layer names instead of numbers.
It is assumed that geometry and text share the same layer names, and each name is It is assumed that geometry and text share the same layer names, and each name is
assigned only to a single layer (not a range). assigned only to a single layer (not a range).
@ -94,23 +86,31 @@ def build(
into numbers, omit this argument, and manually generate the required into numbers, omit this argument, and manually generate the required
`fatamorgana.records.LayerName` entries. `fatamorgana.records.LayerName` entries.
Default is an empty dict (no names provided). Default is an empty dict (no names provided).
modify_originals: If `True`, the original pattern is modified as part of the writing
process. Otherwise, a copy is made and `deepunlock()`-ed.
Default `False`.
disambiguate_func: Function which takes a list of patterns and alters them
to make their names valid and unique. Default is `disambiguate_pattern_names`.
annotations: dictionary of key-value pairs which are saved as library-level properties annotations: dictionary of key-value pairs which are saved as library-level properties
Returns: Returns:
`fatamorgana.OasisLayout` `fatamorgana.OasisLayout`
""" """
if not isinstance(library, ILibrary): if isinstance(patterns, Pattern):
if isinstance(library, dict): patterns = [patterns]
library = Library(library)
else:
library = Library(dict(library))
if layer_map is None: if layer_map is None:
layer_map = {} layer_map = {}
if disambiguate_func is None:
disambiguate_func = disambiguate_pattern_names
if annotations is None: if annotations is None:
annotations = {} annotations = {}
if not modify_originals:
patterns = [p.deepunlock() for p in copy.deepcopy(patterns)]
# Create library # Create library
lib = fatamorgana.OasisLayout(unit=units_per_micron, validation=None) lib = fatamorgana.OasisLayout(unit=units_per_micron, validation=None)
lib.properties = annotations_to_properties(annotations) lib.properties = annotations_to_properties(annotations)
@ -119,38 +119,44 @@ def build(
for name, layer_num in layer_map.items(): for name, layer_num in layer_map.items():
layer, data_type = _mlayer2oas(layer_num) layer, data_type = _mlayer2oas(layer_num)
lib.layers += [ lib.layers += [
fatrec.LayerName( fatrec.LayerName(nstring=name,
nstring=name,
layer_interval=(layer, layer), layer_interval=(layer, layer),
type_interval=(data_type, data_type), type_interval=(data_type, data_type),
is_textlayer=tt, is_textlayer=tt)
)
for tt in (True, False)] for tt in (True, False)]
def layer2oas(mlayer: layer_t) -> tuple[int, int]: def layer2oas(mlayer: layer_t) -> Tuple[int, int]:
assert layer_map is not None assert(layer_map is not None)
layer_num = layer_map[mlayer] if isinstance(mlayer, str) else mlayer layer_num = layer_map[mlayer] if isinstance(mlayer, str) else mlayer
return _mlayer2oas(layer_num) return _mlayer2oas(layer_num)
else: else:
layer2oas = _mlayer2oas layer2oas = _mlayer2oas
# Get a dict of id(pattern) -> pattern
patterns_by_id = {id(pattern): pattern for pattern in patterns}
for pattern in patterns:
for i, p in pattern.referenced_patterns_by_id().items():
patterns_by_id[i] = p
disambiguate_func(patterns_by_id.values())
# Now create a structure for each pattern # Now create a structure for each pattern
for name, pat in library.items(): for pat in patterns_by_id.values():
structure = fatamorgana.Cell(name=name) structure = fatamorgana.Cell(name=pat.name)
lib.cells.append(structure) lib.cells.append(structure)
structure.properties += annotations_to_properties(pat.annotations) structure.properties += annotations_to_properties(pat.annotations)
structure.geometry += _shapes_to_elements(pat.shapes, layer2oas) structure.geometry += _shapes_to_elements(pat.shapes, layer2oas)
structure.geometry += _labels_to_texts(pat.labels, layer2oas) structure.geometry += _labels_to_texts(pat.labels, layer2oas)
structure.placements += _refs_to_placements(pat.refs) structure.placements += _subpatterns_to_placements(pat.subpatterns)
return lib return lib
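To tie the OASIS `build()` docstring together, a short sketch of writing with a layer-name map; `library` is assumed to be a `{name: Pattern}` mapping whose shapes and labels use the string layers `'metal1'`/`'via1'`, and the unit and filename are placeholders. (The older signature on the right takes a pattern list instead.)

```python
# Sketch: OASIS output with string layer names translated via layer_map.
from masque.file import oasis

layer_map = {'metal1': (1, 0), 'via1': 2}
layout = oasis.build(library, units_per_micron=1000, layer_map=layer_map)
with open('layout.oas', 'wb') as stream:
    layout.write(stream)   # fatamorgana.OasisLayout.write, as in oasis.write() above
```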
def write( def write(
library: Mapping[str, Pattern], # NOTE: Pattern here should be treated as immutable! patterns: Union[Sequence[Pattern], Pattern],
stream: IO[bytes], stream: io.BufferedIOBase,
*args, *args,
**kwargs, **kwargs,
) -> None: ) -> None:
@ -159,18 +165,18 @@ def write(
for details. for details.
Args: Args:
library: A {name: Pattern} mapping of patterns to write. patterns: A Pattern or list of patterns to write to file.
stream: Stream to write to. stream: Stream to write to.
*args: passed to `oasis.build()` *args: passed to `oasis.build()`
**kwargs: passed to `oasis.build()` **kwargs: passed to `oasis.build()`
""" """
lib = build(library, *args, **kwargs) lib = build(patterns, *args, **kwargs)
lib.write(stream) lib.write(stream)
def writefile( def writefile(
library: Mapping[str, Pattern], # NOTE: Pattern here should be treated as immutable! patterns: Union[Sequence[Pattern], Pattern],
filename: str | pathlib.Path, filename: Union[str, pathlib.Path],
*args, *args,
**kwargs, **kwargs,
) -> None: ) -> None:
@ -180,33 +186,26 @@ def writefile(
Will automatically compress the file if it has a .gz suffix. Will automatically compress the file if it has a .gz suffix.
Args: Args:
library: A {name: Pattern} mapping of patterns to write. patterns: `Pattern` or list of patterns to save
filename: Filename to save to. filename: Filename to save to.
*args: passed to `oasis.write` *args: passed to `oasis.write`
**kwargs: passed to `oasis.write` **kwargs: passed to `oasis.write`
""" """
path = pathlib.Path(filename) path = pathlib.Path(filename)
with tmpfile(path) as base_stream:
streams: tuple[Any, ...] = (base_stream,)
if path.suffix == '.gz': if path.suffix == '.gz':
stream = cast(IO[bytes], gzip.GzipFile(filename='', mtime=0, fileobj=base_stream, mode='wb')) open_func: Callable = gzip.open
streams += (stream,)
else: else:
stream = base_stream open_func = open
try: with io.BufferedWriter(open_func(path, mode='wb')) as stream:
write(library, stream, *args, **kwargs) write(patterns, stream, *args, **kwargs)
finally:
for ss in streams:
ss.close()
def readfile( def readfile(
filename: str | pathlib.Path, filename: Union[str, pathlib.Path],
*args, *args,
**kwargs, **kwargs,
) -> tuple[Library, dict[str, Any]]: ) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
""" """
Wrapper for `oasis.read()` that takes a filename or path instead of a stream. Wrapper for `oasis.read()` that takes a filename or path instead of a stream.
@ -223,18 +222,19 @@ def readfile(
else: else:
open_func = open open_func = open
with open_func(path, mode='rb') as stream: with io.BufferedReader(open_func(path, mode='rb')) as stream:
results = read(stream, *args, **kwargs) results = read(stream, *args, **kwargs)
return results return results
def read( def read(
stream: IO[bytes], stream: io.BufferedIOBase,
) -> tuple[Library, dict[str, Any]]: clean_vertices: bool = True,
) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
""" """
Read an OASIS file and translate it into a dict of Pattern objects. OASIS cells are Read an OASIS file and translate it into a dict of Pattern objects. OASIS cells are
translated into Pattern objects; Polygons are translated into polygons, and Placements translated into Pattern objects; Polygons are translated into polygons, and Placements
are translated into Ref objects. are translated into SubPattern objects.
Additional library info is returned in a dict, containing: Additional library info is returned in a dict, containing:
'units_per_micrometer': number of database units per micrometer (all values are in database units) 'units_per_micrometer': number of database units per micrometer (all values are in database units)
@ -243,15 +243,18 @@ def read(
Args: Args:
stream: Stream to read from. stream: Stream to read from.
clean_vertices: If `True`, remove any redundant vertices when loading polygons.
The cleaning process removes any polygons with zero area or <3 vertices.
Default `True`.
Returns: Returns:
- dict of `pattern_name`:`Pattern`s generated from OASIS cells - Dict of `pattern_name`:`Pattern`s generated from OASIS cells
- dict of OASIS library info - Dict of OASIS library info
""" """
lib = fatamorgana.OasisLayout.read(stream) lib = fatamorgana.OasisLayout.read(stream)
library_info: dict[str, Any] = { library_info: Dict[str, Any] = {
'units_per_micrometer': lib.unit, 'units_per_micrometer': lib.unit,
'annotations': properties_to_annotations(lib.properties, lib.propnames, lib.propstrings), 'annotations': properties_to_annotations(lib.properties, lib.propnames, lib.propstrings),
} }
@ -261,76 +264,72 @@ def read(
layer_map[str(layer_name.nstring)] = layer_name layer_map[str(layer_name.nstring)] = layer_name
library_info['layer_map'] = layer_map library_info['layer_map'] = layer_map
mlib = Library() patterns = []
for cell in lib.cells: for cell in lib.cells:
if isinstance(cell.name, int): if isinstance(cell.name, int):
cell_name = lib.cellnames[cell.name].nstring.string cell_name = lib.cellnames[cell.name].nstring.string
else: else:
cell_name = cell.name.string cell_name = cell.name.string
pat = Pattern() pat = Pattern(name=cell_name)
for element in cell.geometry: for element in cell.geometry:
if isinstance(element, fatrec.XElement): if isinstance(element, fatrec.XElement):
logger.warning('Skipping XElement record') logger.warning('Skipping XElement record')
# note XELEMENT has no repetition # note XELEMENT has no repetition
continue continue
assert not isinstance(element.repetition, fatamorgana.ReuseRepetition) assert(not isinstance(element.repetition, fatamorgana.ReuseRepetition))
repetition = repetition_fata2masq(element.repetition) repetition = repetition_fata2masq(element.repetition)
# Switch based on element type: # Switch based on element type:
if isinstance(element, fatrec.Polygon): if isinstance(element, fatrec.Polygon):
# Drop last point (`fatamorgana` returns explicitly closed list; we use implicit close) vertices = numpy.cumsum(numpy.vstack(((0, 0), element.get_point_list())), axis=0)
# also need `cumsum` to convert from deltas to locations
vertices = numpy.cumsum(numpy.vstack(((0, 0), element.get_point_list()[:-1])), axis=0)
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings) annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
pat.polygon( poly = Polygon(vertices=vertices,
vertices=vertices,
layer=element.get_layer_tuple(), layer=element.get_layer_tuple(),
offset=element.get_xy(), offset=element.get_xy(),
annotations=annotations, annotations=annotations,
repetition=repetition, repetition=repetition)
)
pat.shapes.append(poly)
elif isinstance(element, fatrec.Path): elif isinstance(element, fatrec.Path):
vertices = numpy.cumsum(numpy.vstack(((0, 0), element.get_point_list())), axis=0) vertices = numpy.cumsum(numpy.vstack(((0, 0), element.get_point_list())), axis=0)
cap_start = path_cap_map[element.get_extension_start()[0]] cap_start = path_cap_map[element.get_extension_start()[0]]
cap_end = path_cap_map[element.get_extension_end()[0]] cap_end = path_cap_map[element.get_extension_end()[0]]
if cap_start != cap_end: if cap_start != cap_end:
raise PatternError('masque does not support multiple cap types on a single path.') # TODO handle multiple cap types raise Exception('masque does not support multiple cap types on a single path.') # TODO handle multiple cap types
cap = cap_start cap = cap_start
path_args: dict[str, Any] = {} path_args: Dict[str, Any] = {}
if cap == Path.Cap.SquareCustom: if cap == Path.Cap.SquareCustom:
path_args['cap_extensions'] = numpy.array(( path_args['cap_extensions'] = numpy.array((element.get_extension_start()[1],
element.get_extension_start()[1], element.get_extension_end()[1]))
element.get_extension_end()[1],
))
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings) annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
pat.path( path = Path(vertices=vertices,
vertices=vertices,
layer=element.get_layer_tuple(), layer=element.get_layer_tuple(),
offset=element.get_xy(), offset=element.get_xy(),
repetition=repetition, repetition=repetition,
annotations=annotations, annotations=annotations,
width=element.get_half_width() * 2, width=element.get_half_width() * 2,
cap=cap, cap=cap,
**path_args, **path_args)
)
pat.shapes.append(path)
elif isinstance(element, fatrec.Rectangle): elif isinstance(element, fatrec.Rectangle):
width = element.get_width() width = element.get_width()
height = element.get_height() height = element.get_height()
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings) annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
pat.polygon( rect = Polygon(layer=element.get_layer_tuple(),
layer=element.get_layer_tuple(),
offset=element.get_xy(), offset=element.get_xy(),
repetition=repetition, repetition=repetition,
vertices=numpy.array(((0, 0), (1, 0), (1, 1), (0, 1))) * (width, height), vertices=numpy.array(((0, 0), (1, 0), (1, 1), (0, 1))) * (width, height),
annotations=annotations, annotations=annotations,
) )
pat.shapes.append(rect)
elif isinstance(element, fatrec.Trapezoid): elif isinstance(element, fatrec.Trapezoid):
vertices = numpy.array(((0, 0), (1, 0), (1, 1), (0, 1))) * (element.get_width(), element.get_height()) vertices = numpy.array(((0, 0), (1, 0), (1, 1), (0, 1))) * (element.get_width(), element.get_height())
@ -358,13 +357,13 @@ def read(
vertices[2, 0] -= b vertices[2, 0] -= b
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings) annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
pat.polygon( trapz = Polygon(layer=element.get_layer_tuple(),
layer=element.get_layer_tuple(),
offset=element.get_xy(), offset=element.get_xy(),
repetition=repetition, repetition=repetition,
vertices=vertices, vertices=vertices,
annotations=annotations, annotations=annotations,
) )
pat.shapes.append(trapz)
elif isinstance(element, fatrec.CTrapezoid): elif isinstance(element, fatrec.CTrapezoid):
cttype = element.get_ctrapezoid_type() cttype = element.get_ctrapezoid_type()
@ -413,24 +412,22 @@ def read(
vertices[0, 1] += width vertices[0, 1] += width
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings) annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
pat.polygon( ctrapz = Polygon(layer=element.get_layer_tuple(),
layer=element.get_layer_tuple(),
offset=element.get_xy(), offset=element.get_xy(),
repetition=repetition, repetition=repetition,
vertices=vertices, vertices=vertices,
annotations=annotations, annotations=annotations,
) )
pat.shapes.append(ctrapz)
elif isinstance(element, fatrec.Circle): elif isinstance(element, fatrec.Circle):
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings) annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
layer = element.get_layer_tuple() circle = Circle(layer=element.get_layer_tuple(),
circle = Circle(
offset=element.get_xy(), offset=element.get_xy(),
repetition=repetition, repetition=repetition,
annotations=annotations, annotations=annotations,
radius=float(element.get_radius()), radius=float(element.get_radius()))
) pat.shapes.append(circle)
pat.shapes[layer].append(circle)
elif isinstance(element, fatrec.Text): elif isinstance(element, fatrec.Text):
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings) annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
@ -439,30 +436,38 @@ def read(
string = lib.textstrings[str_or_ref].string string = lib.textstrings[str_or_ref].string
else: else:
string = str_or_ref.string string = str_or_ref.string
pat.label( label = Label(layer=element.get_layer_tuple(),
layer=element.get_layer_tuple(),
offset=element.get_xy(), offset=element.get_xy(),
repetition=repetition, repetition=repetition,
annotations=annotations, annotations=annotations,
string=string, string=string)
) pat.labels.append(label)
else: else:
logger.warning(f'Skipping record {element} (unimplemented)') logger.warning(f'Skipping record {element} (unimplemented)')
continue continue
for placement in cell.placements: for placement in cell.placements:
target, ref = _placement_to_ref(placement, lib) pat.subpatterns.append(_placement_to_subpat(placement, lib))
if isinstance(target, int):
target = lib.cellnames[target].nstring.string
pat.refs[target].append(ref)
mlib[cell_name] = pat if clean_vertices:
clean_pattern_vertices(pat)
patterns.append(pat)
return mlib, library_info # Create a dict of {pattern.name: pattern, ...}, then fix up all subpattern.pattern entries
# according to the subpattern.identifier (which is deleted after use).
patterns_dict = dict(((p.name, p) for p in patterns))
for p in patterns_dict.values():
for sp in p.subpatterns:
ident = sp.identifier[0]
name = ident if isinstance(ident, str) else lib.cellnames[ident].nstring.string
sp.pattern = patterns_dict[name]
del sp.identifier
return patterns_dict, library_info
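Both the polygon and path readers above reconstruct absolute vertices from OASIS delta-encoded point lists; the writer does the inverse with `numpy.diff`. A self-contained numeric sketch of that conversion:

```python
# Sketch: OASIS point-list deltas <-> absolute vertices (prepend (0, 0), then cumsum).
import numpy

deltas = numpy.array([(10, 0), (0, 5), (-10, 0)])
vertices = numpy.cumsum(numpy.vstack(((0, 0), deltas)), axis=0)
# vertices -> (0, 0), (10, 0), (10, 5), (0, 5)
assert numpy.array_equal(numpy.diff(vertices, axis=0), deltas)
```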
def _mlayer2oas(mlayer: layer_t) -> tuple[int, int]: def _mlayer2oas(mlayer: layer_t) -> Tuple[int, int]:
""" Helper to turn a layer tuple-or-int into a layer and datatype""" """ Helper to turn a layer tuple-or-int into a layer and datatype"""
if isinstance(mlayer, int): if isinstance(mlayer, int):
layer = mlayer layer = mlayer
@ -474,102 +479,97 @@ def _mlayer2oas(mlayer: layer_t) -> tuple[int, int]:
else: else:
data_type = 0 data_type = 0
else: else:
raise PatternError(f'Invalid layer for OASIS: {mlayer}. Note that OASIS layers cannot be ' raise PatternError(f'Invalid layer for OASIS: {layer}. Note that OASIS layers cannot be '
f'strings unless a layer map is provided.') f'strings unless a layer map is provided.')
return layer, data_type return layer, data_type
def _placement_to_ref(placement: fatrec.Placement, lib: fatamorgana.OasisLayout) -> tuple[int | str, Ref]: def _placement_to_subpat(placement: fatrec.Placement, lib: fatamorgana.OasisLayout) -> SubPattern:
""" """
Helper function to create a Ref from a placement. Also returns the placement name (or id). Helper function to create a SubPattern from a placement. Sets subpat.pattern to None
and sets the instance .identifier to (struct_name,).
""" """
assert not isinstance(placement.repetition, fatamorgana.ReuseRepetition) assert(not isinstance(placement.repetition, fatamorgana.ReuseRepetition))
xy = numpy.array((placement.x, placement.y)) xy = numpy.array((placement.x, placement.y))
mag = placement.magnification if placement.magnification is not None else 1 mag = placement.magnification if placement.magnification is not None else 1
pname = placement.get_name() pname = placement.get_name()
name: int | str = pname if isinstance(pname, int) else pname.string # TODO deal with referenced names name = pname if isinstance(pname, int) else pname.string
annotations = properties_to_annotations(placement.properties, lib.propnames, lib.propstrings) annotations = properties_to_annotations(placement.properties, lib.propnames, lib.propstrings)
if placement.angle is None: if placement.angle is None:
rotation = 0 rotation = 0
else: else:
rotation = numpy.deg2rad(float(placement.angle)) rotation = numpy.deg2rad(float(placement.angle))
ref = Ref( subpat = SubPattern(offset=xy,
offset=xy, pattern=None,
mirrored=placement.flip, mirrored=(placement.flip, False),
rotation=rotation, rotation=rotation,
scale=float(mag), scale=float(mag),
identifier=(name,),
repetition=repetition_fata2masq(placement.repetition), repetition=repetition_fata2masq(placement.repetition),
annotations=annotations, annotations=annotations)
) return subpat
return name, ref
def _refs_to_placements( def _subpatterns_to_placements(
refs: dict[str | None, list[Ref]], subpatterns: List[SubPattern],
) -> list[fatrec.Placement]: ) -> List[fatrec.Placement]:
placements = [] refs = []
for target, rseq in refs.items(): for subpat in subpatterns:
if target is None: if subpat.pattern is None:
continue continue
for ref in rseq:
# Note: OASIS also mirrors first and rotates second
frep, rep_offset = repetition_masq2fata(ref.repetition)
offset = rint_cast(ref.offset + rep_offset) # Note: OASIS mirrors first and rotates second
angle = numpy.rad2deg(ref.rotation) % 360 mirror_across_x, extra_angle = normalize_mirror(subpat.mirrored)
placement = fatrec.Placement( frep, rep_offset = repetition_masq2fata(subpat.repetition)
name=target,
flip=ref.mirrored, offset = numpy.round(subpat.offset + rep_offset).astype(int)
angle = numpy.rad2deg(subpat.rotation + extra_angle) % 360
ref = fatrec.Placement(
name=subpat.pattern.name,
flip=mirror_across_x,
angle=angle, angle=angle,
magnification=ref.scale, magnification=subpat.scale,
properties=annotations_to_properties(ref.annotations), properties=annotations_to_properties(subpat.annotations),
x=offset[0], x=offset[0],
y=offset[1], y=offset[1],
repetition=frep, repetition=frep)
)
placements.append(placement) refs.append(ref)
return placements return refs
def _shapes_to_elements( def _shapes_to_elements(
shapes: dict[layer_t, list[Shape]], shapes: List[Shape],
layer2oas: Callable[[layer_t], tuple[int, int]], layer2oas: Callable[[layer_t], Tuple[int, int]],
) -> list[fatrec.Polygon | fatrec.Path | fatrec.Circle]: ) -> List[Union[fatrec.Polygon, fatrec.Path, fatrec.Circle]]:
# Add a Polygon record for each shape, and Path elements if necessary # Add a Polygon record for each shape, and Path elements if necessary
elements: list[fatrec.Polygon | fatrec.Path | fatrec.Circle] = [] elements: List[Union[fatrec.Polygon, fatrec.Path, fatrec.Circle]] = []
for mlayer, sseq in shapes.items(): for shape in shapes:
layer, datatype = layer2oas(mlayer) layer, datatype = layer2oas(shape.layer)
for shape in sseq:
repetition, rep_offset = repetition_masq2fata(shape.repetition) repetition, rep_offset = repetition_masq2fata(shape.repetition)
properties = annotations_to_properties(shape.annotations) properties = annotations_to_properties(shape.annotations)
if isinstance(shape, Circle): if isinstance(shape, Circle):
offset = rint_cast(shape.offset + rep_offset) offset = numpy.round(shape.offset + rep_offset).astype(int)
radius = rint_cast(shape.radius) radius = numpy.round(shape.radius).astype(int)
circle = fatrec.Circle( circle = fatrec.Circle(layer=layer,
layer=layer,
datatype=datatype, datatype=datatype,
radius=cast(int, radius), radius=radius,
x=offset[0], x=offset[0],
y=offset[1], y=offset[1],
properties=properties, properties=properties,
repetition=repetition, repetition=repetition)
)
elements.append(circle) elements.append(circle)
elif isinstance(shape, Path): elif isinstance(shape, Path):
xy = rint_cast(shape.offset + shape.vertices[0] + rep_offset) xy = numpy.round(shape.offset + shape.vertices[0] + rep_offset).astype(int)
deltas = rint_cast(numpy.diff(shape.vertices, axis=0)) deltas = numpy.round(numpy.diff(shape.vertices, axis=0)).astype(int)
half_width = rint_cast(shape.width / 2) half_width = numpy.round(shape.width / 2).astype(int)
path_type = next(k for k, v in path_cap_map.items() if v == shape.cap) # reverse lookup path_type = next(k for k, v in path_cap_map.items() if v == shape.cap) # reverse lookup
extension_start = (path_type, shape.cap_extensions[0] if shape.cap_extensions is not None else None) extension_start = (path_type, shape.cap_extensions[0] if shape.cap_extensions is not None else None)
extension_end = (path_type, shape.cap_extensions[1] if shape.cap_extensions is not None else None) extension_end = (path_type, shape.cap_extensions[1] if shape.cap_extensions is not None else None)
path = fatrec.Path( path = fatrec.Path(layer=layer,
layer=layer,
datatype=datatype, datatype=datatype,
point_list=cast(Sequence[Sequence[int]], deltas), point_list=deltas,
half_width=cast(int, half_width), half_width=half_width,
x=xy[0], x=xy[0],
y=xy[1], y=xy[1],
extension_start=extension_start, # TODO implement multiple cap types? extension_start=extension_start, # TODO implement multiple cap types?
@ -580,57 +580,81 @@ def _shapes_to_elements(
elements.append(path) elements.append(path)
else: else:
for polygon in shape.to_polygons(): for polygon in shape.to_polygons():
xy = rint_cast(polygon.offset + polygon.vertices[0] + rep_offset) xy = numpy.round(polygon.offset + polygon.vertices[0] + rep_offset).astype(int)
points = rint_cast(numpy.diff(polygon.vertices, axis=0)) points = numpy.round(numpy.diff(polygon.vertices, axis=0)).astype(int)
elements.append(fatrec.Polygon( elements.append(fatrec.Polygon(layer=layer,
layer=layer,
datatype=datatype, datatype=datatype,
x=xy[0], x=xy[0],
y=xy[1], y=xy[1],
point_list=cast(list[list[int]], points), point_list=points,
properties=properties, properties=properties,
repetition=repetition, repetition=repetition))
))
return elements return elements
def _labels_to_texts( def _labels_to_texts(
labels: dict[layer_t, list[Label]], labels: List[Label],
layer2oas: Callable[[layer_t], tuple[int, int]], layer2oas: Callable[[layer_t], Tuple[int, int]],
) -> list[fatrec.Text]: ) -> List[fatrec.Text]:
texts = [] texts = []
for mlayer, lseq in labels.items(): for label in labels:
layer, datatype = layer2oas(mlayer) layer, datatype = layer2oas(label.layer)
for label in lseq:
repetition, rep_offset = repetition_masq2fata(label.repetition) repetition, rep_offset = repetition_masq2fata(label.repetition)
xy = rint_cast(label.offset + rep_offset) xy = numpy.round(label.offset + rep_offset).astype(int)
properties = annotations_to_properties(label.annotations) properties = annotations_to_properties(label.annotations)
texts.append(fatrec.Text( texts.append(fatrec.Text(layer=layer,
layer=layer,
datatype=datatype, datatype=datatype,
x=xy[0], x=xy[0],
y=xy[1], y=xy[1],
string=label.string, string=label.string,
properties=properties, properties=properties,
repetition=repetition, repetition=repetition))
))
return texts return texts
def disambiguate_pattern_names(
patterns,
dup_warn_filter: Callable[[str], bool] = None, # If returns False, don't warn about this name
) -> None:
used_names = []
for pat in patterns:
sanitized_name = re.compile(r'[^A-Za-z0-9_\?\$]').sub('_', pat.name)
i = 0
suffixed_name = sanitized_name
while suffixed_name in used_names or suffixed_name == '':
suffix = base64.b64encode(struct.pack('>Q', i), b'$?').decode('ASCII')
suffixed_name = sanitized_name + '$' + suffix[:-1].lstrip('A')
i += 1
if sanitized_name == '':
logger.warning(f'Empty pattern name saved as "{suffixed_name}"')
elif suffixed_name != sanitized_name:
if dup_warn_filter is None or dup_warn_filter(pat.name):
logger.warning(f'Pattern name "{pat.name}" ({sanitized_name}) appears multiple times;\n'
+ f' renaming to "{suffixed_name}"')
if len(suffixed_name) == 0:
# Should never happen since zero-length names are replaced
raise PatternError(f'Zero-length name after sanitize+encode,\n originally "{pat.name}"')
pat.name = suffixed_name
used_names.append(suffixed_name)
def repetition_fata2masq( def repetition_fata2masq(
rep: fatamorgana.GridRepetition | fatamorgana.ArbitraryRepetition | None, rep: Union[fatamorgana.GridRepetition, fatamorgana.ArbitraryRepetition, None],
) -> Repetition | None: ) -> Optional[Repetition]:
mrep: Repetition | None mrep: Optional[Repetition]
if isinstance(rep, fatamorgana.GridRepetition): if isinstance(rep, fatamorgana.GridRepetition):
mrep = Grid(a_vector=rep.a_vector, mrep = Grid(a_vector=rep.a_vector,
b_vector=rep.b_vector, b_vector=rep.b_vector,
a_count=rep.a_count, a_count=rep.a_count,
b_count=rep.b_count) b_count=rep.b_count)
elif isinstance(rep, fatamorgana.ArbitraryRepetition): elif isinstance(rep, fatamorgana.ArbitraryRepetition):
displacements = numpy.cumsum(numpy.column_stack(( displacements = numpy.cumsum(numpy.column_stack((rep.x_displacements,
rep.x_displacements, rep.y_displacements)), axis=0)
rep.y_displacements,
)), axis=0)
displacements = numpy.vstack(([0, 0], displacements)) displacements = numpy.vstack(([0, 0], displacements))
mrep = Arbitrary(displacements) mrep = Arbitrary(displacements)
elif rep is None: elif rep is None:
@@ -639,37 +663,37 @@ def repetition_fata2masq(
def repetition_masq2fata( def repetition_masq2fata(
rep: Repetition | None, rep: Optional[Repetition],
) -> tuple[ ) -> Tuple[Union[fatamorgana.GridRepetition,
fatamorgana.GridRepetition | fatamorgana.ArbitraryRepetition | None, fatamorgana.ArbitraryRepetition,
tuple[int, int] None],
]: Tuple[int, int]]:
frep: fatamorgana.GridRepetition | fatamorgana.ArbitraryRepetition | None frep: Union[fatamorgana.GridRepetition, fatamorgana.ArbitraryRepetition, None]
if isinstance(rep, Grid): if isinstance(rep, Grid):
a_vector = rint_cast(rep.a_vector) a_vector = rint_cast(rep.a_vector)
b_vector = rint_cast(rep.b_vector) if rep.b_vector is not None else None b_vector = rint_cast(rep.b_vector) if rep.b_vector is not None else None
a_count = rint_cast(rep.a_count) a_count = rint_cast(rep.a_count)
b_count = rint_cast(rep.b_count) if rep.b_count is not None else None b_count = rint_cast(rep.b_count) if rep.b_count is not None else None
frep = fatamorgana.GridRepetition( frep = fatamorgana.GridRepetition(
a_vector=cast(list[int], a_vector), a_vector=a_vector,
b_vector=cast(list[int] | None, b_vector), b_vector=b_vector,
a_count=cast(int, a_count), a_count=a_count,
b_count=cast(int | None, b_count), b_count=b_count,
) )
offset = (0, 0) offset = (0, 0)
elif isinstance(rep, Arbitrary): elif isinstance(rep, Arbitrary):
diffs = numpy.diff(rep.displacements, axis=0) diffs = numpy.diff(rep.displacements, axis=0)
diff_ints = rint_cast(diffs) diff_ints = rint_cast(diffs)
frep = fatamorgana.ArbitraryRepetition(diff_ints[:, 0], diff_ints[:, 1]) # type: ignore frep = fatamorgana.ArbitraryRepetition(diff_ints[:, 0], diff_ints[:, 1])
offset = rep.displacements[0, :] offset = rep.displacements[0, :]
else: else:
assert rep is None assert(rep is None)
frep = None frep = None
offset = (0, 0) offset = (0, 0)
return frep, offset return frep, offset
def annotations_to_properties(annotations: annotations_t) -> list[fatrec.Property]: def annotations_to_properties(annotations: annotations_t) -> List[fatrec.Property]:
#TODO determine is_standard based on key? #TODO determine is_standard based on key?
properties = [] properties = []
for key, values in annotations.items(): for key, values in annotations.items():
@@ -680,24 +704,24 @@ def annotations_to_properties(annotations: annotations_t) -> list[fatrec.Propert
def properties_to_annotations( def properties_to_annotations(
properties: list[fatrec.Property], properties: List[fatrec.Property],
propnames: dict[int, NString], propnames: Dict[int, NString],
propstrings: dict[int, AString], propstrings: Dict[int, AString],
) -> annotations_t: ) -> annotations_t:
annotations = {} annotations = {}
for proprec in properties: for proprec in properties:
assert proprec.name is not None assert(proprec.name is not None)
if isinstance(proprec.name, int): if isinstance(proprec.name, int):
key = propnames[proprec.name].string key = propnames[proprec.name].string
else: else:
key = proprec.name.string key = proprec.name.string
values: list[str | float | int] = [] values: List[Union[str, float, int]] = []
assert proprec.values is not None assert(proprec.values is not None)
for value in proprec.values: for value in proprec.values:
if isinstance(value, float | int): if isinstance(value, (float, int)):
values.append(value) values.append(value)
elif isinstance(value, NString | AString): elif isinstance(value, (NString, AString)):
values.append(value.string) values.append(value.string)
elif isinstance(value, PropStringReference): elif isinstance(value, PropStringReference):
values.append(propstrings[value.ref].string) # dereference values.append(propstrings[value.ref].string) # dereference
@@ -711,25 +735,3 @@ def properties_to_annotations(
properties = [fatrec.Property(key, vals, is_standard=False) properties = [fatrec.Property(key, vals, is_standard=False)
for key, vals in annotations.items()] for key, vals in annotations.items()]
return properties return properties
def check_valid_names(
names: Iterable[str],
) -> None:
"""
Check all provided names to see if they're valid GDSII cell names.
Args:
names: Collection of names to check
"""
allowed_chars = set(string.ascii_letters + string.digits + string.punctuation + ' ')
bad_chars = [
name for name in names
if not set(name).issubset(allowed_chars)
]
if bad_chars:
raise LibraryError('Names contain invalid characters:\n' + pformat(bad_chars))
580
masque/file/python_gdsii.py Normal file
@@ -0,0 +1,580 @@
"""
GDSII file format readers and writers using python-gdsii
Note that GDSII references follow the same convention as `masque`,
with this order of operations:
1. Mirroring
2. Rotation
3. Scaling
4. Offset and array expansion (no mirroring/rotation/scaling applied to offsets)
Scaling, rotation, and mirroring apply to individual instances, not grid
vectors or offsets.
Notes:
* absolute positioning is not supported
* PLEX is not supported
* ELFLAGS are not supported
* GDS does not support library- or structure-level annotations
"""
from typing import List, Any, Dict, Tuple, Callable, Union, Iterable, Optional
from typing import Sequence
import re
import io
import copy
import base64
import struct
import logging
import pathlib
import gzip
import numpy
from numpy.typing import NDArray, ArrayLike
# python-gdsii
import gdsii.library #type: ignore
import gdsii.structure #type: ignore
import gdsii.elements #type: ignore
from .utils import clean_pattern_vertices, is_gzipped
from .. import Pattern, SubPattern, PatternError, Label, Shape
from ..shapes import Polygon, Path
from ..repetition import Grid
from ..utils import get_bit, set_bit, layer_t, normalize_mirror, annotations_t
logger = logging.getLogger(__name__)
path_cap_map = {
None: Path.Cap.Flush,
0: Path.Cap.Flush,
1: Path.Cap.Circle,
2: Path.Cap.Square,
4: Path.Cap.SquareCustom,
}
def rint_cast(val: ArrayLike) -> NDArray[numpy.int32]:
return numpy.rint(val, dtype=numpy.int32, casting='unsafe')
def build(
patterns: Union[Pattern, Sequence[Pattern]],
meters_per_unit: float,
logical_units_per_unit: float = 1,
library_name: str = 'masque-gdsii-write',
*,
modify_originals: bool = False,
disambiguate_func: Callable[[Iterable[Pattern]], None] = None,
) -> gdsii.library.Library:
"""
Convert a `Pattern` or list of patterns to a GDSII stream, by first calling
`.polygonize()` to change the shapes into polygons, and then writing patterns
as GDSII structures, polygons as boundary elements, and subpatterns as structure
references (sref).
For each shape,
layer is chosen to be equal to `shape.layer` if it is an int,
or `shape.layer[0]` if it is a tuple
datatype is chosen to be `shape.layer[1]` if available,
otherwise `0`
It is often a good idea to run `pattern.subpatternize()` prior to calling this function,
especially if calling `.polygonize()` will result in very many vertices.
If you want pattern polygonized with non-default arguments, just call `pattern.polygonize()`
prior to calling this function.
Args:
patterns: A Pattern or list of patterns to convert.
meters_per_unit: Written into the GDSII file, meters per (database) length unit.
All distances are assumed to be an integer multiple of this unit, and are stored as such.
logical_units_per_unit: Written into the GDSII file. Allows the GDSII to specify a
"logical" unit which is different from the "database" unit, for display purposes.
Default `1`.
library_name: Library name written into the GDSII file.
Default 'masque-gdsii-write'.
modify_originals: If `True`, the original pattern is modified as part of the writing
process. Otherwise, a copy is made and `deepunlock()`-ed.
Default `False`.
disambiguate_func: Function which takes a list of patterns and alters them
to make their names valid and unique. Default is `disambiguate_pattern_names`, which
attempts to adhere to the GDSII standard as well as possible.
WARNING: No additional error checking is performed on the results.
Returns:
`gdsii.library.Library`
"""
if isinstance(patterns, Pattern):
patterns = [patterns]
if disambiguate_func is None:
disambiguate_func = disambiguate_pattern_names # type: ignore
assert(disambiguate_func is not None) # placate mypy
if not modify_originals:
patterns = [p.deepunlock() for p in copy.deepcopy(patterns)]
patterns = [p.wrap_repeated_shapes() for p in patterns]
# Create library
lib = gdsii.library.Library(version=600,
name=library_name.encode('ASCII'),
logical_unit=logical_units_per_unit,
physical_unit=meters_per_unit)
# Get a dict of id(pattern) -> pattern
patterns_by_id = {id(pattern): pattern for pattern in patterns}
for pattern in patterns:
for i, p in pattern.referenced_patterns_by_id().items():
patterns_by_id[i] = p
disambiguate_func(patterns_by_id.values())
# Now create a structure for each pattern, and add in any Boundary and SREF elements
for pat in patterns_by_id.values():
structure = gdsii.structure.Structure(name=pat.name.encode('ASCII'))
lib.append(structure)
structure += _shapes_to_elements(pat.shapes)
structure += _labels_to_texts(pat.labels)
structure += _subpatterns_to_refs(pat.subpatterns)
return lib
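A hedged usage sketch for `build()`, assuming this backend is importable as `masque.file.python_gdsii` and using an arbitrary square polygon:

```python
from masque import Pattern
from masque.shapes import Polygon
from masque.file import python_gdsii

pat = Pattern(name='square')
pat.shapes.append(Polygon(vertices=[[0, 0], [100, 0], [100, 100], [0, 100]], layer=(1, 0)))

lib = python_gdsii.build([pat], meters_per_unit=1e-9)   # 1 nm database unit
with open('square.gds', 'wb') as stream:
    lib.save(stream)                                     # same call used by write()
```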
def write(
patterns: Union[Pattern, Sequence[Pattern]],
stream: io.BufferedIOBase,
*args,
**kwargs,
) -> None:
"""
Write a `Pattern` or list of patterns to a GDSII file.
See `masque.file.gdsii.build()` for details.
Args:
patterns: A Pattern or list of patterns to write to file.
stream: Stream to write to.
*args: passed to `masque.file.gdsii.build()`
**kwargs: passed to `masque.file.gdsii.build()`
"""
lib = build(patterns, *args, **kwargs)
lib.save(stream)
return
def writefile(
patterns: Union[Sequence[Pattern], Pattern],
filename: Union[str, pathlib.Path],
*args,
**kwargs,
) -> None:
"""
Wrapper for `masque.file.gdsii.write()` that takes a filename or path instead of a stream.
Will automatically compress the file if it has a .gz suffix.
Args:
patterns: `Pattern` or list of patterns to save
filename: Filename to save to.
*args: passed to `masque.file.gdsii.write`
**kwargs: passed to `masque.file.gdsii.write`
"""
path = pathlib.Path(filename)
if path.suffix == '.gz':
open_func: Callable = gzip.open
else:
open_func = open
with io.BufferedWriter(open_func(path, mode='wb')) as stream:
write(patterns, stream, *args, **kwargs)
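For example (filenames hypothetical, reusing `pat` from the sketch above), a `.gz` suffix switches `writefile()` to gzip output automatically:

```python
from masque.file.python_gdsii import writefile

writefile(pat, 'layout.gds.gz', meters_per_unit=1e-9)   # gzip-compressed
writefile(pat, 'layout.gds', meters_per_unit=1e-9)      # plain GDSII
```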
def readfile(
filename: Union[str, pathlib.Path],
*args,
**kwargs,
) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
"""
Wrapper for `masque.file.gdsii.read()` that takes a filename or path instead of a stream.
Will automatically decompress gzipped files.
Args:
filename: Filename to read from.
*args: passed to `masque.file.gdsii.read`
**kwargs: passed to `masque.file.gdsii.read`
"""
path = pathlib.Path(filename)
if is_gzipped(path):
open_func: Callable = gzip.open
else:
open_func = open
with io.BufferedReader(open_func(path, mode='rb')) as stream:
results = read(stream, *args, **kwargs)
return results
def read(
stream: io.BufferedIOBase,
clean_vertices: bool = True,
) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
"""
Read a gdsii file and translate it into a dict of Pattern objects. GDSII structures are
translated into Pattern objects; boundaries are translated into polygons, and srefs and arefs
are translated into SubPattern objects.
Additional library info is returned in a dict, containing:
'name': name of the library
'meters_per_unit': number of meters per database unit (all values are in database units)
'logical_units_per_unit': number of "logical" units displayed by layout tools (typically microns)
per database unit
Args:
stream: Stream to read from.
clean_vertices: If `True`, remove any redundant vertices when loading polygons.
The cleaning process removes any polygons with zero area or <3 vertices.
Default `True`.
Returns:
- Dict of pattern_name:Patterns generated from GDSII structures
- Dict of GDSII library info
"""
lib = gdsii.library.Library.load(stream)
library_info = {'name': lib.name.decode('ASCII'),
'meters_per_unit': lib.physical_unit,
'logical_units_per_unit': lib.logical_unit,
}
raw_mode = True # Whether to construct shapes in raw mode (less error checking)
patterns = []
for structure in lib:
pat = Pattern(name=structure.name.decode('ASCII'))
for element in structure:
# Switch based on element type:
if isinstance(element, gdsii.elements.Boundary):
poly = _boundary_to_polygon(element, raw_mode)
pat.shapes.append(poly)
if isinstance(element, gdsii.elements.Path):
path = _gpath_to_mpath(element, raw_mode)
pat.shapes.append(path)
elif isinstance(element, gdsii.elements.Text):
label = Label(offset=element.xy.astype(float),
layer=(element.layer, element.text_type),
string=element.string.decode('ASCII'))
pat.labels.append(label)
elif isinstance(element, (gdsii.elements.SRef, gdsii.elements.ARef)):
pat.subpatterns.append(_ref_to_subpat(element))
if clean_vertices:
clean_pattern_vertices(pat)
patterns.append(pat)
# Create a dict of {pattern.name: pattern, ...}, then fix up all subpattern.pattern entries
# according to the subpattern.identifier (which is deleted after use).
patterns_dict = dict(((p.name, p) for p in patterns))
for p in patterns_dict.values():
for sp in p.subpatterns:
sp.pattern = patterns_dict[sp.identifier[0].decode('ASCII')]
del sp.identifier
return patterns_dict, library_info
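A short usage sketch (filenames hypothetical); gzipped input is detected by `readfile()` via `is_gzipped()` and decompressed transparently:

```python
from masque.file.python_gdsii import readfile

patterns, library_info = readfile('layout.gds.gz')
print(library_info['name'], library_info['meters_per_unit'])
print(sorted(patterns.keys()))   # mapping of pattern name -> Pattern
```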
def _mlayer2gds(mlayer: layer_t) -> Tuple[int, int]:
""" Helper to turn a layer tuple-or-int into a layer and datatype"""
if isinstance(mlayer, int):
layer = mlayer
data_type = 0
elif isinstance(mlayer, tuple):
layer = mlayer[0]
if len(mlayer) > 1:
data_type = mlayer[1]
else:
data_type = 0
else:
raise PatternError(f'Invalid layer for gdsii: {mlayer}. Note that gdsii layers cannot be strings.')
return layer, data_type
def _ref_to_subpat(
element: Union[gdsii.elements.SRef,
gdsii.elements.ARef]
) -> SubPattern:
"""
Helper function to create a SubPattern from an SREF or AREF. Sets subpat.pattern to None
and sets the instance .identifier to (struct_name,).
NOTE: "Absolute" means not affected by parent elements.
That's not currently supported by masque at all (and not planned).
"""
rotation = 0.0
offset = numpy.array(element.xy[0], dtype=float)
scale = 1.0
mirror_across_x = False
repetition = None
if element.strans is not None:
if element.mag is not None:
scale = element.mag
# Bit 13 means absolute scale
if get_bit(element.strans, 15 - 13):
raise PatternError('Absolute scale is not implemented in masque!')
if element.angle is not None:
rotation = numpy.deg2rad(element.angle)
# Bit 14 means absolute rotation
if get_bit(element.strans, 15 - 14):
raise PatternError('Absolute rotation is not implemented in masque!')
# Bit 0 means mirror x-axis
if get_bit(element.strans, 15 - 0):
mirror_across_x = True
if isinstance(element, gdsii.elements.ARef):
a_count = element.cols
b_count = element.rows
a_vector = (element.xy[1] - offset) / a_count
b_vector = (element.xy[2] - offset) / b_count
repetition = Grid(a_vector=a_vector, b_vector=b_vector,
a_count=a_count, b_count=b_count)
subpat = SubPattern(pattern=None,
offset=offset,
rotation=rotation,
scale=scale,
mirrored=(mirror_across_x, False),
annotations=_properties_to_annotations(element.properties),
repetition=repetition)
subpat.identifier = (element.struct_name,)
return subpat
def _gpath_to_mpath(element: gdsii.elements.Path, raw_mode: bool) -> Path:
if element.path_type in path_cap_map:
cap = path_cap_map[element.path_type]
else:
raise PatternError(f'Unrecognized path type: {element.path_type}')
args = {'vertices': element.xy.astype(float),
'layer': (element.layer, element.data_type),
'width': element.width if element.width is not None else 0.0,
'cap': cap,
'offset': numpy.zeros(2),
'annotations': _properties_to_annotations(element.properties),
'raw': raw_mode,
}
if cap == Path.Cap.SquareCustom:
args['cap_extensions'] = numpy.zeros(2)
if element.bgn_extn is not None:
args['cap_extensions'][0] = element.bgn_extn
if element.end_extn is not None:
args['cap_extensions'][1] = element.end_extn
return Path(**args)
def _boundary_to_polygon(element: gdsii.elements.Boundary, raw_mode: bool) -> Polygon:
args = {'vertices': element.xy[:-1].astype(float),
'layer': (element.layer, element.data_type),
'offset': numpy.zeros(2),
'annotations': _properties_to_annotations(element.properties),
'raw': raw_mode,
}
return Polygon(**args)
def _subpatterns_to_refs(
subpatterns: List[SubPattern],
) -> List[Union[gdsii.elements.ARef, gdsii.elements.SRef]]:
refs = []
for subpat in subpatterns:
if subpat.pattern is None:
continue
encoded_name = subpat.pattern.name.encode('ASCII')
# Note: GDS mirrors first and rotates second
mirror_across_x, extra_angle = normalize_mirror(subpat.mirrored)
rep = subpat.repetition
new_refs: List[Union[gdsii.elements.SRef, gdsii.elements.ARef]]
ref: Union[gdsii.elements.SRef, gdsii.elements.ARef]
if isinstance(rep, Grid):
b_vector = rep.b_vector if rep.b_vector is not None else numpy.zeros(2)
b_count = rep.b_count if rep.b_count is not None else 1
xy: NDArray[numpy.float64] = numpy.array(subpat.offset) + [
[0, 0],
rep.a_vector * rep.a_count,
b_vector * b_count,
]
ref = gdsii.elements.ARef(
struct_name=encoded_name,
xy=rint_cast(xy),
cols=rint_cast(rep.a_count),
rows=rint_cast(rep.b_count),
)
new_refs = [ref]
elif rep is None:
ref = gdsii.elements.SRef(
struct_name=encoded_name,
xy=rint_cast([subpat.offset]),
)
new_refs = [ref]
else:
new_refs = [gdsii.elements.SRef(
struct_name=encoded_name,
xy=rint_cast([subpat.offset + dd]),
)
for dd in rep.displacements]
for ref in new_refs:
ref.angle = numpy.rad2deg(subpat.rotation + extra_angle) % 360
# strans must be non-None for angle and mag to take effect
ref.strans = set_bit(0, 15 - 0, mirror_across_x)
ref.mag = subpat.scale
ref.properties = _annotations_to_properties(subpat.annotations, 512)
refs += new_refs
return refs
def _properties_to_annotations(properties: List[Tuple[int, bytes]]) -> annotations_t:
return {str(k): [v.decode()] for k, v in properties}
def _annotations_to_properties(annotations: annotations_t, max_len: int = 126) -> List[Tuple[int, bytes]]:
cum_len = 0
props = []
for key, vals in annotations.items():
try:
i = int(key)
except ValueError:
raise PatternError(f'Annotation key {key} is not convertible to an integer')
if not (0 < i < 126):
raise PatternError(f'Annotation key {key} converts to {i} (must be in the range [1,125])')
val_strings = ' '.join(str(val) for val in vals)
b = val_strings.encode()
if len(b) > 126:
raise PatternError(f'Annotation value {b!r} is longer than 126 characters!')
cum_len += numpy.ceil(len(b) / 2) * 2 + 2
if cum_len > max_len:
raise PatternError(f'Sum of annotation data will be longer than {max_len} bytes! Generated bytes were {b!r}')
props.append((i, b))
return props
def _shapes_to_elements(
shapes: List[Shape],
polygonize_paths: bool = False,
) -> List[Union[gdsii.elements.Boundary, gdsii.elements.Path]]:
elements: List[Union[gdsii.elements.Boundary, gdsii.elements.Path]] = []
# Add a Boundary element for each shape, and Path elements if necessary
for shape in shapes:
layer, data_type = _mlayer2gds(shape.layer)
properties = _annotations_to_properties(shape.annotations, 128)
if isinstance(shape, Path) and not polygonize_paths:
xy = rint_cast(shape.vertices + shape.offset)
width = rint_cast(shape.width)
path_type = next(k for k, v in path_cap_map.items() if v == shape.cap) # reverse lookup
path = gdsii.elements.Path(layer=layer,
data_type=data_type,
xy=xy)
path.path_type = path_type
path.width = width
path.properties = properties
elements.append(path)
else:
for polygon in shape.to_polygons():
xy_closed = numpy.empty((polygon.vertices.shape[0] + 1, 2), dtype=numpy.int32)
numpy.rint(polygon.vertices + polygon.offset, out=xy_closed[:-1], casting='unsafe')
xy_closed[-1] = xy_closed[0]
boundary = gdsii.elements.Boundary(
layer=layer,
data_type=data_type,
xy=xy_closed,
)
boundary.properties = properties
elements.append(boundary)
return elements
def _labels_to_texts(labels: List[Label]) -> List[gdsii.elements.Text]:
texts = []
for label in labels:
properties = _annotations_to_properties(label.annotations, 128)
layer, text_type = _mlayer2gds(label.layer)
xy = rint_cast([label.offset])
text = gdsii.elements.Text(
layer=layer,
text_type=text_type,
xy=xy,
string=label.string.encode('ASCII'),
)
text.properties = properties
texts.append(text)
return texts
def disambiguate_pattern_names(
patterns: Sequence[Pattern],
max_name_length: int = 32,
suffix_length: int = 6,
dup_warn_filter: Optional[Callable[[str], bool]] = None,
) -> None:
"""
Args:
patterns: List of patterns to disambiguate
max_name_length: Names longer than this will be truncated
suffix_length: Names which get truncated are truncated by this many extra characters. This is to
leave room for a suffix if one is necessary.
dup_warn_filter: (optional) Function for suppressing warnings about cell names changing. Receives
the cell name and returns `False` if the warning should be suppressed and `True` if it should
be displayed. Default displays all warnings.
"""
used_names = []
for pat in set(patterns):
# Shorten names which already exceed max-length
if len(pat.name) > max_name_length:
shortened_name = pat.name[:max_name_length - suffix_length]
logger.warning(f'Pattern name "{pat.name}" is too long ({len(pat.name)}/{max_name_length} chars),\n'
+ f' shortening to "{shortened_name}" before generating suffix')
else:
shortened_name = pat.name
# Remove invalid characters
sanitized_name = re.compile(r'[^A-Za-z0-9_\?\$]').sub('_', shortened_name)
# Add a suffix that makes the name unique
i = 0
suffixed_name = sanitized_name
while suffixed_name in used_names or suffixed_name == '':
suffix = base64.b64encode(struct.pack('>Q', i), b'$?').decode('ASCII')
suffixed_name = sanitized_name + '$' + suffix[:-1].lstrip('A')
i += 1
if sanitized_name == '':
logger.warning(f'Empty pattern name saved as "{suffixed_name}"')
elif suffixed_name != sanitized_name:
if dup_warn_filter is None or dup_warn_filter(pat.name):
logger.warning(f'Pattern name "{pat.name}" ({sanitized_name}) appears multiple times;\n'
+ f' renaming to "{suffixed_name}"')
# Encode into a byte-string and perform some final checks
encoded_name = suffixed_name.encode('ASCII')
if len(encoded_name) == 0:
# Should never happen since zero-length names are replaced
raise PatternError(f'Zero-length name after sanitize+encode,\n originally "{pat.name}"')
if len(encoded_name) > max_name_length:
raise PatternError(f'Pattern name "{encoded_name!r}" length > {max_name_length} after encode,\n'
+ f' originally "{pat.name}"')
pat.name = suffixed_name
used_names.append(suffixed_name)
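To illustrate the suffix scheme used above (editorial sketch only, not part of the file):

```python
import base64
import struct

def _suffix(i: int) -> str:
    # Same encoding as the loop above: big-endian counter, base64 with '$?' altchars,
    # trailing '=' dropped, leading 'A's stripped.
    s = base64.b64encode(struct.pack('>Q', i), b'$?').decode('ASCII')
    return s[:-1].lstrip('A')

print([_suffix(i) for i in range(3)])   # ['', 'E', 'I']
# so duplicates of "foo" become "foo$", "foo$E", "foo$I", ...
```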
@@ -1,7 +1,7 @@
""" """
SVG file format readers and writers SVG file format readers and writers
""" """
from collections.abc import Mapping from typing import Dict, Optional
import warnings import warnings
import numpy import numpy
@@ -13,23 +13,22 @@ from .. import Pattern
def writefile( def writefile(
library: Mapping[str, Pattern], pattern: Pattern,
top: str,
filename: str, filename: str,
custom_attributes: bool = False, custom_attributes: bool = False,
) -> None: ) -> None:
""" """
Write a Pattern to an SVG file, by first calling .polygonize() on it Write a Pattern to an SVG file, by first calling .polygonize() on it
to change the shapes into polygons, and then writing patterns as SVG to change the shapes into polygons, and then writing patterns as SVG
groups (<g>, inside <defs>), polygons as paths (<path>), and refs groups (<g>, inside <defs>), polygons as paths (<path>), and subpatterns
as <use> elements. as <use> elements.
Note that this function modifies the Pattern. Note that this function modifies the Pattern.
If `custom_attributes` is `True`, a non-standard `pattern_layer` attribute If `custom_attributes` is `True`, non-standard `pattern_layer` and `pattern_dose` attributes
is written to the relevant elements. are written to the relevant elements.
It is often a good idea to run `pattern.dedup()` on pattern prior to It is often a good idea to run `pattern.subpatternize()` on pattern prior to
calling this function, especially if calling `.polygonize()` will result in very calling this function, especially if calling `.polygonize()` will result in very
many vertices. many vertices.
@@ -39,18 +38,17 @@ def writefile(
Args: Args:
pattern: Pattern to write to file. Modified by this function. pattern: Pattern to write to file. Modified by this function.
filename: Filename to write to. filename: Filename to write to.
custom_attributes: Whether to write non-standard `pattern_layer` attribute to the custom_attributes: Whether to write non-standard `pattern_layer` and
SVG elements. `pattern_dose` attributes to the SVG elements.
""" """
pattern = library[top]
# Polygonize pattern # Polygonize pattern
pattern.polygonize() pattern.polygonize()
bounds = pattern.get_bounds(library=library) bounds = pattern.get_bounds()
if bounds is None: if bounds is None:
bounds_min, bounds_max = numpy.array([[-1, -1], [1, 1]]) bounds_min, bounds_max = numpy.array([[-1, -1], [1, 1]])
warnings.warn('Pattern had no bounds (empty?); setting arbitrary viewbox', stacklevel=1) warnings.warn('Pattern had no bounds (empty?); setting arbitrary viewbox')
else: else:
bounds_min, bounds_max = bounds bounds_min, bounds_max = bounds
@@ -61,39 +59,42 @@ def writefile(
svg = svgwrite.Drawing(filename, profile='full', viewBox=viewbox_string, svg = svgwrite.Drawing(filename, profile='full', viewBox=viewbox_string,
debug=(not custom_attributes)) debug=(not custom_attributes))
# Now create a group for each pattern and add in any Boundary and Use elements # Get a dict of id(pattern) -> pattern
for name, pat in library.items(): patterns_by_id = {**(pattern.referenced_patterns_by_id()), id(pattern): pattern} # type: Dict[int, Optional[Pattern]]
svg_group = svg.g(id=mangle_name(name), fill='blue', stroke='red')
for layer, shapes in pat.shapes.items(): # Now create a group for each row in sd_table (ie, each pattern + dose combination)
for shape in shapes: # and add in any Boundary and Use elements
for pat in patterns_by_id.values():
if pat is None:
continue
svg_group = svg.g(id=mangle_name(pat), fill='blue', stroke='red')
for shape in pat.shapes:
for polygon in shape.to_polygons(): for polygon in shape.to_polygons():
path_spec = poly2path(polygon.vertices + polygon.offset) path_spec = poly2path(polygon.vertices + polygon.offset)
path = svg.path(d=path_spec) path = svg.path(d=path_spec)
if custom_attributes: if custom_attributes:
path['pattern_layer'] = layer path['pattern_layer'] = polygon.layer
path['pattern_dose'] = polygon.dose
svg_group.add(path) svg_group.add(path)
for target, refs in pat.refs.items(): for subpat in pat.subpatterns:
if target is None: if subpat.pattern is None:
continue continue
for ref in refs: transform = f'scale({subpat.scale:g}) rotate({subpat.rotation:g}) translate({subpat.offset[0]:g},{subpat.offset[1]:g})'
transform = f'scale({ref.scale:g}) rotate({ref.rotation:g}) translate({ref.offset[0]:g},{ref.offset[1]:g})' use = svg.use(href='#' + mangle_name(subpat.pattern), transform=transform)
use = svg.use(href='#' + mangle_name(target), transform=transform) if custom_attributes:
use['pattern_dose'] = subpat.dose
svg_group.add(use) svg_group.add(use)
svg.defs.add(svg_group) svg.defs.add(svg_group)
svg.add(svg.use(href='#' + mangle_name(top))) svg.add(svg.use(href='#' + mangle_name(pattern)))
svg.save() svg.save()
def writefile_inverted( def writefile_inverted(pattern: Pattern, filename: str):
library: Mapping[str, Pattern],
top: str,
filename: str,
) -> None:
""" """
Write an inverted Pattern to an SVG file, by first calling `.polygonize()` and Write an inverted Pattern to an SVG file, by first calling `.polygonize()` and
`.flatten()` on it to change the shapes into polygons, then drawing a bounding `.flatten()` on it to change the shapes into polygons, then drawing a bounding
@@ -109,15 +110,13 @@ def writefile_inverted(
pattern: Pattern to write to file. Modified by this function. pattern: Pattern to write to file. Modified by this function.
filename: Filename to write to. filename: Filename to write to.
""" """
pattern = library[top]
# Polygonize and flatten pattern # Polygonize and flatten pattern
pattern.polygonize().flatten(library) pattern.polygonize().flatten()
bounds = pattern.get_bounds(library=library) bounds = pattern.get_bounds()
if bounds is None: if bounds is None:
bounds_min, bounds_max = numpy.array([[-1, -1], [1, 1]]) bounds_min, bounds_max = numpy.array([[-1, -1], [1, 1]])
warnings.warn('Pattern had no bounds (empty?); setting arbitrary viewbox', stacklevel=1) warnings.warn('Pattern had no bounds (empty?); setting arbitrary viewbox')
else: else:
bounds_min, bounds_max = bounds bounds_min, bounds_max = bounds
@@ -135,8 +134,7 @@ def writefile_inverted(
path_spec = poly2path(slab_edge) path_spec = poly2path(slab_edge)
# Draw polygons with reversed vertex order # Draw polygons with reversed vertex order
for _layer, shapes in pattern.shapes.items(): for shape in pattern.shapes:
for shape in shapes:
for polygon in shape.to_polygons(): for polygon in shape.to_polygons():
path_spec += poly2path(polygon.vertices[::-1] + polygon.offset) path_spec += poly2path(polygon.vertices[::-1] + polygon.offset)
@@ -154,9 +152,9 @@ def poly2path(vertices: ArrayLike) -> str:
Returns: Returns:
SVG path-string. SVG path-string.
""" """
verts = numpy.asarray(vertices) verts = numpy.array(vertices, copy=False)
commands = 'M{:g},{:g} '.format(verts[0][0], verts[0][1]) # noqa: UP032 commands = 'M{:g},{:g} '.format(verts[0][0], verts[0][1])
for vertex in verts[1:]: for vertex in verts[1:]:
commands += 'L{:g},{:g}'.format(vertex[0], vertex[1]) # noqa: UP032 commands += 'L{:g},{:g}'.format(vertex[0], vertex[1])
commands += ' Z ' commands += ' Z '
return commands return commands
@@ -1,105 +1,29 @@
""" """
Helper functions for file reading and writing Helper functions for file reading and writing
""" """
from typing import IO from typing import Set, Tuple, List
from collections.abc import Iterator, Mapping
import re import re
import copy
import pathlib import pathlib
import logging
import tempfile
import shutil
from collections import defaultdict
from contextlib import contextmanager
from pprint import pformat
from itertools import chain
from .. import Pattern, PatternError, Library, LibraryError from .. import Pattern, PatternError
from ..shapes import Polygon, Path from ..shapes import Polygon, Path
logger = logging.getLogger(__name__) def mangle_name(pattern: Pattern, dose_multiplier: float = 1.0) -> str:
def preflight(
lib: Library,
sort: bool = True,
sort_elements: bool = False,
allow_dangling_refs: bool | None = None,
allow_named_layers: bool = True,
prune_empty_patterns: bool = False,
wrap_repeated_shapes: bool = False,
) -> Library:
""" """
Run a standard set of useful operations and checks, usually done immediately prior Create a name using `pattern.name`, `id(pattern)`, and the dose multiplier.
to writing to a file (or immediately after reading).
Args: Args:
sort: Whether to sort the patterns based on their names, and optionally sort the pattern contents. pattern: Pattern whose name we want to mangle.
Default True. Useful for reproducible builds. dose_multiplier: Dose multiplier to mangle with.
sort_elements: Whether to sort the pattern contents. Requires sort=True to run.
allow_dangling_refs: If `None` (default), warns about any refs to patterns that are not
in the provided library. If `True`, no check is performed; if `False`, a `LibraryError`
is raised instead.
allow_named_layers: If `False`, raises a `PatternError` if any layer is referred to by
a string instead of a number (or tuple).
prune_empty_patterns: Runs `Library.prune_empty()`, recursively deleting any empty patterns.
wrap_repeated_shapes: Runs `Library.wrap_repeated_shapes()`, turning repeated shapes into
repeated refs containing non-repeated shapes.
Returns:
`lib` or an equivalent sorted library
"""
if sort:
lib = Library(dict(sorted(
(nn, pp.sort(sort_elements=sort_elements)) for nn, pp in lib.items()
)))
if not allow_dangling_refs:
refs = lib.referenced_patterns()
dangling = refs - set(lib.keys())
if dangling:
msg = 'Dangling refs found: ' + pformat(dangling)
if allow_dangling_refs is None:
logger.warning(msg)
else:
raise LibraryError(msg)
if not allow_named_layers:
named_layers: Mapping[str, set] = defaultdict(set)
for name, pat in lib.items():
for layer in chain(pat.shapes.keys(), pat.labels.keys()):
if isinstance(layer, str):
named_layers[name].add(layer)
named_layers = dict(named_layers)
if named_layers:
raise PatternError('Non-numeric layers found:' + pformat(named_layers))
if prune_empty_patterns:
pruned = lib.prune_empty()
if pruned:
logger.info(f'Preflight pruned {len(pruned)} empty patterns')
logger.debug('Pruned: ' + pformat(pruned))
else:
logger.debug('Preflight found no empty patterns')
if wrap_repeated_shapes:
lib.wrap_repeated_shapes()
return lib
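A hedged usage sketch, run just before writing a library out to disk (the keyword values shown are one plausible configuration, not a recommendation from the source):

```python
lib = preflight(
    lib,
    prune_empty_patterns=True,     # drop empty cells recursively
    wrap_repeated_shapes=True,     # turn repeated shapes into repeated refs
    allow_named_layers=False,      # fail fast if any string layers remain
)
```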
def mangle_name(name: str) -> str:
"""
Sanitize a name.
Args:
name: Name we want to mangle.
Returns: Returns:
Mangled name. Mangled name.
""" """
expression = re.compile(r'[^A-Za-z0-9_\?\$]') expression = re.compile(r'[^A-Za-z0-9_\?\$]')
sanitized_name = expression.sub('_', name) full_name = '{}_{}_{}'.format(pattern.name, dose_multiplier, id(pattern))
sanitized_name = expression.sub('_', full_name)
return sanitized_name return sanitized_name
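For instance, assuming the newer single-argument form shown on the left side of this diff:

```python
mangle_name('ring(r=5um)')   # -> 'ring_r_5um_'
```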
@@ -114,39 +38,149 @@ def clean_pattern_vertices(pat: Pattern) -> Pattern:
Returns: Returns:
pat pat
""" """
for shapes in pat.shapes.values():
remove_inds = [] remove_inds = []
for ii, shape in enumerate(shapes): for ii, shape in enumerate(pat.shapes):
if not isinstance(shape, Polygon | Path): if not isinstance(shape, (Polygon, Path)):
continue continue
try: try:
shape.clean_vertices() shape.clean_vertices()
except PatternError: except PatternError:
remove_inds.append(ii) remove_inds.append(ii)
for ii in sorted(remove_inds, reverse=True): for ii in sorted(remove_inds, reverse=True):
del shapes[ii] del pat.shapes[ii]
return pat return pat
def make_dose_table(patterns: List[Pattern], dose_multiplier: float = 1.0) -> Set[Tuple[int, float]]:
"""
Create a set containing `(id(pat), written_dose)` for each pattern (including subpatterns)
Args:
patterns: Source Patterns.
dose_multiplier: Multiplier for all written_dose entries.
Returns:
`{(id(subpat.pattern), written_dose), ...}`
"""
dose_table = {(id(pattern), dose_multiplier) for pattern in patterns}
for pattern in patterns:
for subpat in pattern.subpatterns:
if subpat.pattern is None:
continue
subpat_dose_entry = (id(subpat.pattern), subpat.dose * dose_multiplier)
if subpat_dose_entry not in dose_table:
subpat_dose_table = make_dose_table([subpat.pattern], subpat.dose * dose_multiplier)
dose_table = dose_table.union(subpat_dose_table)
return dose_table
def dtype2dose(pattern: Pattern) -> Pattern:
"""
For each shape in the pattern, if the layer is a tuple, set the
layer to the tuple's first element and set the dose to the
tuple's second element.
Generally intended for use with `Pattern.apply()`.
Args:
pattern: Pattern to modify
Returns:
pattern
"""
for shape in pattern.shapes:
if isinstance(shape.layer, tuple):
shape.dose = shape.layer[1]
shape.layer = shape.layer[0]
return pattern
def dose2dtype(
patterns: List[Pattern],
) -> Tuple[List[Pattern], List[float]]:
"""
For each shape in each pattern, set shape.layer to the tuple
(base_layer, datatype), where:
layer is chosen to be equal to the original shape.layer if it is an int,
or shape.layer[0] if it is a tuple. `str` layers raise a PatternError.
datatype is chosen arbitrarily, based on calculated dose for each shape.
Shapes with equal calculated dose will have the same datatype.
A list of doses is returned, providing a mapping between datatype
(list index) and dose (list entry).
Note that this function modifies the input Pattern(s).
Args:
patterns: A `Pattern` or list of patterns to write to file. Modified by this function.
Returns:
(patterns, dose_list)
patterns: modified input patterns
dose_list: A list of doses, providing a mapping between datatype (int, list index)
and dose (float, list entry).
"""
# Get a dict of id(pattern) -> pattern
patterns_by_id = {id(pattern): pattern for pattern in patterns}
for pattern in patterns:
for i, p in pattern.referenced_patterns_by_id().items():
patterns_by_id[i] = p
# Get a table of (id(pat), written_dose) for each pattern and subpattern
sd_table = make_dose_table(patterns)
# Figure out all the unique doses necessary to write this pattern
# This means going through each row in sd_table and adding the dose values needed to write
# that subpattern at that dose level
dose_vals = set()
for pat_id, pat_dose in sd_table:
pat = patterns_by_id[pat_id]
for shape in pat.shapes:
dose_vals.add(shape.dose * pat_dose)
if len(dose_vals) > 256:
raise PatternError('Too many dose values: {}, maximum 256 when using dtypes.'.format(len(dose_vals)))
dose_vals_list = list(dose_vals)
# Create a new pattern for each non-1-dose entry in the dose table
# and update the shapes to reflect their new dose
new_pats = {} # (id, dose) -> new_pattern mapping
for pat_id, pat_dose in sd_table:
if pat_dose == 1:
new_pats[(pat_id, pat_dose)] = patterns_by_id[pat_id]
continue
old_pat = patterns_by_id[pat_id]
pat = old_pat.copy() # keep old subpatterns
pat.shapes = copy.deepcopy(old_pat.shapes)
pat.labels = copy.deepcopy(old_pat.labels)
encoded_name = mangle_name(pat, pat_dose)
if len(encoded_name) == 0:
raise PatternError('Zero-length name after mangle+encode, originally "{}"'.format(pat.name))
pat.name = encoded_name
for shape in pat.shapes:
data_type = dose_vals_list.index(shape.dose * pat_dose)
if isinstance(shape.layer, int):
shape.layer = (shape.layer, data_type)
elif isinstance(shape.layer, tuple):
shape.layer = (shape.layer[0], data_type)
else:
raise PatternError(f'Invalid layer for gdsii: {shape.layer}')
new_pats[(pat_id, pat_dose)] = pat
# Go back through all the dose-specific patterns and fix up their subpattern entries
for (pat_id, pat_dose), pat in new_pats.items():
for subpat in pat.subpatterns:
dose_mult = subpat.dose * pat_dose
subpat.pattern = new_pats[(id(subpat.pattern), dose_mult)]
return patterns, dose_vals_list
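A usage sketch (the input `patterns` are assumed to already carry per-shape doses):

```python
patterns, dose_list = dose2dtype(patterns)
# dose_list[datatype] now gives the dose encoded by each (layer, datatype) pair
```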
def is_gzipped(path: pathlib.Path) -> bool: def is_gzipped(path: pathlib.Path) -> bool:
with path.open('rb') as stream: with open(path, 'rb') as stream:
magic_bytes = stream.read(2) magic_bytes = stream.read(2)
return magic_bytes == b'\x1f\x8b' return magic_bytes == b'\x1f\x8b'
@contextmanager
def tmpfile(path: str | pathlib.Path) -> Iterator[IO[bytes]]:
"""
Context manager which allows you to write to a temporary file,
and move that file into its final location only after the write
has finished.
"""
path = pathlib.Path(path)
suffixes = ''.join(path.suffixes)
with tempfile.NamedTemporaryFile(suffix=suffixes, delete=False) as tmp_stream:
yield tmp_stream
try:
shutil.move(tmp_stream.name, path)
finally:
pathlib.Path(tmp_stream.name).unlink(missing_ok=True)
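A hedged usage sketch; the payload written here is only a placeholder:

```python
with tmpfile('layout.oas') as stream:
    stream.write(b'...')   # hand `stream` to any writer that accepts a binary file object
# the finished file is moved to 'layout.oas' only after the block exits cleanly
```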
@@ -1,30 +1,31 @@
from typing import Self, Any from typing import Tuple, Dict, Optional, TypeVar
import copy import copy
import functools
import numpy import numpy
from numpy.typing import ArrayLike, NDArray from numpy.typing import ArrayLike, NDArray
from .repetition import Repetition from .repetition import Repetition
from .utils import rotation_matrix_2d, annotations_t, annotations_eq, annotations_lt, rep2key from .utils import rotation_matrix_2d, layer_t, AutoSlots, annotations_t
from .traits import PositionableImpl, Copyable, Pivotable, RepeatableImpl, Bounded from .traits import PositionableImpl, LayerableImpl, Copyable, Pivotable, LockableImpl, RepeatableImpl
from .traits import AnnotatableImpl from .traits import AnnotatableImpl
@functools.total_ordering L = TypeVar('L', bound='Label')
class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotable, Copyable):
class Label(PositionableImpl, LayerableImpl, LockableImpl, RepeatableImpl, AnnotatableImpl,
Pivotable, Copyable, metaclass=AutoSlots):
""" """
A text annotation with a position (but no size; it is not drawn) A text annotation with a position and layer (but no size; it is not drawn)
""" """
__slots__ = ( __slots__ = ( '_string', 'identifier')
'_string',
# Inherited
'_offset', '_repetition', '_annotations',
)
_string: str _string: str
""" Label string """ """ Label string """
identifier: Tuple
""" Arbitrary identifier tuple, useful for keeping track of history when flattening """
''' '''
---- Properties ---- Properties
''' '''
@@ -45,45 +46,38 @@ class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotabl
string: str, string: str,
*, *,
offset: ArrayLike = (0.0, 0.0), offset: ArrayLike = (0.0, 0.0),
repetition: Repetition | None = None, layer: layer_t = 0,
annotations: annotations_t | None = None, repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
identifier: Tuple = (),
) -> None: ) -> None:
LockableImpl.unlock(self)
self.identifier = identifier
self.string = string self.string = string
self.offset = numpy.array(offset, dtype=float) self.offset = numpy.array(offset, dtype=float, copy=True)
self.layer = layer
self.repetition = repetition self.repetition = repetition
self.annotations = annotations if annotations is not None else {} self.annotations = annotations if annotations is not None else {}
self.set_locked(locked)
def __copy__(self) -> Self: def __copy__(self: L) -> L:
return type(self)( return type(self)(string=self.string,
string=self.string,
offset=self.offset.copy(), offset=self.offset.copy(),
layer=self.layer,
repetition=self.repetition, repetition=self.repetition,
) locked=self.locked,
identifier=self.identifier)
def __deepcopy__(self, memo: dict | None = None) -> Self: def __deepcopy__(self: L, memo: Dict = None) -> L:
memo = {} if memo is None else memo memo = {} if memo is None else memo
new = copy.copy(self) new = copy.copy(self)
LockableImpl.unlock(new)
new._offset = self._offset.copy() new._offset = self._offset.copy()
new.set_locked(self.locked)
return new return new
def __lt__(self, other: 'Label') -> bool: def rotate_around(self: L, pivot: ArrayLike, rotation: float) -> L:
if self.string != other.string:
return self.string < other.string
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def __eq__(self, other: Any) -> bool:
return (
self.string == other.string
and numpy.array_equal(self.offset, other.offset)
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self:
""" """
Rotate the label around a point. Rotate the label around a point.
@@ -94,13 +88,13 @@ class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotabl
Returns: Returns:
self self
""" """
pivot = numpy.asarray(pivot, dtype=float) pivot = numpy.array(pivot, dtype=float)
self.translate(-pivot) self.translate(-pivot)
self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset) self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset)
self.translate(+pivot) self.translate(+pivot)
return self return self
def get_bounds_single(self) -> NDArray[numpy.float64]: def get_bounds(self) -> NDArray[numpy.float64]:
""" """
Return the bounds of the label. Return the bounds of the label.
@@ -112,3 +106,17 @@ class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotabl
Bounds [[xmin, xmax], [ymin, ymax]] Bounds [[xmin, xmax], [ymin, ymax]]
""" """
return numpy.array([self.offset, self.offset]) return numpy.array([self.offset, self.offset])
def lock(self: L) -> L:
PositionableImpl._lock(self)
LockableImpl.lock(self)
return self
def unlock(self: L) -> L:
LockableImpl.unlock(self)
PositionableImpl._unlock(self)
return self
def __repr__(self) -> str:
locked = ' L' if self.locked else ''
return f'<Label "{self.string}" l{self.layer} o{self.offset}{locked}>'
File diff suppressed because it is too large
@@ -0,0 +1,2 @@
from .library import Library, PatternGenerator
from .device_library import DeviceLibrary, LibDeviceLibrary
@@ -0,0 +1,298 @@
"""
DeviceLibrary class for managing unique name->device mappings and
deferred loading or creation.
"""
from typing import Dict, Callable, TypeVar, TYPE_CHECKING
from typing import Any, Tuple, Union, Iterator
import logging
from pprint import pformat
from ..error import DeviceLibraryError
from ..library import Library
from ..builder import Device
from .. import Pattern
logger = logging.getLogger(__name__)
D = TypeVar('D', bound='DeviceLibrary')
L = TypeVar('L', bound='LibDeviceLibrary')
class DeviceLibrary:
"""
This class maps names to functions which generate or load the
relevant `Device` object.
This class largely functions the same way as `Library`, but
operates on `Device`s rather than `Patterns` and thus has no
need for distinctions between primary/secondary devices (as
there is no inter-`Device` hierarchy).
Each device is cached the first time it is used. The cache can
be disabled by setting the `enable_cache` attribute to `False`.
"""
generators: Dict[str, Callable[[], Device]]
cache: Dict[Union[str, Tuple[str, str]], Device]
enable_cache: bool = True
def __init__(self) -> None:
self.generators = {}
self.cache = {}
def __setitem__(self, key: str, value: Callable[[], Device]) -> None:
self.generators[key] = value
if key in self.cache:
del self.cache[key]
def __delitem__(self, key: str) -> None:
del self.generators[key]
if key in self.cache:
del self.cache[key]
def __getitem__(self, key: str) -> Device:
if self.enable_cache and key in self.cache:
logger.debug(f'found {key} in cache')
return self.cache[key]
logger.debug(f'loading {key}')
dev = self.generators[key]()
self.cache[key] = dev
return dev
def __iter__(self) -> Iterator[str]:
return iter(self.keys())
def __contains__(self, key: str) -> bool:
return key in self.generators
def keys(self) -> Iterator[str]:
return iter(self.generators.keys())
def values(self) -> Iterator[Device]:
return iter(self[key] for key in self.keys())
def items(self) -> Iterator[Tuple[str, Device]]:
return iter((key, self[key]) for key in self.keys())
def __repr__(self) -> str:
return '<DeviceLibrary with keys ' + repr(list(self.generators.keys())) + '>'
def set_const(self, const: Device) -> None:
"""
Convenience function to avoid having to manually wrap
already-generated Device objects into callables.
Args:
const: Pre-generated device object
"""
self.generators[const.pattern.name] = lambda: const
def add(
self: D,
other: D,
use_ours: Callable[[str], bool] = lambda name: False,
use_theirs: Callable[[str], bool] = lambda name: False,
) -> D:
"""
Add keys from another library into this one.
There must be no conflicting keys.
Args:
other: The library to insert keys from
use_ours: Decision function for name conflicts. Will be called with duplicate cell names.
Should return `True` if the value from `self` should be used.
use_theirs: Decision function for name conflicts. Same format as `use_ours`.
Should return `True` if the value from `other` should be used.
`use_ours` takes priority over `use_theirs`.
Returns:
self
"""
duplicates = set(self.keys()) & set(other.keys())
keep_ours = set(name for name in duplicates if use_ours(name))
keep_theirs = set(name for name in duplicates - keep_ours if use_theirs(name))
conflicts = duplicates - keep_ours - keep_theirs
if conflicts:
raise DeviceLibraryError('Duplicate keys encountered in DeviceLibrary merge: '
+ pformat(conflicts))
for name in set(other.generators.keys()) - keep_ours:
self.generators[name] = other.generators[name]
if name in other.cache:
self.cache[name] = other.cache[name]
return self
def clear_cache(self: D) -> D:
"""
Clear the cache of this library.
This is usually used before modifying or deleting cells, e.g. when merging
with another library.
Returns:
self
"""
self.cache = {}
return self
def add_device(
self,
name: str,
fn: Callable[[], Device],
dev2pat: Callable[[Device], Pattern],
prefix: str = '',
) -> None:
"""
Convenience function for adding a device to the library.
- The device is generated with the provided `fn()`
- Port info is written to the pattern using the provided dev2pat
- The pattern is renamed to match the provided `prefix + name`
- If `prefix` is non-empty, a wrapped copy is also added, named
`name` (no prefix). See `wrap_device()` for details.
Adding devices with this function helps to
- Make sure Pattern names are reflective of what the devices are named
- Ensure port info is written into the `Pattern`, so that the `Device`
can be reconstituted from the layout.
- Simplify adding a prefix to all device names, to make it easier to
track their provenance and purpose, while also allowing for
generic device names which can later be swapped out with different
underlying implementations.
Args:
name: Base name for the device. If a prefix is used, this is the
"generic" name (e.g. "L3_cavity" vs "2022_02_02_L3_cavity").
fn: Function which is called to generate the device.
dev2pat: Post-processing function which is called to add the port
info into the device's pattern.
prefix: If present, the actual device is named `prefix + name`, and
a second device with name `name` is also added (containing only
this one).
"""
def build_dev() -> Device:
dev = fn()
dev.pattern = dev2pat(dev)
dev.pattern.rename(prefix + name)
return dev
self[prefix + name] = build_dev
if prefix:
self.wrap_device(name, prefix + name)
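A hedged usage sketch; `make_l3_cavity` and `ports_to_data` are hypothetical user-supplied callables, not part of masque:

```python
lib = DeviceLibrary()
lib.add_device(
    name='L3_cavity',
    fn=make_l3_cavity,        # () -> Device
    dev2pat=ports_to_data,    # Device -> Pattern with port info drawn in
    prefix='20220202_',
)
dev = lib['20220202_L3_cavity']   # generated and cached on first access
alias = lib['L3_cavity']          # wrapper cell referencing the prefixed device
```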
def wrap_device(
self,
name: str,
old_name: str,
) -> None:
"""
Create a new device which simply contains an instance of an already-existing device.
This is useful for assigning an alternate name to a device, while still keeping
the original name available for traceability.
Args:
name: Name for the wrapped device.
old_name: Name of the existing device to wrap.
"""
def build_wrapped_dev() -> Device:
old_dev = self[old_name]
wrapper = Pattern(name=name)
wrapper.addsp(old_dev.pattern)
return Device(wrapper, old_dev.ports)
self[name] = build_wrapped_dev
class LibDeviceLibrary(DeviceLibrary):
"""
Extends `DeviceLibrary`, enabling it to ingest `Library` objects
(e.g. obtained by loading a GDS file).
Each `Library` object must be accompanied by a `pat2dev` function,
which takes in the `Pattern` and returns a full `Device` (including
port info). This is usually accomplished by scanning the `Pattern` for
port-related geometry, but could also bake in external info.
`Library` objects are ingested into `underlying`, which is a
`Library` which is kept in sync with the `DeviceLibrary` when
devices are removed (or new libraries added via `add_library()`).
"""
underlying: Library
def __init__(self) -> None:
DeviceLibrary.__init__(self)
self.underlying = Library()
def __setitem__(self, key: str, value: Callable[[], Device]) -> None:
self.generators[key] = value
if key in self.cache:
del self.cache[key]
# If any `Library` that has been (or will be) added has an entry for `key`,
# it will be added to `self.underlying` and then returned by it during subpattern
# resolution for other entries, and will conflict with the name for our
# wrapped device. To avoid that, we need to set ourselves as the "true" source of
# the `Pattern` named `key`.
if key in self.underlying:
raise DeviceLibraryError(f'Device name {key} already exists in underlying Library!'
' Demote or delete it first.')
# NOTE that this means the `Device` may be cached without the `Pattern` being in
# the `underlying` cache yet!
self.underlying.set_value(key, '__DeviceLibrary', lambda: self[key].pattern)
def __delitem__(self, key: str) -> None:
DeviceLibrary.__delitem__(self, key)
if key in self.underlying:
del self.underlying[key]
def add_library(
self: L,
lib: Library,
pat2dev: Callable[[Pattern], Device],
use_ours: Callable[[Union[str, Tuple[str, str]]], bool] = lambda name: False,
use_theirs: Callable[[Union[str, Tuple[str, str]]], bool] = lambda name: False,
) -> L:
"""
Add a pattern `Library` into this `LibDeviceLibrary`.
This requires a `pat2dev` function which can transform each `Pattern`
into a `Device`. For example, this can be accomplished by scanning
the `Pattern` data for port location info or by looking up port info
based on the pattern name or other characteristics in a hardcoded or
user-supplied dictionary.
Args:
lib: Pattern library to add.
pat2dev: Function for transforming each `Pattern` object from `lib`
into a `Device` which will be returned by this device library.
use_ours: Decision function for name conflicts. Will be called with
duplicate cell names, and (name, tag) tuples from the underlying library.
Should return `True` if the value from `self` should be used.
use_theirs: Decision function for name conflicts. Same format as `use_ours`.
Should return `True` if the value from `other` should be used.
`use_ours` takes priority over `use_theirs`.
Returns:
self
"""
duplicates = set(lib.keys()) & set(self.keys())
keep_ours = set(name for name in duplicates if use_ours(name))
keep_theirs = set(name for name in duplicates - keep_ours if use_theirs(name))
bad_duplicates = duplicates - keep_ours - keep_theirs
if bad_duplicates:
raise DeviceLibraryError('Duplicate devices (no action specified): ' + pformat(bad_duplicates))
# No 'bad' duplicates, so all duplicates should be overwritten
for name in keep_theirs:
self.underlying.demote(name)
self.underlying.add(lib, use_ours, use_theirs)
for name in lib:
self.generators[name] = lambda name=name: pat2dev(self.underlying[name])
return self
355
masque/library/library.py Normal file
@@ -0,0 +1,355 @@
"""
Library class for managing unique name->pattern mappings and
deferred loading or creation.
"""
from typing import Dict, Callable, TypeVar, TYPE_CHECKING
from typing import Any, Tuple, Union, Iterator
import logging
from pprint import pformat
from dataclasses import dataclass
import copy
from ..error import LibraryError
if TYPE_CHECKING:
from ..pattern import Pattern
logger = logging.getLogger(__name__)
@dataclass
class PatternGenerator:
__slots__ = ('tag', 'gen')
tag: str
""" Unique identifier for the source """
gen: Callable[[], 'Pattern']
""" Function which generates a pattern when called """
L = TypeVar('L', bound='Library')
class Library:
"""
This class is usually used to create a library of Patterns by mapping names to
functions which generate or load the relevant `Pattern` object as-needed.
Generated/loaded patterns can have "symbolic" references, where a SubPattern
object `sp` has a `None`-valued `sp.pattern` attribute, in which case the
Library expects `sp.identifier[0]` to contain a string which specifies the
referenced pattern's name.
Patterns can either be "primary" (default) or "secondary". Both get the
same deferred-load behavior, but "secondary" patterns may have conflicting
names and are not accessible through basic []-indexing. They are only used
to fill symbolic references in cases where there is no "primary" pattern
available, and only if both the referencing and referenced pattern-generators'
`tag` values match (i.e., only if they came from the same source).
Primary patterns can be turned into secondary patterns with the `demote`
method, `promote` performs the reverse (secondary -> primary) operation.
The `set_const` and `set_value` methods provide an easy way to transparently
construct `PatternGenerator` objects and to directly create "secondary"
patterns.
The cache can be disabled by setting the `enable_cache` attribute to `False`.
"""
primary: Dict[str, PatternGenerator]
secondary: Dict[Tuple[str, str], PatternGenerator]
cache: Dict[Union[str, Tuple[str, str]], 'Pattern']
enable_cache: bool = True
def __init__(self) -> None:
self.primary = {}
self.secondary = {}
self.cache = {}
def __setitem__(self, key: str, value: PatternGenerator) -> None:
self.primary[key] = value
if key in self.cache:
logger.warning(f'Replaced library item "{key}" & existing cache entry.'
' Previously-generated Pattern will *not* be updated!')
del self.cache[key]
def __delitem__(self, key: Union[str, Tuple[str, str]]) -> None:
if isinstance(key, str):
del self.primary[key]
elif isinstance(key, tuple):
del self.secondary[key]
if key in self.cache:
logger.warning(f'Deleting library item "{key}" & existing cache entry.'
' Previously-generated Pattern may remain in the wild!')
del self.cache[key]
def __getitem__(self, key: str) -> 'Pattern':
return self.get_primary(key)
def __iter__(self) -> Iterator[str]:
return iter(self.keys())
def __contains__(self, key: str) -> bool:
return key in self.primary
def get_primary(self, key: str) -> 'Pattern':
if self.enable_cache and key in self.cache:
logger.debug(f'found {key} in cache')
return self.cache[key]
logger.debug(f'loading {key}')
pg = self.primary[key]
pat = pg.gen()
self.resolve_subpatterns(pat, pg.tag)
self.cache[key] = pat
return pat
def get_secondary(self, key: str, tag: str) -> 'Pattern':
logger.debug(f'get_secondary({key}, {tag})')
key2 = (key, tag)
if self.enable_cache and key2 in self.cache:
return self.cache[key2]
pg = self.secondary[key2]
pat = pg.gen()
self.resolve_subpatterns(pat, pg.tag)
self.cache[key2] = pat
return pat
def set_secondary(self, key: str, tag: str, value: PatternGenerator) -> None:
self.secondary[(key, tag)] = value
if (key, tag) in self.cache:
logger.warning(f'Replaced library item "{key}" & existing cache entry.'
' Previously-generated Pattern will *not* be updated!')
del self.cache[(key, tag)]
def resolve_subpatterns(self, pat: 'Pattern', tag: str) -> 'Pattern':
logger.debug(f'Resolving subpatterns in {pat.name}')
for sp in pat.subpatterns:
if sp.pattern is not None:
continue
key = sp.identifier[0]
if key in self.primary:
sp.pattern = self.get_primary(key)
continue
if (key, tag) in self.secondary:
sp.pattern = self.get_secondary(key, tag)
continue
raise LibraryError(f'Broken reference to {key} (tag {tag})')
return pat
def keys(self) -> Iterator[str]:
return iter(self.primary.keys())
def values(self) -> Iterator['Pattern']:
return iter(self[key] for key in self.keys())
def items(self) -> Iterator[Tuple[str, 'Pattern']]:
return iter((key, self[key]) for key in self.keys())
def __repr__(self) -> str:
return '<Library with keys ' + repr(list(self.primary.keys())) + '>'
def set_const(
self,
key: str,
tag: Any,
const: 'Pattern',
secondary: bool = False,
) -> None:
"""
Convenience function to avoid having to manually wrap
constant values into callables.
Args:
key: Lookup key, usually the cell/pattern name
tag: Unique tag for the source, used to disambiguate secondary patterns
const: Pattern object to return
secondary: If True, this pattern is not accessible for normal lookup, and is
only used as a sub-component of other patterns if no non-secondary
equivalent is available.
"""
pg = PatternGenerator(tag=tag, gen=lambda: const)
if secondary:
self.secondary[(key, tag)] = pg
else:
self.primary[key] = pg
def set_value(
self,
key: str,
tag: str,
value: Callable[[], 'Pattern'],
secondary: bool = False,
) -> None:
"""
Convenience function to automatically build a PatternGenerator.
Args:
key: Lookup key, usually the cell/pattern name
tag: Unique tag for the source, used to disambiguate secondary patterns
value: Callable which takes no arguments and generates the `Pattern` object
secondary: If True, this pattern is not accessible for normal lookup, and is
only used as a sub-component of other patterns if no non-secondary
equivalent is available.
"""
pg = PatternGenerator(tag=tag, gen=value)
if secondary:
self.secondary[(key, tag)] = pg
else:
self.primary[key] = pg
def precache(self: L) -> L:
"""
Force all patterns into the cache
Returns:
self
"""
for key in self.primary:
_ = self.get_primary(key)
for key2 in self.secondary:
_ = self.get_secondary(*key2)
return self
def add(
self: L,
other: L,
use_ours: Callable[[Union[str, Tuple[str, str]]], bool] = lambda name: False,
use_theirs: Callable[[Union[str, Tuple[str, str]]], bool] = lambda name: False,
) -> L:
"""
Add keys from another library into this one.
Args:
other: The library to insert keys from
use_ours: Decision function for name conflicts.
May be called with cell names and (name, tag) tuples for primary or
secondary cells, respectively.
Should return `True` if the value from `self` should be used.
use_theirs: Decision function for name conflicts. Same format as `use_ours`.
Should return `True` if the value from `other` should be used.
`use_ours` takes priority over `use_theirs`.
Returns:
self
"""
duplicates1 = set(self.primary.keys()) & set(other.primary.keys())
duplicates2 = set(self.secondary.keys()) & set(other.secondary.keys())
keep_ours1 = set(name for name in duplicates1 if use_ours(name))
keep_ours2 = set(name for name in duplicates2 if use_ours(name))
keep_theirs1 = set(name for name in duplicates1 - keep_ours1 if use_theirs(name))
keep_theirs2 = set(name for name in duplicates2 - keep_ours2 if use_theirs(name))
conflicts1 = duplicates1 - keep_ours1 - keep_theirs1
conflicts2 = duplicates2 - keep_ours2 - keep_theirs2
if conflicts1:
raise LibraryError('Unresolved duplicate keys encountered in library merge: ' + pformat(conflicts1))
if conflicts2:
raise LibraryError('Unresolved duplicate secondary keys encountered in library merge: ' + pformat(conflicts2))
for key1 in set(other.primary.keys()) - keep_ours1:
self[key1] = other.primary[key1]
if key1 in other.cache:
self.cache[key1] = other.cache[key1]
for key2 in set(other.secondary.keys()) - keep_ours2:
self.set_secondary(*key2, other.secondary[key2])
if key2 in other.cache:
self.cache[key2] = other.cache[key2]
return self
def demote(self, key: str) -> None:
"""
Turn a primary pattern into a secondary one.
It will no longer be accessible through [] indexing and will only be used
when referenced by other patterns from the same source, and only if no primary
pattern with the same name exists.
Args:
key: Lookup key, usually the cell/pattern name
"""
pg = self.primary[key]
key2 = (key, pg.tag)
self.secondary[key2] = pg
if key in self.cache:
self.cache[key2] = self.cache[key]
del self[key]
def promote(self, key: str, tag: str) -> None:
"""
Turn a secondary pattern into a primary one.
It will become accessible through [] indexing and will be used to satisfy any
reference to a pattern with its key, regardless of tag.
Args:
key: Lookup key, usually the cell/pattern name
tag: Unique tag for identifying the pattern's source, used to disambiguate
secondary patterns
"""
if key in self.primary:
raise LibraryError(f'Promoting ({key}, {tag}), but {key} already exists in primary!')
key2 = (key, tag)
pg = self.secondary[key2]
self.primary[key] = pg
if key2 in self.cache:
self.cache[key] = self.cache[key2]
del self.secondary[key2]
del self.cache[key2]
def copy(self, preserve_cache: bool = False) -> 'Library':
"""
Create a copy of this `Library`.
A shallow copy is made of the contained dicts.
Note that you should probably clear the cache (with `clear_cache()`) after copying.
Returns:
A copy of self
"""
new = Library()
new.primary.update(self.primary)
new.secondary.update(self.secondary)
new.cache.update(self.cache)
return new
def clear_cache(self: L) -> L:
"""
Clear the cache of this library.
This is usually used before modifying or deleting cells, e.g. when merging
with another library.
Returns:
self
"""
self.cache = {}
return self
r"""
# Add a filter for names which aren't added
- Registration:
- scanned files (tag=filename, gen_fn[stream, {name: pos}])
- generator functions (tag='fn?', gen_fn[params])
- merge decision function (based on tag and cell name, can be "neither") ??? neither=keep both, load using same tag!
- Load process:
- file:
- read single cell
- check subpat identifiers, and load stuff recursively based on those. If not present, load from same file??
- function:
- generate cell
- traverse and check if we should load any subcells from elsewhere. replace if so.
* should fn generate subcells at all, or register those separately and have us control flow? maybe ask us and generate itself if not present?
- Scan all GDS files, save name -> (file, position). Keep the streams handy.
- Merge all names. This requires subcell merge because we don't know hierarchy.
- possibly include a "neither" option during merge, to deal with subcells. Means: just use parent's file.
"""

View File

@ -1,5 +1,4 @@
from typing import TypeVar, Generic from typing import Callable, TypeVar, Generic
from collections.abc import Callable
from functools import lru_cache from functools import lru_cache

File diff suppressed because it is too large

View File

@ -1,539 +0,0 @@
from typing import overload, Self, NoReturn, Any
from collections.abc import Iterable, KeysView, ValuesView, Mapping
import warnings
import traceback
import logging
import functools
from collections import Counter
from abc import ABCMeta, abstractmethod
from itertools import chain
import numpy
from numpy import pi
from numpy.typing import ArrayLike, NDArray
from .traits import PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable
from .utils import rotate_offsets_around
from .error import PortError
logger = logging.getLogger(__name__)
@functools.total_ordering
class Port(PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable):
"""
A point at which a `Device` can be snapped to another `Device`.
Each port has an `offset` ((x, y) position) and may also have a
`rotation` (orientation) and a `ptype` (port type).
The `rotation` is an angle, in radians, measured counterclockwise
from the +x axis, pointing inwards into the device which owns the port.
The rotation may be set to `None`, indicating that any orientation is
allowed (e.g. for a DC electrical port). It is stored modulo 2pi.
The `ptype` is an arbitrary string, default of `unk` (unknown).
"""
__slots__ = (
'ptype', '_rotation',
# inherited:
'_offset',
)
_rotation: float | None
""" radians counterclockwise from +x, pointing into device body.
Can be `None` to signify undirected port """
ptype: str
""" Port types must match to be plugged together if both are non-zero """
def __init__(
self,
offset: ArrayLike,
rotation: float | None,
ptype: str = 'unk',
) -> None:
self.offset = offset
self.rotation = rotation
self.ptype = ptype
@property
def rotation(self) -> float | None:
""" Rotation, radians counterclockwise, pointing into device body. Can be None. """
return self._rotation
@rotation.setter
def rotation(self, val: float | None) -> None:
if val is None:
self._rotation = None
else:
if not numpy.size(val) == 1:
raise PortError('Rotation must be a scalar')
self._rotation = val % (2 * pi)
@property
def x(self) -> float:
""" Alias for offset[0] """
return self.offset[0]
@x.setter
def x(self, val: float) -> None:
self.offset[0] = val
@property
def y(self) -> float:
""" Alias for offset[1] """
return self.offset[1]
@y.setter
def y(self, val: float) -> None:
self.offset[1] = val
def copy(self) -> Self:
return self.deepcopy()
def get_bounds(self) -> NDArray[numpy.float64]:
return numpy.vstack((self.offset, self.offset))
def set_ptype(self, ptype: str) -> Self:
""" Chainable setter for `ptype` """
self.ptype = ptype
return self
def mirror(self, axis: int = 0) -> Self:
self.offset[1 - axis] *= -1
if self.rotation is not None:
self.rotation *= -1
self.rotation += axis * pi
return self
def rotate(self, rotation: float) -> Self:
if self.rotation is not None:
self.rotation += rotation
return self
def set_rotation(self, rotation: float | None) -> Self:
self.rotation = rotation
return self
def __repr__(self) -> str:
if self.rotation is None:
rot = 'any'
else:
rot = str(numpy.rad2deg(self.rotation))
return f'<{self.offset}, {rot}, [{self.ptype}]>'
def __lt__(self, other: 'Port') -> bool:
if self.ptype != other.ptype:
return self.ptype < other.ptype
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.rotation != other.rotation:
if self.rotation is None:
return True
if other.rotation is None:
return False
return self.rotation < other.rotation
return False
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and self.ptype == other.ptype
and numpy.array_equal(self.offset, other.offset)
and self.rotation == other.rotation
)
class PortList(metaclass=ABCMeta):
__slots__ = () # Allow subclasses to use __slots__
@property
@abstractmethod
def ports(self) -> dict[str, Port]:
""" Uniquely-named ports which can be used to snap to other Device instances"""
pass
@ports.setter
@abstractmethod
def ports(self, value: dict[str, Port]) -> None:
pass
@overload
def __getitem__(self, key: str) -> Port:
pass
@overload
def __getitem__(self, key: list[str] | tuple[str, ...] | KeysView[str] | ValuesView[str]) -> dict[str, Port]:
pass
def __getitem__(self, key: str | Iterable[str]) -> Port | dict[str, Port]:
"""
For convenience, ports can be read out using square brackets:
- `pattern['A'] == Port((0, 0), 0)`
- ```
pattern[['A', 'B']] == {
'A': Port((0, 0), 0),
'B': Port((0, 0), pi),
}
```
"""
if isinstance(key, str):
return self.ports[key]
else: # noqa: RET505
return {k: self.ports[k] for k in key}
def __contains__(self, key: str) -> NoReturn:
raise NotImplementedError('PortsList.__contains__ is left unimplemented. Use `key in container.ports` instead.')
# NOTE: Didn't add keys(), items(), values(), __contains__(), etc.
# because it's weird on stuff like Pattern that contains other lists
# and because you can just grab .ports and use that instead
def mkport(
self,
name: str,
value: Port,
) -> Self:
"""
Create a port, raising a `PortError` if a port with the same name already exists.
Args:
name: Name for the port. A port with this name should not already exist.
value: The `Port` object to which `name` will refer.
Returns:
self
Raises:
`PortError` if the name already exists.
"""
if name in self.ports:
raise PortError(f'Port {name} already exists.')
assert name not in self.ports
self.ports[name] = value
return self
def rename_ports(
self,
mapping: dict[str, str | None],
overwrite: bool = False,
) -> Self:
"""
Renames ports as specified by `mapping`.
Ports can be explicitly deleted by mapping them to `None`.
Args:
mapping: dict of `{'old_name': 'new_name'}` pairs. Names can be mapped
to `None` to perform an explicit deletion. `'new_name'` can also
overwrite an existing non-renamed port to implicitly delete it if
`overwrite` is set to `True`.
overwrite: Allows implicit deletion of ports if set to `True`; see `mapping`.
Returns:
self
"""
if not overwrite:
duplicates = (set(self.ports.keys()) - set(mapping.keys())) & set(mapping.values())
if duplicates:
raise PortError(f'Unrenamed ports would be overwritten: {duplicates}')
renamed = {vv: self.ports.pop(kk) for kk, vv in mapping.items()}
if None in renamed:
del renamed[None]
self.ports.update(renamed) # type: ignore
return self
def add_port_pair(
self,
offset: ArrayLike = (0, 0),
rotation: float = 0.0,
names: tuple[str, str] = ('A', 'B'),
ptype: str = 'unk',
) -> Self:
"""
Add a pair of ports with opposing directions at the specified location.
Args:
offset: Location at which to add the ports
rotation: Orientation of the first port. Radians, counterclockwise.
Default 0.
names: Names for the two ports. Default 'A' and 'B'
ptype: Sets the port type for both ports.
Returns:
self
"""
new_ports = {
names[0]: Port(offset, rotation=rotation, ptype=ptype),
names[1]: Port(offset, rotation=rotation + pi, ptype=ptype),
}
self.check_ports(names)
self.ports.update(new_ports)
return self
def plugged(
self,
connections: dict[str, str],
) -> Self:
"""
Verify that the ports specified by `connections` are coincident and have opposing
rotations, then remove the ports.
This is used when ports have been "manually" aligned as part of some other routing,
but for whatever reason were not eliminated via `plug()`.
Args:
connections: Pairs of ports which "plug" each other (same offset, opposing directions)
Returns:
self
Raises:
`PortError` if the ports are not properly aligned.
"""
a_names, b_names = list(zip(*connections.items(), strict=True))
a_ports = [self.ports[pp] for pp in a_names]
b_ports = [self.ports[pp] for pp in b_names]
a_types = [pp.ptype for pp in a_ports]
b_types = [pp.ptype for pp in b_ports]
type_conflicts = numpy.array([at != bt and 'unk' not in (at, bt)
for at, bt in zip(a_types, b_types, strict=True)])
if type_conflicts.any():
msg = 'Ports have conflicting types:\n'
for nn, (k, v) in enumerate(connections.items()):
if type_conflicts[nn]:
msg += f'{k} | {a_types[nn]}:{b_types[nn]} | {v}\n'
msg = ''.join(traceback.format_stack()) + '\n' + msg
warnings.warn(msg, stacklevel=2)
a_offsets = numpy.array([pp.offset for pp in a_ports])
b_offsets = numpy.array([pp.offset for pp in b_ports])
a_rotations = numpy.array([pp.rotation if pp.rotation is not None else 0 for pp in a_ports])
b_rotations = numpy.array([pp.rotation if pp.rotation is not None else 0 for pp in b_ports])
a_has_rot = numpy.array([pp.rotation is not None for pp in a_ports], dtype=bool)
b_has_rot = numpy.array([pp.rotation is not None for pp in b_ports], dtype=bool)
has_rot = a_has_rot & b_has_rot
if has_rot.any():
rotations = numpy.mod(a_rotations - b_rotations - pi, 2 * pi)
rotations[~has_rot] = rotations[has_rot][0]
if not numpy.allclose(rotations, 0):
rot_deg = numpy.rad2deg(rotations)
msg = 'Port orientations do not match:\n'
for nn, (k, v) in enumerate(connections.items()):
if not numpy.isclose(rot_deg[nn], 0):
msg += f'{k} | {rot_deg[nn]:g} | {v}\n'
raise PortError(msg)
translations = a_offsets - b_offsets
if not numpy.allclose(translations, 0):
msg = 'Port translations do not match:\n'
for nn, (k, v) in enumerate(connections.items()):
if not numpy.allclose(translations[nn], 0):
msg += f'{k} | {translations[nn]} | {v}\n'
raise PortError(msg)
for pp in chain(a_names, b_names):
del self.ports[pp]
return self
def check_ports(
self,
other_names: Iterable[str],
map_in: dict[str, str] | None = None,
map_out: dict[str, str | None] | None = None,
) -> Self:
"""
Given the provided port mappings, check that:
- All of the ports specified in the mappings exist
- There are no duplicate port names after all the mappings are performed
Args:
other_names: List of port names being considered for inclusion into
`self.ports` (before mapping)
map_in: dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for unconnected `other_names` ports.
Returns:
self
Raises:
`PortError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other_names`.
`PortError` if there are any duplicate names after `map_in` and `map_out`
are applied.
"""
if map_in is None:
map_in = {}
if map_out is None:
map_out = {}
other = set(other_names)
missing_inkeys = set(map_in.keys()) - set(self.ports.keys())
if missing_inkeys:
raise PortError(f'`map_in` keys not present in device: {missing_inkeys}')
missing_invals = set(map_in.values()) - other
if missing_invals:
raise PortError(f'`map_in` values not present in other device: {missing_invals}')
missing_outkeys = set(map_out.keys()) - other
if missing_outkeys:
raise PortError(f'`map_out` keys not present in other device: {missing_outkeys}')
orig_remaining = set(self.ports.keys()) - set(map_in.keys())
other_remaining = other - set(map_out.keys()) - set(map_in.values())
mapped_vals = set(map_out.values())
mapped_vals.discard(None)
conflicts_final = orig_remaining & (other_remaining | mapped_vals)
if conflicts_final:
raise PortError(f'Device ports conflict with existing ports: {conflicts_final}')
conflicts_partial = other_remaining & mapped_vals
if conflicts_partial:
raise PortError(f'`map_out` targets conflict with non-mapped outputs: {conflicts_partial}')
map_out_counts = Counter(map_out.values())
map_out_counts[None] = 0
conflicts_out = {k for k, v in map_out_counts.items() if v > 1}
if conflicts_out:
raise PortError(f'Duplicate targets in `map_out`: {conflicts_out}')
return self
def find_transform(
self,
other: 'PortList',
map_in: dict[str, str],
*,
mirrored: bool = False,
set_rotation: bool | None = None,
) -> tuple[NDArray[numpy.float64], float, NDArray[numpy.float64]]:
"""
Given a device `other` and a mapping `map_in` specifying port connections,
find the transform which will correctly align the specified ports.
Args:
other: a device
map_in: dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
mirrored: Mirrors `other` across the x axis prior to
connecting any ports.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
Returns:
- The (x, y) translation (performed last)
- The rotation (radians, counterclockwise)
- The (x, y) pivot point for the rotation
The rotation should be performed before the translation.
"""
s_ports = self[map_in.keys()]
o_ports = other[map_in.values()]
return self.find_port_transform(
s_ports=s_ports,
o_ports=o_ports,
map_in=map_in,
mirrored=mirrored,
set_rotation=set_rotation,
)
@staticmethod
def find_port_transform(
s_ports: Mapping[str, Port],
o_ports: Mapping[str, Port],
map_in: dict[str, str],
*,
mirrored: bool = False,
set_rotation: float | None = None,
) -> tuple[NDArray[numpy.float64], float, NDArray[numpy.float64]]:
"""
Given two sets of ports (s_ports and o_ports) and a mapping `map_in`
specifying port connections, find the transform which will correctly
align the specified o_ports onto their respective s_ports.
Args:
s_ports: Mapping of stationary ports
o_ports: Mapping of ports which are to be moved/mirrored.
map_in: dict of `{'s_port': 'o_port'}` mappings, specifying
port connections.
mirrored: Mirrors `o_ports` across the x axis prior to
connecting any ports.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `o_ports` should be rotated. Otherwise,
`set_rotation` must remain `None`.
Returns:
- The (x, y) translation (performed last)
- The rotation (radians, counterclockwise)
- The (x, y) pivot point for the rotation
The rotation should be performed before the translation.
"""
s_offsets = numpy.array([p.offset for p in s_ports.values()])
o_offsets = numpy.array([p.offset for p in o_ports.values()])
s_types = [p.ptype for p in s_ports.values()]
o_types = [p.ptype for p in o_ports.values()]
s_rotations = numpy.array([p.rotation if p.rotation is not None else 0 for p in s_ports.values()])
o_rotations = numpy.array([p.rotation if p.rotation is not None else 0 for p in o_ports.values()])
s_has_rot = numpy.array([p.rotation is not None for p in s_ports.values()], dtype=bool)
o_has_rot = numpy.array([p.rotation is not None for p in o_ports.values()], dtype=bool)
has_rot = s_has_rot & o_has_rot
if mirrored:
o_offsets[:, 1] *= -1
o_rotations *= -1
type_conflicts = numpy.array([st != ot and 'unk' not in (st, ot)
for st, ot in zip(s_types, o_types, strict=True)])
if type_conflicts.any():
msg = 'Ports have conflicting types:\n'
for nn, (k, v) in enumerate(map_in.items()):
if type_conflicts[nn]:
msg += f'{k} | {s_types[nn]}:{o_types[nn]} | {v}\n'
msg = ''.join(traceback.format_stack()) + '\n' + msg
warnings.warn(msg, stacklevel=2)
rotations = numpy.mod(s_rotations - o_rotations - pi, 2 * pi)
if not has_rot.any():
if set_rotation is None:
raise PortError('Must provide set_rotation if rotation is indeterminate')
rotations[:] = set_rotation
else:
rotations[~has_rot] = rotations[has_rot][0]
if not numpy.allclose(rotations[:1], rotations):
rot_deg = numpy.rad2deg(rotations)
msg = 'Port orientations do not match:\n'
for nn, (k, v) in enumerate(map_in.items()):
msg += f'{k} | {rot_deg[nn]:g} | {v}\n'
raise PortError(msg)
pivot = o_offsets[0].copy()
rotate_offsets_around(o_offsets, pivot, rotations[0])
translations = s_offsets - o_offsets
if not numpy.allclose(translations[:1], translations):
msg = 'Port translations do not match:\n'
for nn, (k, v) in enumerate(map_in.items()):
msg += f'{k} | {translations[nn]} | {v}\n'
raise PortError(msg)
return translations[0], rotations[0], o_offsets[0]
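As a small sketch of the `Port` behavior documented above (the import path is assumed, and the printed `repr` is approximate):

```python
from numpy import pi
from masque.ports import Port   # module path assumed

p_in = Port((0, 0), rotation=0, ptype='optical')
p_out = Port((10, 0), rotation=pi, ptype='optical')

p_out.rotate(pi / 2)   # rotation is stored modulo 2*pi
p_out.mirror(axis=0)   # mirror across x: negates y and the rotation
print(p_in < p_out)    # ports order by (ptype, offset, rotation)
print(p_in)            # something like <[0. 0.], 0.0, [optical]>
```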

View File

@ -1,236 +0,0 @@
"""
Ref provides basic support for nesting Pattern objects within each other.
It carries offset, rotation, mirroring, and scaling data for each individual instance.
"""
from typing import TYPE_CHECKING, Self, Any
from collections.abc import Mapping
import copy
import functools
import numpy
from numpy import pi
from numpy.typing import NDArray, ArrayLike
from .utils import annotations_t, rotation_matrix_2d, annotations_eq, annotations_lt, rep2key
from .repetition import Repetition
from .traits import (
PositionableImpl, RotatableImpl, ScalableImpl,
Mirrorable, PivotableImpl, Copyable, RepeatableImpl, AnnotatableImpl,
)
if TYPE_CHECKING:
from . import Pattern
@functools.total_ordering
class Ref(
PositionableImpl, RotatableImpl, ScalableImpl, Mirrorable,
PivotableImpl, Copyable, RepeatableImpl, AnnotatableImpl,
):
"""
`Ref` provides basic support for nesting Pattern objects within each other.
It contains the transformation (mirror, rotation, scale, offset, repetition)
and annotations for a single instantiation of a `Pattern`.
Note that the target (i.e. which pattern a `Ref` instantiates) is not stored within the
`Ref` itself, but is specified by the containing `Pattern`.
Order of operations is (mirror, rotate, scale, translate, repeat).
"""
__slots__ = (
'_mirrored',
# inherited
'_offset', '_rotation', 'scale', '_repetition', '_annotations',
)
_mirrored: bool
""" Whether to mirror the instance across the x axis (new_y = -old_y)ubefore rotating. """
# Mirrored property
@property
def mirrored(self) -> bool: # mypy#3004, setter should be SupportsBool
return self._mirrored
@mirrored.setter
def mirrored(self, val: bool) -> None:
self._mirrored = bool(val)
def __init__(
self,
*,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0,
mirrored: bool = False,
scale: float = 1.0,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
) -> None:
"""
Note: Order is (mirror, rotate, scale, translate, repeat)
Args:
offset: (x, y) offset applied to the referenced pattern. Not affected by rotation etc.
rotation: Rotation (radians, counterclockwise) relative to the referenced pattern's (0, 0).
mirrored: Whether to mirror the referenced pattern across its x axis before rotating.
scale: Scaling factor applied to the pattern's geometry.
repetition: `Repetition` object, default `None`
annotations: Dictionary of annotations, default `None` (treated as empty)
"""
self.offset = offset
self.rotation = rotation
self.scale = scale
self.mirrored = mirrored
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
def __copy__(self) -> 'Ref':
new = Ref(
offset=self.offset.copy(),
rotation=self.rotation,
scale=self.scale,
mirrored=self.mirrored,
repetition=copy.deepcopy(self.repetition),
annotations=copy.deepcopy(self.annotations),
)
return new
def __deepcopy__(self, memo: dict | None = None) -> 'Ref':
memo = {} if memo is None else memo
new = copy.copy(self)
#new.repetition = copy.deepcopy(self.repetition, memo)
#new.annotations = copy.deepcopy(self.annotations, memo)
return new
def __lt__(self, other: 'Ref') -> bool:
if (self.offset != other.offset).any():
return tuple(self.offset) < tuple(other.offset)
if self.mirrored != other.mirrored:
return self.mirrored < other.mirrored
if self.rotation != other.rotation:
return self.rotation < other.rotation
if self.scale != other.scale:
return self.scale < other.scale
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def __eq__(self, other: Any) -> bool:
return (
numpy.array_equal(self.offset, other.offset)
and self.mirrored == other.mirrored
and self.rotation == other.rotation
and self.scale == other.scale
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def as_pattern(
self,
pattern: 'Pattern',
) -> 'Pattern':
"""
Args:
pattern: Pattern object to transform
Returns:
A copy of the referenced Pattern which has been scaled, rotated, etc.
according to this `Ref`'s properties.
"""
pattern = pattern.deepcopy()
if self.scale != 1:
pattern.scale_by(self.scale)
if self.mirrored:
pattern.mirror()
if self.rotation % (2 * pi) != 0:
pattern.rotate_around((0.0, 0.0), self.rotation)
if numpy.any(self.offset):
pattern.translate_elements(self.offset)
if self.repetition is not None:
combined = type(pattern)()
for dd in self.repetition.displacements:
temp_pat = pattern.deepcopy()
temp_pat.ports = {}
temp_pat.translate_elements(dd)
combined.append(temp_pat)
pattern = combined
return pattern
def rotate(self, rotation: float) -> Self:
self.rotation += rotation
if self.repetition is not None:
self.repetition.rotate(rotation)
return self
def mirror(self, axis: int = 0) -> Self:
self.mirror_target(axis)
self.rotation *= -1
if self.repetition is not None:
self.repetition.mirror(axis)
return self
def mirror_target(self, axis: int = 0) -> Self:
self.mirrored = not self.mirrored
self.rotation += axis * pi
return self
def mirror2d_target(self, across_x: bool = False, across_y: bool = False) -> Self:
self.mirrored = bool((self.mirrored + across_x + across_y) % 2)
if across_y:
self.rotation += pi
return self
def as_transforms(self) -> NDArray[numpy.float64]:
xys = self.offset[None, :]
if self.repetition is not None:
xys = xys + self.repetition.displacements
transforms = numpy.empty((xys.shape[0], 4))
transforms[:, :2] = xys
transforms[:, 2] = self.rotation
transforms[:, 3] = self.mirrored
return transforms
def get_bounds_single(
self,
pattern: 'Pattern',
*,
library: Mapping[str, 'Pattern'] | None = None,
) -> NDArray[numpy.float64] | None:
"""
Return a `numpy.ndarray` containing `[[x_min, y_min], [x_max, y_max]]`, corresponding to the
extent of the `Ref` in each dimension.
Returns `None` if the contained `Pattern` is empty.
Args:
library: Name-to-Pattern mapping, used to resolve references when computing bounds
Returns:
`[[x_min, y_min], [x_max, y_max]]` or `None`
"""
if pattern.is_empty():
# no need to run as_pattern()
return None
# if rotation is manhattan, can take pattern's bounds and transform them
if numpy.isclose(self.rotation % (pi / 2), 0):
unrot_bounds = pattern.get_bounds(library)
if unrot_bounds is None:
return None
if self.mirrored:
unrot_bounds[:, 1] *= -1
corners = (rotation_matrix_2d(self.rotation) @ unrot_bounds.T).T
bounds = numpy.vstack((numpy.min(corners, axis=0),
numpy.max(corners, axis=0))) * self.scale + [self.offset]
return bounds
return self.as_pattern(pattern=pattern).get_bounds(library)
def __repr__(self) -> str:
rotation = f' r{numpy.rad2deg(self.rotation):g}' if self.rotation != 0 else ''
scale = f' d{self.scale:g}' if self.scale != 1 else ''
mirrored = ' m' if self.mirrored else ''
return f'<Ref {self.offset}{rotation}{scale}{mirrored}>'
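A short sketch of how a `Ref` is typically applied (import paths assumed; `some_pattern` stands in for a real masque `Pattern`):

```python
from numpy import pi
from masque.ref import Ref   # module path assumed

# Order of operations: mirror, rotate, scale, translate, repeat.
ref = Ref(offset=(100, 0), rotation=pi / 2, scale=2.0, mirrored=False)

flat = ref.as_pattern(some_pattern)          # transformed copy of the target pattern
bbox = ref.get_bounds_single(some_pattern)   # [[x_min, y_min], [x_max, y_max]] or None
print(ref.as_transforms())                   # one (x, y, rotation, mirrored) row per instance
```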

View File

@ -2,28 +2,24 @@
Repetitions provide support for efficiently representing multiple identical Repetitions provide support for efficiently representing multiple identical
instances of an object . instances of an object .
""" """
from typing import Any, Self, TypeVar, cast
from typing import Union, Dict, Optional, Sequence, Any, Type
import copy import copy
import functools
from abc import ABCMeta, abstractmethod from abc import ABCMeta, abstractmethod
import numpy import numpy
from numpy.typing import ArrayLike, NDArray from numpy.typing import ArrayLike, NDArray
from .traits import Copyable, Scalable, Rotatable, Mirrorable, Bounded
from .error import PatternError from .error import PatternError
from .utils import rotation_matrix_2d from .utils import rotation_matrix_2d, AutoSlots
from .traits import LockableImpl, Copyable, Scalable, Rotatable, Mirrorable
GG = TypeVar('GG', bound='Grid') class Repetition(Copyable, Rotatable, Mirrorable, Scalable, metaclass=ABCMeta):
@functools.total_ordering
class Repetition(Copyable, Rotatable, Mirrorable, Scalable, Bounded, metaclass=ABCMeta):
""" """
Interface common to all objects which specify repetitions Interface common to all objects which specify repetitions
""" """
__slots__ = () # Allow subclasses to use __slots__ __slots__ = ()
@property @property
@abstractmethod @abstractmethod
@ -33,16 +29,8 @@ class Repetition(Copyable, Rotatable, Mirrorable, Scalable, Bounded, metaclass=A
""" """
pass pass
@abstractmethod
def __le__(self, other: 'Repetition') -> bool:
pass
@abstractmethod class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
def __eq__(self, other: Any) -> bool:
pass
class Grid(Repetition):
""" """
`Grid` describes a 2D grid formed by two basis vectors and two 'counts' (sizes). `Grid` describes a 2D grid formed by two basis vectors and two 'counts' (sizes).
@ -51,10 +39,10 @@ class Grid(Repetition):
Note that the offsets in either the 2D or 1D grids do not have to be axis-aligned. Note that the offsets in either the 2D or 1D grids do not have to be axis-aligned.
""" """
__slots__ = ( __slots__ = ('_a_vector',
'_a_vector', '_b_vector', '_b_vector',
'_a_count', '_b_count', '_a_count',
) '_b_count')
_a_vector: NDArray[numpy.float64] _a_vector: NDArray[numpy.float64]
""" Vector `[x, y]` specifying the first lattice vector of the grid. """ Vector `[x, y]` specifying the first lattice vector of the grid.
@ -64,7 +52,7 @@ class Grid(Repetition):
_a_count: int _a_count: int
""" Number of instances along the direction specified by the `a_vector` """ """ Number of instances along the direction specified by the `a_vector` """
_b_vector: NDArray[numpy.float64] | None _b_vector: Optional[NDArray[numpy.float64]]
""" Vector `[x, y]` specifying a second lattice vector for the grid. """ Vector `[x, y]` specifying a second lattice vector for the grid.
Specifies center-to-center spacing between adjacent elements. Specifies center-to-center spacing between adjacent elements.
Can be `None` for a 1D array. Can be `None` for a 1D array.
@ -77,8 +65,9 @@ class Grid(Repetition):
self, self,
a_vector: ArrayLike, a_vector: ArrayLike,
a_count: int, a_count: int,
b_vector: ArrayLike | None = None, b_vector: Optional[ArrayLike] = None,
b_count: int | None = 1, b_count: Optional[int] = 1,
locked: bool = False,
) -> None: ) -> None:
""" """
Args: Args:
@ -90,6 +79,7 @@ class Grid(Repetition):
Can be omitted when specifying a 1D array. Can be omitted when specifying a 1D array.
b_count: Number of elements in the `b_vector` direction. b_count: Number of elements in the `b_vector` direction.
Should be omitted if `b_vector` was omitted. Should be omitted if `b_vector` was omitted.
locked: Whether the `Grid` is locked after initialization.
Raises: Raises:
PatternError if `b_*` inputs conflict with each other PatternError if `b_*` inputs conflict with each other
@ -101,6 +91,7 @@ class Grid(Repetition):
if b_vector is None: if b_vector is None:
if b_count > 1: if b_count > 1:
raise PatternError('Repetition has b_count > 1 but no b_vector') raise PatternError('Repetition has b_count > 1 but no b_vector')
else:
b_vector = numpy.array([0.0, 0.0]) b_vector = numpy.array([0.0, 0.0])
if a_count < 1: if a_count < 1:
@ -108,19 +99,21 @@ class Grid(Repetition):
if b_count < 1: if b_count < 1:
raise PatternError(f'Repetition has too-small b_count: {b_count}') raise PatternError(f'Repetition has too-small b_count: {b_count}')
object.__setattr__(self, 'locked', False)
self.a_vector = a_vector # type: ignore # setter handles type conversion self.a_vector = a_vector # type: ignore # setter handles type conversion
self.b_vector = b_vector # type: ignore # setter handles type conversion self.b_vector = b_vector # type: ignore # setter handles type conversion
self.a_count = a_count self.a_count = a_count
self.b_count = b_count self.b_count = b_count
self.locked = locked
@classmethod @classmethod
def aligned( def aligned(
cls: type[GG], cls: Type,
x: float, x: float,
y: float, y: float,
x_count: int, x_count: int,
y_count: int, y_count: int,
) -> GG: ) -> 'Grid':
""" """
Simple constructor for an axis-aligned 2D grid Simple constructor for an axis-aligned 2D grid
@ -136,17 +129,18 @@ class Grid(Repetition):
return cls(a_vector=(x, 0), b_vector=(0, y), a_count=x_count, b_count=y_count) return cls(a_vector=(x, 0), b_vector=(0, y), a_count=x_count, b_count=y_count)
def __copy__(self) -> 'Grid': def __copy__(self) -> 'Grid':
new = Grid( new = Grid(a_vector=self.a_vector.copy(),
a_vector=self.a_vector.copy(),
b_vector=copy.copy(self.b_vector), b_vector=copy.copy(self.b_vector),
a_count=self.a_count, a_count=self.a_count,
b_count=self.b_count, b_count=self.b_count,
) locked=self.locked)
return new return new
def __deepcopy__(self, memo: dict | None = None) -> Self: def __deepcopy__(self, memo: Dict = None) -> 'Grid':
memo = {} if memo is None else memo memo = {} if memo is None else memo
new = copy.copy(self) new = copy.copy(self)
LockableImpl.unlock(new)
new.locked = self.locked
return new return new
# a_vector property # a_vector property
@ -156,20 +150,22 @@ class Grid(Repetition):
@a_vector.setter @a_vector.setter
def a_vector(self, val: ArrayLike) -> None: def a_vector(self, val: ArrayLike) -> None:
if not isinstance(val, numpy.ndarray):
val = numpy.array(val, dtype=float) val = numpy.array(val, dtype=float)
if val.size != 2: if val.size != 2:
raise PatternError('a_vector must be convertible to size-2 ndarray') raise PatternError('a_vector must be convertible to size-2 ndarray')
self._a_vector = val.flatten() self._a_vector = val.flatten().astype(float)
# b_vector property # b_vector property
@property @property
def b_vector(self) -> NDArray[numpy.float64] | None: def b_vector(self) -> Optional[NDArray[numpy.float64]]:
return self._b_vector return self._b_vector
@b_vector.setter @b_vector.setter
def b_vector(self, val: ArrayLike) -> None: def b_vector(self, val: ArrayLike) -> None:
val = numpy.array(val, dtype=float) if not isinstance(val, numpy.ndarray):
val = numpy.array(val, dtype=float, copy=True)
if val.size != 2: if val.size != 2:
raise PatternError('b_vector must be convertible to size-2 ndarray') raise PatternError('b_vector must be convertible to size-2 ndarray')
@ -206,7 +202,7 @@ class Grid(Repetition):
return (aa.flatten()[:, None] * self.a_vector[None, :] return (aa.flatten()[:, None] * self.a_vector[None, :]
+ bb.flatten()[:, None] * self.b_vector[None, :]) # noqa + bb.flatten()[:, None] * self.b_vector[None, :]) # noqa
def rotate(self, rotation: float) -> Self: def rotate(self, rotation: float) -> 'Grid':
""" """
Rotate lattice vectors (around (0, 0)) Rotate lattice vectors (around (0, 0))
@ -221,7 +217,7 @@ class Grid(Repetition):
self.b_vector = numpy.dot(rotation_matrix_2d(rotation), self.b_vector) self.b_vector = numpy.dot(rotation_matrix_2d(rotation), self.b_vector)
return self return self
def mirror(self, axis: int = 0) -> Self: def mirror(self, axis: int) -> 'Grid':
""" """
Mirror the Grid across an axis. Mirror the Grid across an axis.
@ -237,7 +233,7 @@ class Grid(Repetition):
self.b_vector[1 - axis] *= -1 self.b_vector[1 - axis] *= -1
return self return self
def get_bounds(self) -> NDArray[numpy.float64] | None: def get_bounds(self) -> Optional[NDArray[numpy.float64]]:
""" """
Return a `numpy.ndarray` containing `[[x_min, y_min], [x_max, y_max]]`, corresponding to the Return a `numpy.ndarray` containing `[[x_min, y_min], [x_max, y_max]]`, corresponding to the
extent of the `Grid` in each dimension. extent of the `Grid` in each dimension.
@ -245,19 +241,15 @@ class Grid(Repetition):
Returns: Returns:
`[[x_min, y_min], [x_max, y_max]]` or `None` `[[x_min, y_min], [x_max, y_max]]` or `None`
""" """
a_extent = self.a_vector * (self.a_count - 1) a_extent = self.a_vector * self.a_count
if self.b_count is None: b_extent = self.b_vector * self.b_count if (self.b_vector is not None) else 0 # type: Union[NDArray[numpy.float64], float]
b_extent = numpy.zeros(2)
else:
assert self.b_vector is not None
b_extent = self.b_vector * (self.b_count - 1)
corners = numpy.stack(((0, 0), a_extent, b_extent, a_extent + b_extent)) corners = ((0, 0), a_extent, b_extent, a_extent + b_extent)
xy_min = numpy.min(corners, axis=0) xy_min = numpy.min(corners, axis=0)
xy_max = numpy.max(corners, axis=0) xy_max = numpy.max(corners, axis=0)
return numpy.array((xy_min, xy_max)) return numpy.array((xy_min, xy_max))
def scale_by(self, c: float) -> Self: def scale_by(self, c: float) -> 'Grid':
""" """
Scale the Grid by a factor Scale the Grid by a factor
@ -272,12 +264,39 @@ class Grid(Repetition):
self.b_vector *= c self.b_vector *= c
return self return self
def lock(self) -> 'Grid':
"""
Lock the `Grid`, disallowing changes.
Returns:
self
"""
self.a_vector.flags.writeable = False
if self.b_vector is not None:
self.b_vector.flags.writeable = False
LockableImpl.lock(self)
return self
def unlock(self) -> 'Grid':
"""
Unlock the `Grid`
Returns:
self
"""
self.a_vector.flags.writeable = True
if self.b_vector is not None:
self.b_vector.flags.writeable = True
LockableImpl.unlock(self)
return self
def __repr__(self) -> str: def __repr__(self) -> str:
locked = ' L' if self.locked else ''
bv = f', {self.b_vector}' if self.b_vector is not None else '' bv = f', {self.b_vector}' if self.b_vector is not None else ''
return (f'<Grid {self.a_count}x{self.b_count} ({self.a_vector}{bv})>') return (f'<Grid {self.a_count}x{self.b_count} ({self.a_vector}{bv}){locked}>')
def __eq__(self, other: Any) -> bool: def __eq__(self, other: Any) -> bool:
if type(other) is not type(self): if not isinstance(other, type(self)):
return False return False
if self.a_count != other.a_count or self.b_count != other.b_count: if self.a_count != other.a_count or self.b_count != other.b_count:
return False return False
@ -287,30 +306,14 @@ class Grid(Repetition):
return True return True
if self.b_vector is None or other.b_vector is None: if self.b_vector is None or other.b_vector is None:
return False return False
if any(self.b_vector[ii] != other.b_vector[ii] for ii in range(2)): # noqa: SIM103 if any(self.b_vector[ii] != other.b_vector[ii] for ii in range(2)):
return False
if self.locked != other.locked:
return False return False
return True return True
def __le__(self, other: Repetition) -> bool:
if type(self) is not type(other):
return repr(type(self)) < repr(type(other))
other = cast(Grid, other)
if self.a_count != other.a_count:
return self.a_count < other.a_count
if self.b_count != other.b_count:
return self.b_count < other.b_count
if not numpy.array_equal(self.a_vector, other.a_vector):
return tuple(self.a_vector) < tuple(other.a_vector)
if self.b_vector is None:
return other.b_vector is not None
if other.b_vector is None:
return False
if not numpy.array_equal(self.b_vector, other.b_vector):
return tuple(self.a_vector) < tuple(other.a_vector)
return False
class Arbitrary(LockableImpl, Repetition, metaclass=AutoSlots):
class Arbitrary(Repetition):
""" """
`Arbitrary` is a simple list of (absolute) displacements for instances. `Arbitrary` is a simple list of (absolute) displacements for instances.
@ -327,47 +330,63 @@ class Arbitrary(Repetition):
""" """
@property @property
def displacements(self) -> Any: # mypy#3004 NDArray[numpy.float64]: def displacements(self) -> Any: # TODO: mypy#3004 NDArray[numpy.float64]:
return self._displacements return self._displacements
@displacements.setter @displacements.setter
def displacements(self, val: ArrayLike) -> None: def displacements(self, val: ArrayLike) -> None:
vala = numpy.array(val, dtype=float) vala: NDArray[numpy.float64] = numpy.array(val, dtype=float)
order = numpy.lexsort(vala.T[::-1]) # sortrows vala = numpy.sort(vala.view([('', vala.dtype)] * vala.shape[1]), 0).view(vala.dtype) # sort rows
self._displacements = vala[order] self._displacements = vala
def __init__( def __init__(
self, self,
displacements: ArrayLike, displacements: ArrayLike,
locked: bool = False,
) -> None: ) -> None:
""" """
Args: Args:
displacements: List of vectors (Nx2 ndarray) specifying displacements. displacements: List of vectors (Nx2 ndarray) specifying displacements.
locked: Whether the object is locked after initialization.
""" """
object.__setattr__(self, 'locked', False)
self.displacements = displacements self.displacements = displacements
self.locked = locked
def lock(self) -> 'Arbitrary':
"""
Lock the object, disallowing changes.
Returns:
self
"""
self._displacements.flags.writeable = False
LockableImpl.lock(self)
return self
def unlock(self) -> 'Arbitrary':
"""
Unlock the object
Returns:
self
"""
self._displacements.flags.writeable = True
LockableImpl.unlock(self)
return self
def __repr__(self) -> str: def __repr__(self) -> str:
return (f'<Arbitrary {len(self.displacements)}pts >') locked = ' L' if self.locked else ''
return (f'<Arbitrary {len(self.displacements)}pts {locked}>')
def __eq__(self, other: Any) -> bool: def __eq__(self, other: Any) -> bool:
if type(other) is not type(self): if not isinstance(other, type(self)):
return False
if self.locked != other.locked:
return False return False
return numpy.array_equal(self.displacements, other.displacements) return numpy.array_equal(self.displacements, other.displacements)
def __le__(self, other: Repetition) -> bool: def rotate(self, rotation: float) -> 'Arbitrary':
if type(self) is not type(other):
return repr(type(self)) < repr(type(other))
other = cast(Arbitrary, other)
if self.displacements.size != other.displacements.size:
return self.displacements.size < other.displacements.size
neq = (self.displacements != other.displacements)
if neq.any():
return self.displacements[neq][0] < other.displacements[neq][0]
return False
def rotate(self, rotation: float) -> Self:
""" """
Rotate displacements (around (0, 0)) Rotate displacements (around (0, 0))
@ -380,7 +399,7 @@ class Arbitrary(Repetition):
self.displacements = numpy.dot(rotation_matrix_2d(rotation), self.displacements.T).T self.displacements = numpy.dot(rotation_matrix_2d(rotation), self.displacements.T).T
return self return self
def mirror(self, axis: int = 0) -> Self: def mirror(self, axis: int) -> 'Arbitrary':
""" """
Mirror the displacements across an axis. Mirror the displacements across an axis.
@ -394,7 +413,7 @@ class Arbitrary(Repetition):
self.displacements[1 - axis] *= -1 self.displacements[1 - axis] *= -1
return self return self
def get_bounds(self) -> NDArray[numpy.float64] | None: def get_bounds(self) -> Optional[NDArray[numpy.float64]]:
""" """
Return a `numpy.ndarray` containing `[[x_min, y_min], [x_max, y_max]]`, corresponding to the Return a `numpy.ndarray` containing `[[x_min, y_min], [x_max, y_max]]`, corresponding to the
extent of the `displacements` in each dimension. extent of the `displacements` in each dimension.
@ -406,7 +425,7 @@ class Arbitrary(Repetition):
xy_max = numpy.max(self.displacements, axis=0) xy_max = numpy.max(self.displacements, axis=0)
return numpy.array((xy_min, xy_max)) return numpy.array((xy_min, xy_max))
def scale_by(self, c: float) -> Self: def scale_by(self, c: float) -> 'Arbitrary':
""" """
Scale the displacements by a factor Scale the displacements by a factor

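To make the `Repetition` variants above concrete, a minimal sketch follows (import path assumed; exact bounds differ slightly between the two sides of this diff):

```python
from numpy import pi
from masque.repetition import Grid, Arbitrary   # module path assumed

rep = Grid.aligned(x=5.0, y=10.0, x_count=3, y_count=2)  # axis-aligned 3x2 grid
print(rep.displacements.shape)   # (6, 2): one (x, y) offset per instance
print(rep.get_bounds())          # [[x_min, y_min], [x_max, y_max]]

rep.rotate(pi / 4)               # rotates both lattice vectors about (0, 0)

arb = Arbitrary([(0, 0), (20, 0), (0, 35)])  # explicit per-instance offsets
```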
View File

@ -3,15 +3,11 @@ Shapes for use with the Pattern class, as well as the Shape abstract class from
which they are derived. which they are derived.
""" """
from .shape import ( from .shape import Shape, normalized_shape_tuple, DEFAULT_POLY_NUM_POINTS
Shape as Shape,
normalized_shape_tuple as normalized_shape_tuple,
DEFAULT_POLY_NUM_VERTICES as DEFAULT_POLY_NUM_VERTICES,
)
from .polygon import Polygon as Polygon from .polygon import Polygon
from .circle import Circle as Circle from .circle import Circle
from .ellipse import Ellipse as Ellipse from .ellipse import Ellipse
from .arc import Arc as Arc from .arc import Arc
from .text import Text as Text from .text import Text
from .path import Path as Path from .path import Path

View File

@ -1,19 +1,19 @@
from typing import Any, cast from typing import List, Dict, Optional, Sequence, Any
import copy import copy
import functools import math
import numpy import numpy
from numpy import pi from numpy import pi
from numpy.typing import NDArray, ArrayLike from numpy.typing import NDArray, ArrayLike
from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_VERTICES from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_POINTS
from ..error import PatternError from .. import PatternError
from ..repetition import Repetition from ..repetition import Repetition
from ..utils import is_scalar, annotations_t, annotations_lt, annotations_eq, rep2key from ..utils import is_scalar, layer_t, AutoSlots, annotations_t
from ..traits import LockableImpl
@functools.total_ordering class Arc(Shape, metaclass=AutoSlots):
class Arc(Shape):
""" """
An elliptical arc, formed by cutting off an elliptical ring with two rays which exit from its An elliptical arc, formed by cutting off an elliptical ring with two rays which exit from its
center. It has a position, two radii, a start and stop angle, a rotation, and a width. center. It has a position, two radii, a start and stop angle, a rotation, and a width.
@ -22,11 +22,8 @@ class Arc(Shape):
The rotation gives the angle from x-axis, counterclockwise, to the first (x) radius. The rotation gives the angle from x-axis, counterclockwise, to the first (x) radius.
The start and stop angle are measured counterclockwise from the first (x) radius. The start and stop angle are measured counterclockwise from the first (x) radius.
""" """
__slots__ = ( __slots__ = ('_radii', '_angles', '_width', '_rotation',
'_radii', '_angles', '_width', '_rotation', 'poly_num_points', 'poly_max_arclen')
# Inherited
'_offset', '_repetition', '_annotations',
)
_radii: NDArray[numpy.float64] _radii: NDArray[numpy.float64]
""" Two radii for defining an ellipse """ """ Two radii for defining an ellipse """
@ -40,9 +37,15 @@ class Arc(Shape):
_width: float _width: float
""" Width of the arc """ """ Width of the arc """
poly_num_points: Optional[int]
""" Sets the default number of points for `.polygonize()` """
poly_max_arclen: Optional[float]
""" Sets the default max segement length for `.polygonize()` """
# radius properties # radius properties
@property @property
def radii(self) -> Any: # mypy#3004 NDArray[numpy.float64]: def radii(self) -> Any: #TODO mypy#3004 NDArray[numpy.float64]:
""" """
Return the radii `[rx, ry]` Return the radii `[rx, ry]`
""" """
@ -79,7 +82,7 @@ class Arc(Shape):
# arc start/stop angle properties # arc start/stop angle properties
@property @property
def angles(self) -> Any: # mypy#3004 NDArray[numpy.float64]: def angles(self) -> Any: #TODO mypy#3004 NDArray[numpy.float64]:
""" """
Return the start and stop angles `[a_start, a_stop]`. Return the start and stop angles `[a_start, a_stop]`.
Angles are measured from x-axis after rotation Angles are measured from x-axis after rotation
@ -154,16 +157,24 @@ class Arc(Shape):
angles: ArrayLike, angles: ArrayLike,
width: float, width: float,
*, *,
poly_num_points: Optional[int] = DEFAULT_POLY_NUM_POINTS,
poly_max_arclen: Optional[float] = None,
offset: ArrayLike = (0.0, 0.0), offset: ArrayLike = (0.0, 0.0),
rotation: float = 0, rotation: float = 0,
repetition: Repetition | None = None, mirrored: Sequence[bool] = (False, False),
annotations: annotations_t | None = None, layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
raw: bool = False, raw: bool = False,
) -> None: ) -> None:
LockableImpl.unlock(self)
self.identifier = ()
if raw: if raw:
assert isinstance(radii, numpy.ndarray) assert(isinstance(radii, numpy.ndarray))
assert isinstance(angles, numpy.ndarray) assert(isinstance(angles, numpy.ndarray))
assert isinstance(offset, numpy.ndarray) assert(isinstance(offset, numpy.ndarray))
self._radii = radii self._radii = radii
self._angles = angles self._angles = angles
self._width = width self._width = width
@ -171,6 +182,8 @@ class Arc(Shape):
self._rotation = rotation self._rotation = rotation
self._repetition = repetition self._repetition = repetition
self._annotations = annotations if annotations is not None else {} self._annotations = annotations if annotations is not None else {}
self._layer = layer
self._dose = dose
else: else:
self.radii = radii self.radii = radii
self.angles = angles self.angles = angles
@ -179,54 +192,35 @@ class Arc(Shape):
self.rotation = rotation self.rotation = rotation
self.repetition = repetition self.repetition = repetition
self.annotations = annotations if annotations is not None else {} self.annotations = annotations if annotations is not None else {}
self.layer = layer
self.dose = dose
self.poly_num_points = poly_num_points
self.poly_max_arclen = poly_max_arclen
[self.mirror(a) for a, do in enumerate(mirrored) if do]
self.set_locked(locked)
def __deepcopy__(self, memo: dict | None = None) -> 'Arc': def __deepcopy__(self, memo: Dict = None) -> 'Arc':
memo = {} if memo is None else memo memo = {} if memo is None else memo
new = copy.copy(self) new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy() new._offset = self._offset.copy()
new._radii = self._radii.copy() new._radii = self._radii.copy()
new._angles = self._angles.copy() new._angles = self._angles.copy()
new._annotations = copy.deepcopy(self._annotations) new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and numpy.array_equal(self.radii, other.radii)
and numpy.array_equal(self.angles, other.angles)
and self.width == other.width
and self.rotation == other.rotation
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Arc, other)
if self.width != other.width:
return self.width < other.width
if not numpy.array_equal(self.radii, other.radii):
return tuple(self.radii) < tuple(other.radii)
if not numpy.array_equal(self.angles, other.angles):
return tuple(self.angles) < tuple(other.angles)
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.rotation != other.rotation:
return self.rotation < other.rotation
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def to_polygons( def to_polygons(
self, self,
num_vertices: int | None = DEFAULT_POLY_NUM_VERTICES, poly_num_points: Optional[int] = None,
max_arclen: float | None = None, poly_max_arclen: Optional[float] = None,
) -> list[Polygon]: ) -> List[Polygon]:
if (num_vertices is None) and (max_arclen is None): if poly_num_points is None:
poly_num_points = self.poly_num_points
if poly_max_arclen is None:
poly_max_arclen = self.poly_max_arclen
if (poly_num_points is None) and (poly_max_arclen is None):
raise PatternError('Max number of points and arclength left unspecified' raise PatternError('Max number of points and arclength left unspecified'
+ ' (default was also overridden)') + ' (default was also overridden)')
@ -235,62 +229,27 @@ class Arc(Shape):
# Convert from polar angle to ellipse parameter (for [rx*cos(t), ry*sin(t)] representation) # Convert from polar angle to ellipse parameter (for [rx*cos(t), ry*sin(t)] representation)
a_ranges = self._angles_to_parameters() a_ranges = self._angles_to_parameters()
# Approximate perimeter via numerical integration # Approximate perimeter
# Ramanujan, S., "Modular Equations and Approximations to π,"
# Quart. J. Pure. Appl. Math., vol. 45 (1913-1914), pp. 350-372
a0, a1 = a_ranges[1] # use outer arc
h = ((r1 - r0) / (r1 + r0)) ** 2
ellipse_perimeter = pi * (r1 + r0) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))
perimeter = abs(a0 - a1) / (2 * pi) * ellipse_perimeter # TODO: make this more accurate
#perimeter1 = numpy.trapz(numpy.sqrt(r0sin * r0sin + r1cos * r1cos), dx=dt) n = []
#from scipy.special import ellipeinc if poly_num_points is not None:
#m = 1 - (r1 / r0) ** 2 n += [poly_num_points]
#t1 = ellipeinc(a1 - pi / 2, m) if poly_max_arclen is not None:
#t0 = ellipeinc(a0 - pi / 2, m) n += [perimeter / poly_max_arclen]
#perimeter2 = r0 * (t1 - t0) num_points = int(round(max(n)))
def get_arclens(n_pts: int, a0: float, a1: float, dr: float) -> tuple[NDArray[numpy.float64], NDArray[numpy.float64]]:
""" Get `n_pts` arclengths """
t, dt = numpy.linspace(a0, a1, n_pts, retstep=True) # NOTE: could probably use an adaptive number of points
r0sin = (r0 + dr) * numpy.sin(t)
r1cos = (r1 + dr) * numpy.cos(t)
arc_dl = numpy.sqrt(r0sin * r0sin + r1cos * r1cos)
#arc_lengths = numpy.diff(t) * (arc_dl[1:] + arc_dl[:-1]) / 2
arc_lengths = (arc_dl[1:] + arc_dl[:-1]) * numpy.abs(dt) / 2
return arc_lengths, t
wh = self.width / 2.0 wh = self.width / 2.0
if num_vertices is not None: if wh == r0 or wh == r1:
n_pts = numpy.ceil(max(self.radii + wh) / min(self.radii) * num_vertices * 100).astype(int)
perimeter_inner = get_arclens(n_pts, *a_ranges[0], dr=-wh)[0].sum()
perimeter_outer = get_arclens(n_pts, *a_ranges[1], dr= wh)[0].sum()
implied_arclen = (perimeter_outer + perimeter_inner + self.width * 2) / num_vertices
max_arclen = min(implied_arclen, max_arclen if max_arclen is not None else numpy.inf)
assert max_arclen is not None
def get_thetas(inner: bool) -> NDArray[numpy.float64]:
""" Figure out the parameter values at which we should place vertices to meet the arclength constraint"""
dr = -wh if inner else wh
n_pts = numpy.ceil(2 * pi * max(self.radii + dr) / max_arclen).astype(int)
arc_lengths, thetas = get_arclens(n_pts, *a_ranges[0 if inner else 1], dr=dr)
keep = [0]
removable = (numpy.cumsum(arc_lengths) <= max_arclen)
start = 1
while start < arc_lengths.size:
next_to_keep = start + numpy.where(removable)[0][-1] # TODO: any chance we haven't sampled finely enough?
keep.append(next_to_keep)
removable = (numpy.cumsum(arc_lengths[next_to_keep + 1:]) <= max_arclen)
start = next_to_keep + 1
if keep[-1] != thetas.size - 1:
keep.append(thetas.size - 1)
thetas = thetas[keep]
if inner:
thetas = thetas[::-1]
return thetas
if wh in (r0, r1):
thetas_inner = numpy.zeros(1) # Don't generate multiple vertices if we're at the origin thetas_inner = numpy.zeros(1) # Don't generate multiple vertices if we're at the origin
else: else:
thetas_inner = get_thetas(inner=True) thetas_inner = numpy.linspace(a_ranges[0][1], a_ranges[0][0], num_points, endpoint=True)
thetas_outer = get_thetas(inner=False) thetas_outer = numpy.linspace(a_ranges[1][0], a_ranges[1][1], num_points, endpoint=True)
sin_th_i, cos_th_i = (numpy.sin(thetas_inner), numpy.cos(thetas_inner)) sin_th_i, cos_th_i = (numpy.sin(thetas_inner), numpy.cos(thetas_inner))
sin_th_o, cos_th_o = (numpy.sin(thetas_outer), numpy.cos(thetas_outer)) sin_th_o, cos_th_o = (numpy.sin(thetas_outer), numpy.cos(thetas_outer))
@ -304,11 +263,11 @@ class Arc(Shape):
ys = numpy.hstack((ys1, ys2)) ys = numpy.hstack((ys1, ys2))
xys = numpy.vstack((xs, ys)).T xys = numpy.vstack((xs, ys)).T
poly = Polygon(xys, offset=self.offset, rotation=self.rotation) poly = Polygon(xys, dose=self.dose, layer=self.layer, offset=self.offset, rotation=self.rotation)
return [poly] return [poly]
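
Note on the perimeter estimate used above: it is Ramanujan's (second) approximation for the full-ellipse perimeter, scaled by the fraction of the angle range covered. A standalone numpy sketch (not masque code) comparing the approximation against trapezoidal integration of the arclength:

```python
# Standalone sketch: Ramanujan's approximation for an ellipse perimeter,
# checked against trapezoidal integration of |d/dt [rx*cos(t), ry*sin(t)]|.
import math
import numpy

def perimeter_ramanujan(rx: float, ry: float) -> float:
    h = ((rx - ry) / (rx + ry)) ** 2
    return math.pi * (rx + ry) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))

def perimeter_numeric(rx: float, ry: float, n: int = 100_000) -> float:
    t, dt = numpy.linspace(0, 2 * math.pi, n, retstep=True)
    dl = numpy.sqrt((rx * numpy.sin(t)) ** 2 + (ry * numpy.cos(t)) ** 2)
    return ((dl[1:] + dl[:-1]) * dt / 2).sum()    # same trapezoid rule as get_arclens()

print(perimeter_ramanujan(10, 3))   # ~43.859
print(perimeter_numeric(10, 3))     # agrees to well under 0.1% at this aspect ratio
```
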
def get_bounds_single(self) -> NDArray[numpy.float64]: def get_bounds(self) -> NDArray[numpy.float64]:
""" '''
Equation for rotated ellipse is Equation for rotated ellipse is
`x = x0 + a * cos(t) * cos(rot) - b * sin(t) * sin(phi)` `x = x0 + a * cos(t) * cos(rot) - b * sin(t) * sin(phi)`
`y = y0 + a * cos(t) * sin(rot) + b * sin(t) * cos(rot)` `y = y0 + a * cos(t) * sin(rot) + b * sin(t) * cos(rot)`
@ -319,12 +278,12 @@ class Arc(Shape):
where -+ is for x, y cases, so that's where the extrema are. where -+ is for x, y cases, so that's where the extrema are.
If the extrema are inaccessible due to arc constraints, check the arc endpoints instead. If the extrema are inaccessible due to arc constraints, check the arc endpoints instead.
""" '''
a_ranges = self._angles_to_parameters() a_ranges = self._angles_to_parameters()
mins = [] mins = []
maxs = [] maxs = []
for a, sgn in zip(a_ranges, (-1, +1), strict=True): for a, sgn in zip(a_ranges, (-1, +1)):
wh = sgn * self.width / 2 wh = sgn * self.width / 2
rx = self.radius_x + wh rx = self.radius_x + wh
ry = self.radius_y + wh ry = self.radius_y + wh
@ -381,7 +340,7 @@ class Arc(Shape):
self.rotation += theta self.rotation += theta
return self return self
def mirror(self, axis: int = 0) -> 'Arc': def mirror(self, axis: int) -> 'Arc':
self.offset[axis - 1] *= -1 self.offset[axis - 1] *= -1
self.rotation *= -1 self.rotation *= -1
self.rotation += axis * pi self.rotation += axis * pi
@ -415,27 +374,23 @@ class Arc(Shape):
rotation %= 2 * pi rotation %= 2 * pi
width = self.width width = self.width
return ((type(self), radii, angles, width / norm_value), return ((type(self), radii, angles, width / norm_value, self.layer),
(self.offset, scale / norm_value, rotation, False), (self.offset, scale / norm_value, rotation, False, self.dose),
lambda: Arc( lambda: Arc(radii=radii * norm_value, angles=angles, width=width * norm_value, layer=self.layer))
radii=radii * norm_value,
angles=angles,
width=width * norm_value,
))
def get_cap_edges(self) -> NDArray[numpy.float64]: def get_cap_edges(self) -> NDArray[numpy.float64]:
""" '''
Returns: Returns:
``` ```
[[[x0, y0], [x1, y1]], array of 4 points, specifying the two cuts which [[[x0, y0], [x1, y1]], array of 4 points, specifying the two cuts which
[[x2, y2], [x3, y3]]], would create this arc from its corresponding ellipse. [[x2, y2], [x3, y3]]], would create this arc from its corresponding ellipse.
``` ```
""" '''
a_ranges = self._angles_to_parameters() a_ranges = self._angles_to_parameters()
mins = [] mins = []
maxs = [] maxs = []
for a, sgn in zip(a_ranges, (-1, +1), strict=True): for a, sgn in zip(a_ranges, (-1, +1)):
wh = sgn * self.width / 2 wh = sgn * self.width / 2
rx = self.radius_x + wh rx = self.radius_x + wh
ry = self.radius_y + wh ry = self.radius_y + wh
@ -454,28 +409,41 @@ class Arc(Shape):
return numpy.array([mins, maxs]) + self.offset return numpy.array([mins, maxs]) + self.offset
def _angles_to_parameters(self) -> NDArray[numpy.float64]: def _angles_to_parameters(self) -> NDArray[numpy.float64]:
""" '''
Convert from polar angle to ellipse parameter (for [rx*cos(t), ry*sin(t)] representation)
Returns: Returns:
"Eccentric anomaly" parameter ranges for the inner and outer edges, in the form "Eccentric anomaly" parameter ranges for the inner and outer edges, in the form
`[[a_min_inner, a_max_inner], [a_min_outer, a_max_outer]]` `[[a_min_inner, a_max_inner], [a_min_outer, a_max_outer]]`
""" '''
a = [] a = []
for sgn in (-1, +1): for sgn in (-1, +1):
wh = sgn * self.width / 2.0 wh = sgn * self.width / 2
rx = self.radius_x + wh rx = self.radius_x + wh
ry = self.radius_y + wh ry = self.radius_y + wh
# create parameter 'a' for parametrized ellipse
a0, a1 = (numpy.arctan2(rx * numpy.sin(a), ry * numpy.cos(a)) for a in self.angles) a0, a1 = (numpy.arctan2(rx * numpy.sin(a), ry * numpy.cos(a)) for a in self.angles)
sign = numpy.sign(self.angles[1] - self.angles[0]) sign = numpy.sign(self.angles[1] - self.angles[0])
if sign != numpy.sign(a1 - a0): if sign != numpy.sign(a1 - a0):
a1 += sign * 2 * pi a1 += sign * 2 * pi
a.append((a0, a1)) a.append((a0, a1))
return numpy.array(a, dtype=float) return numpy.array(a)
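
The `arctan2(rx * sin(a), ry * cos(a))` trick above converts a polar angle into the ellipse parameter `t` (the "eccentric anomaly"), i.e. the `t` for which `[rx*cos(t), ry*sin(t)]` sits at that polar angle. A standalone check:

```python
# Standalone check: convert a polar angle theta into the ellipse parameter t,
# then confirm the parametrized point really lies at polar angle theta.
import numpy

rx, ry = 10.0, 3.0
theta = numpy.deg2rad(40.0)

t = numpy.arctan2(rx * numpy.sin(theta), ry * numpy.cos(theta))
x, y = rx * numpy.cos(t), ry * numpy.sin(t)

print(numpy.rad2deg(numpy.arctan2(y, x)))   # ~40.0 -- point sits at the requested polar angle
```
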
def lock(self) -> 'Arc':
self.radii.flags.writeable = False
self.angles.flags.writeable = False
Shape.lock(self)
return self
def unlock(self) -> 'Arc':
Shape.unlock(self)
self.radii.flags.writeable = True
self.angles.flags.writeable = True
return self
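
The `lock()`/`unlock()` pair above works by toggling numpy's per-array writeable flag; a plain-numpy demonstration of that mechanism (not masque code):

```python
# Plain-numpy demonstration of the mechanism lock()/unlock() rely on:
# clearing the writeable flag makes in-place edits raise.
import numpy

radii = numpy.array([10.0, 3.0])
radii.flags.writeable = False
try:
    radii[0] = 5.0
except ValueError as err:
    print(err)            # "assignment destination is read-only"
radii.flags.writeable = True
radii[0] = 5.0            # allowed again after "unlocking"
```
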
def __repr__(self) -> str: def __repr__(self) -> str:
angles = f'{numpy.rad2deg(self.angles)}' angles = f'{numpy.rad2deg(self.angles)}'
rotation = f'{numpy.rad2deg(self.rotation):g}' if self.rotation != 0 else '' rotation = f'{numpy.rad2deg(self.rotation):g}' if self.rotation != 0 else ''
return f'<Arc o{self.offset} r{self.radii}{angles} w{self.width:g}{rotation}>' dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<Arc l{self.layer} o{self.offset} r{self.radii}{angles} w{self.width:g}{rotation}{dose}{locked}>'
@ -1,31 +1,32 @@
from typing import Any, cast from typing import List, Dict, Optional
import copy import copy
import functools
import numpy import numpy
from numpy import pi from numpy import pi
from numpy.typing import NDArray, ArrayLike from numpy.typing import NDArray, ArrayLike
from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_VERTICES from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_POINTS
from ..error import PatternError from .. import PatternError
from ..repetition import Repetition from ..repetition import Repetition
from ..utils import is_scalar, annotations_t, annotations_lt, annotations_eq, rep2key from ..utils import is_scalar, layer_t, AutoSlots, annotations_t
from ..traits import LockableImpl
@functools.total_ordering class Circle(Shape, metaclass=AutoSlots):
class Circle(Shape):
""" """
A circle, which has a position and radius. A circle, which has a position and radius.
""" """
__slots__ = ( __slots__ = ('_radius', 'poly_num_points', 'poly_max_arclen')
'_radius',
# Inherited
'_offset', '_repetition', '_annotations',
)
_radius: float _radius: float
""" Circle radius """ """ Circle radius """
poly_num_points: Optional[int]
""" Sets the default number of points for `.polygonize()` """
poly_max_arclen: Optional[float]
""" Sets the default max segement length for `.polygonize()` """
# radius property # radius property
@property @property
def radius(self) -> float: def radius(self) -> float:
@ -46,83 +47,81 @@ class Circle(Shape):
self, self,
radius: float, radius: float,
*, *,
poly_num_points: Optional[int] = DEFAULT_POLY_NUM_POINTS,
poly_max_arclen: Optional[float] = None,
offset: ArrayLike = (0.0, 0.0), offset: ArrayLike = (0.0, 0.0),
repetition: Repetition | None = None, layer: layer_t = 0,
annotations: annotations_t | None = None, dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
raw: bool = False, raw: bool = False,
) -> None: ) -> None:
LockableImpl.unlock(self)
self.identifier = ()
if raw: if raw:
assert isinstance(offset, numpy.ndarray) assert(isinstance(offset, numpy.ndarray))
self._radius = radius self._radius = radius
self._offset = offset self._offset = offset
self._repetition = repetition self._repetition = repetition
self._annotations = annotations if annotations is not None else {} self._annotations = annotations if annotations is not None else {}
self._layer = layer
self._dose = dose
else: else:
self.radius = radius self.radius = radius
self.offset = offset self.offset = offset
self.repetition = repetition self.repetition = repetition
self.annotations = annotations if annotations is not None else {} self.annotations = annotations if annotations is not None else {}
self.layer = layer
self.dose = dose
self.poly_num_points = poly_num_points
self.poly_max_arclen = poly_max_arclen
self.set_locked(locked)
def __deepcopy__(self, memo: dict | None = None) -> 'Circle': def __deepcopy__(self, memo: Dict = None) -> 'Circle':
memo = {} if memo is None else memo memo = {} if memo is None else memo
new = copy.copy(self) new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy() new._offset = self._offset.copy()
new._annotations = copy.deepcopy(self._annotations) new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and self.radius == other.radius
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Circle, other)
if not self.radius == other.radius:
return self.radius < other.radius
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def to_polygons( def to_polygons(
self, self,
num_vertices: int | None = DEFAULT_POLY_NUM_VERTICES, poly_num_points: Optional[int] = None,
max_arclen: float | None = None, poly_max_arclen: Optional[float] = None,
) -> list[Polygon]: ) -> List[Polygon]:
if (num_vertices is None) and (max_arclen is None): if poly_num_points is None:
poly_num_points = self.poly_num_points
if poly_max_arclen is None:
poly_max_arclen = self.poly_max_arclen
if (poly_num_points is None) and (poly_max_arclen is None):
raise PatternError('Number of points and arclength left ' raise PatternError('Number of points and arclength left '
'unspecified (default was also overridden)') 'unspecified (default was also overridden)')
n: list[float] = [] n: List[float] = []
if num_vertices is not None: if poly_num_points is not None:
n += [num_vertices] n += [poly_num_points]
if max_arclen is not None: if poly_max_arclen is not None:
n += [2 * pi * self.radius / max_arclen] n += [2 * pi * self.radius / poly_max_arclen]
num_vertices = int(round(max(n))) num_points = int(round(max(n)))
thetas = numpy.linspace(2 * pi, 0, num_vertices, endpoint=False) thetas = numpy.linspace(2 * pi, 0, num_points, endpoint=False)
xs = numpy.cos(thetas) * self.radius xs = numpy.cos(thetas) * self.radius
ys = numpy.sin(thetas) * self.radius ys = numpy.sin(thetas) * self.radius
xys = numpy.vstack((xs, ys)).T xys = numpy.vstack((xs, ys)).T
return [Polygon(xys, offset=self.offset)] return [Polygon(xys, offset=self.offset, dose=self.dose, layer=self.layer)]
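
For reference, the vertex count chosen here is the larger of the explicit point count and the count implied by the maximum segment length (2πr / max_arclen). A standalone sketch of the same selection and the resulting vertex array (not the masque call itself):

```python
# Standalone sketch of the vertex-count selection used when polygonizing a circle:
# take the larger of an explicit count and the count implied by a max segment length.
import numpy
from numpy import pi

radius = 5.0
num_vertices = 20          # explicit request
max_arclen = 0.5           # approximate maximum length of any polygon edge

candidates = [num_vertices, 2 * pi * radius / max_arclen]
n = int(round(max(candidates)))          # -> 63 here

thetas = numpy.linspace(2 * pi, 0, n, endpoint=False)
xys = radius * numpy.stack([numpy.cos(thetas), numpy.sin(thetas)], axis=1)
print(n, xys.shape)                      # 63 (63, 2)
```
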
def get_bounds_single(self) -> NDArray[numpy.float64]: def get_bounds(self) -> NDArray[numpy.float64]:
return numpy.vstack((self.offset - self.radius, return numpy.vstack((self.offset - self.radius,
self.offset + self.radius)) self.offset + self.radius))
def rotate(self, theta: float) -> 'Circle': # noqa: ARG002 (theta unused) def rotate(self, theta: float) -> 'Circle':
return self return self
def mirror(self, axis: int = 0) -> 'Circle': # noqa: ARG002 (axis unused) def mirror(self, axis: int) -> 'Circle':
self.offset *= -1 self.offset *= -1
return self return self
@ -130,12 +129,14 @@ class Circle(Shape):
self.radius *= c self.radius *= c
return self return self
def normalized_form(self, norm_value: float) -> normalized_shape_tuple: def normalized_form(self, norm_value) -> normalized_shape_tuple:
rotation = 0.0 rotation = 0.0
magnitude = self.radius / norm_value magnitude = self.radius / norm_value
return ((type(self),), return ((type(self), self.layer),
(self.offset, magnitude, rotation, False), (self.offset, magnitude, rotation, False, self.dose),
lambda: Circle(radius=norm_value)) lambda: Circle(radius=norm_value, layer=self.layer))
def __repr__(self) -> str: def __repr__(self) -> str:
return f'<Circle o{self.offset} r{self.radius:g}>' dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<Circle l{self.layer} o{self.offset} r{self.radius:g}{dose}{locked}>'
@ -1,29 +1,25 @@
from typing import Any, Self, cast from typing import List, Dict, Sequence, Optional, Any
import copy import copy
import math import math
import functools
import numpy import numpy
from numpy import pi from numpy import pi
from numpy.typing import ArrayLike, NDArray from numpy.typing import ArrayLike, NDArray
from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_VERTICES from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_POINTS
from ..error import PatternError from .. import PatternError
from ..repetition import Repetition from ..repetition import Repetition
from ..utils import is_scalar, rotation_matrix_2d, annotations_t, annotations_lt, annotations_eq, rep2key from ..utils import is_scalar, rotation_matrix_2d, layer_t, AutoSlots, annotations_t
from ..traits import LockableImpl
@functools.total_ordering class Ellipse(Shape, metaclass=AutoSlots):
class Ellipse(Shape):
""" """
An ellipse, which has a position, two radii, and a rotation. An ellipse, which has a position, two radii, and a rotation.
The rotation gives the angle from x-axis, counterclockwise, to the first (x) radius. The rotation gives the angle from x-axis, counterclockwise, to the first (x) radius.
""" """
__slots__ = ( __slots__ = ('_radii', '_rotation',
'_radii', '_rotation', 'poly_num_points', 'poly_max_arclen')
# Inherited
'_offset', '_repetition', '_annotations',
)
_radii: NDArray[numpy.float64] _radii: NDArray[numpy.float64]
""" Ellipse radii """ """ Ellipse radii """
@ -31,9 +27,15 @@ class Ellipse(Shape):
_rotation: float _rotation: float
""" Angle from x-axis to first radius (ccw, radians) """ """ Angle from x-axis to first radius (ccw, radians) """
poly_num_points: Optional[int]
""" Sets the default number of points for `.polygonize()` """
poly_max_arclen: Optional[float]
""" Sets the default max segement length for `.polygonize()` """
# radius properties # radius properties
@property @property
def radii(self) -> Any: # mypy#3004 NDArray[numpy.float64]: def radii(self) -> Any: #TODO mypy#3004 NDArray[numpy.float64]:
""" """
Return the radii `[rx, ry]` Return the radii `[rx, ry]`
""" """
@ -90,67 +92,64 @@ class Ellipse(Shape):
self, self,
radii: ArrayLike, radii: ArrayLike,
*, *,
poly_num_points: Optional[int] = DEFAULT_POLY_NUM_POINTS,
poly_max_arclen: Optional[float] = None,
offset: ArrayLike = (0.0, 0.0), offset: ArrayLike = (0.0, 0.0),
rotation: float = 0, rotation: float = 0,
repetition: Repetition | None = None, mirrored: Sequence[bool] = (False, False),
annotations: annotations_t | None = None, layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
raw: bool = False, raw: bool = False,
) -> None: ) -> None:
LockableImpl.unlock(self)
self.identifier = ()
if raw: if raw:
assert isinstance(radii, numpy.ndarray) assert(isinstance(radii, numpy.ndarray))
assert isinstance(offset, numpy.ndarray) assert(isinstance(offset, numpy.ndarray))
self._radii = radii self._radii = radii
self._offset = offset self._offset = offset
self._rotation = rotation self._rotation = rotation
self._repetition = repetition self._repetition = repetition
self._annotations = annotations if annotations is not None else {} self._annotations = annotations if annotations is not None else {}
self._layer = layer
self._dose = dose
else: else:
self.radii = radii self.radii = radii
self.offset = offset self.offset = offset
self.rotation = rotation self.rotation = rotation
self.repetition = repetition self.repetition = repetition
self.annotations = annotations if annotations is not None else {} self.annotations = annotations if annotations is not None else {}
self.layer = layer
self.dose = dose
[self.mirror(a) for a, do in enumerate(mirrored) if do]
self.poly_num_points = poly_num_points
self.poly_max_arclen = poly_max_arclen
self.set_locked(locked)
def __deepcopy__(self, memo: dict | None = None) -> Self: def __deepcopy__(self, memo: Dict = None) -> 'Ellipse':
memo = {} if memo is None else memo memo = {} if memo is None else memo
new = copy.copy(self) new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy() new._offset = self._offset.copy()
new._radii = self._radii.copy() new._radii = self._radii.copy()
new._annotations = copy.deepcopy(self._annotations) new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and numpy.array_equal(self.radii, other.radii)
and self.rotation == other.rotation
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Ellipse, other)
if not numpy.array_equal(self.radii, other.radii):
return tuple(self.radii) < tuple(other.radii)
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.rotation != other.rotation:
return self.rotation < other.rotation
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def to_polygons( def to_polygons(
self, self,
num_vertices: int | None = DEFAULT_POLY_NUM_VERTICES, poly_num_points: Optional[int] = None,
max_arclen: float | None = None, poly_max_arclen: Optional[float] = None,
) -> list[Polygon]: ) -> List[Polygon]:
if (num_vertices is None) and (max_arclen is None): if poly_num_points is None:
poly_num_points = self.poly_num_points
if poly_max_arclen is None:
poly_max_arclen = self.poly_max_arclen
if (poly_num_points is None) and (poly_max_arclen is None):
raise PatternError('Number of points and arclength left unspecified' raise PatternError('Number of points and arclength left unspecified'
' (default was also overridden)') ' (default was also overridden)')
@ -163,37 +162,37 @@ class Ellipse(Shape):
perimeter = pi * (r1 + r0) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h))) perimeter = pi * (r1 + r0) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))
n = [] n = []
if num_vertices is not None: if poly_num_points is not None:
n += [num_vertices] n += [poly_num_points]
if max_arclen is not None: if poly_max_arclen is not None:
n += [perimeter / max_arclen] n += [perimeter / poly_max_arclen]
num_vertices = int(round(max(n))) num_points = int(round(max(n)))
thetas = numpy.linspace(2 * pi, 0, num_vertices, endpoint=False) thetas = numpy.linspace(2 * pi, 0, num_points, endpoint=False)
sin_th, cos_th = (numpy.sin(thetas), numpy.cos(thetas)) sin_th, cos_th = (numpy.sin(thetas), numpy.cos(thetas))
xs = r0 * cos_th xs = r0 * cos_th
ys = r1 * sin_th ys = r1 * sin_th
xys = numpy.vstack((xs, ys)).T xys = numpy.vstack((xs, ys)).T
poly = Polygon(xys, offset=self.offset, rotation=self.rotation) poly = Polygon(xys, dose=self.dose, layer=self.layer, offset=self.offset, rotation=self.rotation)
return [poly] return [poly]
def get_bounds_single(self) -> NDArray[numpy.float64]: def get_bounds(self) -> NDArray[numpy.float64]:
rot_radii = numpy.dot(rotation_matrix_2d(self.rotation), self.radii) rot_radii = numpy.dot(rotation_matrix_2d(self.rotation), self.radii)
return numpy.vstack((self.offset - rot_radii[0], return numpy.vstack((self.offset - rot_radii[0],
self.offset + rot_radii[1])) self.offset + rot_radii[1]))
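
As a reference for the rotated-ellipse bounds discussed in `Arc.get_bounds`/`Ellipse.get_bounds`, the axis-aligned bounding half-extents of an ellipse with radii (a, b) rotated by φ have a simple closed form. The standalone sketch below (independent of the implementation in this diff) checks it by dense sampling:

```python
# Reference sketch: half-extents of the axis-aligned bounding box of an ellipse
# with radii (a, b) rotated by phi are
#   wx = sqrt(a^2 cos^2(phi) + b^2 sin^2(phi))
#   wy = sqrt(a^2 sin^2(phi) + b^2 cos^2(phi))
import numpy

a, b, phi = 10.0, 3.0, numpy.deg2rad(30.0)

wx = numpy.sqrt((a * numpy.cos(phi)) ** 2 + (b * numpy.sin(phi)) ** 2)
wy = numpy.sqrt((a * numpy.sin(phi)) ** 2 + (b * numpy.cos(phi)) ** 2)

t = numpy.linspace(0, 2 * numpy.pi, 100_000)
x = a * numpy.cos(t) * numpy.cos(phi) - b * numpy.sin(t) * numpy.sin(phi)
y = a * numpy.cos(t) * numpy.sin(phi) + b * numpy.sin(t) * numpy.cos(phi)

print(wx, x.max())   # both ~8.79
print(wy, y.max())   # both ~5.63
```
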
def rotate(self, theta: float) -> Self: def rotate(self, theta: float) -> 'Ellipse':
self.rotation += theta self.rotation += theta
return self return self
def mirror(self, axis: int = 0) -> Self: def mirror(self, axis: int) -> 'Ellipse':
self.offset[axis - 1] *= -1 self.offset[axis - 1] *= -1
self.rotation *= -1 self.rotation *= -1
self.rotation += axis * pi self.rotation += axis * pi
return self return self
def scale_by(self, c: float) -> Self: def scale_by(self, c: float) -> 'Ellipse':
self.radii *= c self.radii *= c
return self return self
@ -206,10 +205,22 @@ class Ellipse(Shape):
radii = self.radii[::-1] / self.radius_y radii = self.radii[::-1] / self.radius_y
scale = self.radius_y scale = self.radius_y
angle = (self.rotation + pi / 2) % pi angle = (self.rotation + pi / 2) % pi
return ((type(self), radii), return ((type(self), radii, self.layer),
(self.offset, scale / norm_value, angle, False), (self.offset, scale / norm_value, angle, False, self.dose),
lambda: Ellipse(radii=radii * norm_value)) lambda: Ellipse(radii=radii * norm_value, layer=self.layer))
def lock(self) -> 'Ellipse':
self.radii.flags.writeable = False
Shape.lock(self)
return self
def unlock(self) -> 'Ellipse':
Shape.unlock(self)
self.radii.flags.writeable = True
return self
def __repr__(self) -> str: def __repr__(self) -> str:
rotation = f' r{numpy.rad2deg(self.rotation):g}' if self.rotation != 0 else '' rotation = f' r{self.rotation*180/pi:g}' if self.rotation != 0 else ''
return f'<Ellipse o{self.offset} r{self.radii}{rotation}>' dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<Ellipse l{self.layer} o{self.offset} r{self.radii}{rotation}{dose}{locked}>'
@ -1,7 +1,5 @@
from typing import Any, cast from typing import List, Tuple, Dict, Optional, Sequence, Any
from collections.abc import Sequence
import copy import copy
import functools
from enum import Enum from enum import Enum
import numpy import numpy
@ -9,13 +7,13 @@ from numpy import pi, inf
from numpy.typing import NDArray, ArrayLike from numpy.typing import NDArray, ArrayLike
from . import Shape, normalized_shape_tuple, Polygon, Circle from . import Shape, normalized_shape_tuple, Polygon, Circle
from ..error import PatternError from .. import PatternError
from ..repetition import Repetition from ..repetition import Repetition
from ..utils import is_scalar, rotation_matrix_2d, annotations_lt, annotations_eq, rep2key from ..utils import is_scalar, rotation_matrix_2d, layer_t, AutoSlots
from ..utils import remove_colinear_vertices, remove_duplicate_vertices, annotations_t from ..utils import remove_colinear_vertices, remove_duplicate_vertices, annotations_t
from ..traits import LockableImpl
@functools.total_ordering
class PathCap(Enum): class PathCap(Enum):
Flush = 0 # Path ends at final vertices Flush = 0 # Path ends at final vertices
Circle = 1 # Path extends past final vertices with a semicircle of radius width/2 Circle = 1 # Path extends past final vertices with a semicircle of radius width/2
@ -23,29 +21,19 @@ class PathCap(Enum):
SquareCustom = 4 # Path extends past final vertices with a rectangle of length SquareCustom = 4 # Path extends past final vertices with a rectangle of length
# # defined by path.cap_extensions # # defined by path.cap_extensions
def __lt__(self, other: Any) -> bool:
return self.value == other.value
class Path(Shape, metaclass=AutoSlots):
@functools.total_ordering
class Path(Shape):
""" """
A path, consisting of a bunch of vertices (Nx2 ndarray), a width, an end-cap shape, A path, consisting of a bunch of vertices (Nx2 ndarray), a width, an end-cap shape,
and an offset. and an offset.
Note that the setter for `Path.vertices` will create a copy of the passed vertex coordinates.
A normalized_form(...) is available, but can be quite slow with lots of vertices. A normalized_form(...) is available, but can be quite slow with lots of vertices.
""" """
__slots__ = ( __slots__ = ('_vertices', '_width', '_cap', '_cap_extensions')
'_vertices', '_width', '_cap', '_cap_extensions',
# Inherited
'_offset', '_repetition', '_annotations',
)
_vertices: NDArray[numpy.float64] _vertices: NDArray[numpy.float64]
_width: float _width: float
_cap: PathCap _cap: PathCap
_cap_extensions: NDArray[numpy.float64] | None _cap_extensions: Optional[NDArray[numpy.float64]]
Cap = PathCap Cap = PathCap
@ -70,14 +58,12 @@ class Path(Shape):
def cap(self) -> PathCap: def cap(self) -> PathCap:
""" """
Path end-cap Path end-cap
Note that `cap_extensions` will be reset to default values if
`cap` is changed away from `PathCap.SquareCustom`.
""" """
return self._cap return self._cap
@cap.setter @cap.setter
def cap(self, val: PathCap) -> None: def cap(self, val: PathCap) -> None:
# TODO: Document that setting cap can change cap_extensions
self._cap = PathCap(val) self._cap = PathCap(val)
if self.cap != PathCap.SquareCustom: if self.cap != PathCap.SquareCustom:
self.cap_extensions = None self.cap_extensions = None
@ -87,43 +73,38 @@ class Path(Shape):
# cap_extensions property # cap_extensions property
@property @property
def cap_extensions(self) -> Any | None: # mypy#3004 NDArray[numpy.float64]]: def cap_extensions(self) -> Optional[Any]: #TODO mypy#3004 NDArray[numpy.float64]]:
""" """
Path end-cap extension Path end-cap extension
Note that `cap_extensions` will be reset to default values if
`cap` is changed away from `PathCap.SquareCustom`.
Returns: Returns:
2-element ndarray or `None` 2-element ndarray or `None`
""" """
return self._cap_extensions return self._cap_extensions
@cap_extensions.setter @cap_extensions.setter
def cap_extensions(self, vals: ArrayLike | None) -> None: def cap_extensions(self, vals: Optional[ArrayLike]) -> None:
custom_caps = (PathCap.SquareCustom,) custom_caps = (PathCap.SquareCustom,)
if self.cap in custom_caps: if self.cap in custom_caps:
if vals is None: if vals is None:
raise PatternError('Tried to set cap extensions to None on path with custom cap type') raise Exception('Tried to set cap extensions to None on path with custom cap type')
self._cap_extensions = numpy.array(vals, dtype=float) self._cap_extensions = numpy.array(vals, dtype=float)
else: else:
if vals is not None: if vals is not None:
raise PatternError('Tried to set custom cap extensions on path with non-custom cap type') raise Exception('Tried to set custom cap extensions on path with non-custom cap type')
self._cap_extensions = vals self._cap_extensions = vals
# vertices property # vertices property
@property @property
def vertices(self) -> Any: # mypy#3004 NDArray[numpy.float64]]: def vertices(self) -> Any: #TODO mypy#3004 NDArray[numpy.float64]]:
""" """
Vertices of the path (Nx2 ndarray: `[[x0, y0], [x1, y1], ...]`) Vertices of the path (Nx2 ndarray: `[[x0, y0], [x1, y1], ...]`)
When setting, note that a copy of the provided vertices will be made.
""" """
return self._vertices return self._vertices
@vertices.setter @vertices.setter
def vertices(self, val: ArrayLike) -> None: def vertices(self, val: ArrayLike) -> None:
val = numpy.array(val, dtype=float) val = numpy.array(val, dtype=float) # TODO document that these might not be copied
if len(val.shape) < 2 or val.shape[1] != 2: if len(val.shape) < 2 or val.shape[1] != 2:
raise PatternError('Vertices must be an Nx2 array') raise PatternError('Vertices must be an Nx2 array')
if val.shape[0] < 2: if val.shape[0] < 2:
@ -166,23 +147,31 @@ class Path(Shape):
width: float = 0.0, width: float = 0.0,
*, *,
cap: PathCap = PathCap.Flush, cap: PathCap = PathCap.Flush,
cap_extensions: ArrayLike | None = None, cap_extensions: Optional[ArrayLike] = None,
offset: ArrayLike = (0.0, 0.0), offset: ArrayLike = (0.0, 0.0),
rotation: float = 0, rotation: float = 0,
repetition: Repetition | None = None, mirrored: Sequence[bool] = (False, False),
annotations: annotations_t | None = None, layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
raw: bool = False, raw: bool = False,
) -> None: ) -> None:
LockableImpl.unlock(self)
self._cap_extensions = None # Since .cap setter might access it self._cap_extensions = None # Since .cap setter might access it
self.identifier = ()
if raw: if raw:
assert isinstance(vertices, numpy.ndarray) assert(isinstance(vertices, numpy.ndarray))
assert isinstance(offset, numpy.ndarray) assert(isinstance(offset, numpy.ndarray))
assert isinstance(cap_extensions, numpy.ndarray) or cap_extensions is None assert(isinstance(cap_extensions, numpy.ndarray) or cap_extensions is None)
self._vertices = vertices self._vertices = vertices
self._offset = offset self._offset = offset
self._repetition = repetition self._repetition = repetition
self._annotations = annotations if annotations is not None else {} self._annotations = annotations if annotations is not None else {}
self._layer = layer
self._dose = dose
self._width = width self._width = width
self._cap = cap self._cap = cap
self._cap_extensions = cap_extensions self._cap_extensions = cap_extensions
@ -191,63 +180,38 @@ class Path(Shape):
self.offset = offset self.offset = offset
self.repetition = repetition self.repetition = repetition
self.annotations = annotations if annotations is not None else {} self.annotations = annotations if annotations is not None else {}
self.layer = layer
self.dose = dose
self.width = width self.width = width
self.cap = cap self.cap = cap
self.cap_extensions = cap_extensions self.cap_extensions = cap_extensions
self.rotate(rotation) self.rotate(rotation)
[self.mirror(a) for a, do in enumerate(mirrored) if do]
self.set_locked(locked)
def __deepcopy__(self, memo: dict | None = None) -> 'Path': def __deepcopy__(self, memo: Dict = None) -> 'Path':
memo = {} if memo is None else memo memo = {} if memo is None else memo
new = copy.copy(self) new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy() new._offset = self._offset.copy()
new._vertices = self._vertices.copy() new._vertices = self._vertices.copy()
new._cap = copy.deepcopy(self._cap, memo) new._cap = copy.deepcopy(self._cap, memo)
new._cap_extensions = copy.deepcopy(self._cap_extensions, memo) new._cap_extensions = copy.deepcopy(self._cap_extensions, memo)
new._annotations = copy.deepcopy(self._annotations) new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and numpy.array_equal(self.vertices, other.vertices)
and self.width == other.width
and self.cap == other.cap
and numpy.array_equal(self.cap_extensions, other.cap_extensions) # type: ignore
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Path, other)
if self.width != other.width:
return self.width < other.width
if self.cap != other.cap:
return self.cap < other.cap
if not numpy.array_equal(self.cap_extensions, other.cap_extensions): # type: ignore
if other.cap_extensions is None:
return False
if self.cap_extensions is None:
return True
return tuple(self.cap_extensions) < tuple(other.cap_extensions)
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
@staticmethod @staticmethod
def travel( def travel(
travel_pairs: Sequence[tuple[float, float]], travel_pairs: Sequence[Tuple[float, float]],
width: float = 0.0, width: float = 0.0,
cap: PathCap = PathCap.Flush, cap: PathCap = PathCap.Flush,
cap_extensions: tuple[float, float] | None = None, cap_extensions: Optional[Tuple[float, float]] = None,
offset: ArrayLike = (0.0, 0.0), offset: ArrayLike = (0.0, 0.0),
rotation: float = 0, rotation: float = 0,
mirrored: Sequence[bool] = (False, False),
layer: layer_t = 0,
dose: float = 1.0,
) -> 'Path': ) -> 'Path':
""" """
Build a path by specifying the turn angles and travel distances Build a path by specifying the turn angles and travel distances
@ -264,11 +228,16 @@ class Path(Shape):
Default `(0, 0)` or `None`, depending on cap type Default `(0, 0)` or `None`, depending on cap type
offset: Offset, default `(0, 0)` offset: Offset, default `(0, 0)`
rotation: Rotation counterclockwise, in radians. Default `0` rotation: Rotation counterclockwise, in radians. Default `0`
mirrored: Whether to mirror across the x or y axes. For example,
`mirrored=(True, False)` results in a reflection across the x-axis,
multiplying the path's y-coordinates by -1. Default `(False, False)`
layer: Layer, default `0`
dose: Dose, default `1.0`
Returns: Returns:
The resulting Path object The resulting Path object
""" """
# TODO: Path.travel() needs testing #TODO: needs testing
direction = numpy.array([1, 0]) direction = numpy.array([1, 0])
verts = [numpy.zeros(2)] verts = [numpy.zeros(2)]
@ -277,13 +246,14 @@ class Path(Shape):
verts.append(verts[-1] + direction * distance) verts.append(verts[-1] + direction * distance)
return Path(vertices=verts, width=width, cap=cap, cap_extensions=cap_extensions, return Path(vertices=verts, width=width, cap=cap, cap_extensions=cap_extensions,
offset=offset, rotation=rotation) offset=offset, rotation=rotation, mirrored=mirrored,
layer=layer, dose=dose)
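
The travel construction above walks a heading vector through successive (turn angle, distance) pairs. A standalone sketch of that vertex generation (mirroring the loop shown, with a local `rotation_matrix_2d` helper; the real `Path.travel` may differ in details and is flagged above as needing tests):

```python
# Standalone sketch of the travel-style construction: each (turn_angle, distance)
# pair rotates the current heading, then advances by the given distance.
import numpy

def rotation_matrix_2d(theta: float) -> numpy.ndarray:
    return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
                        [numpy.sin(theta),  numpy.cos(theta)]])

def travel_vertices(travel_pairs) -> numpy.ndarray:
    direction = numpy.array([1.0, 0.0])
    verts = [numpy.zeros(2)]
    for angle, distance in travel_pairs:
        direction = rotation_matrix_2d(angle) @ direction
        verts.append(verts[-1] + direction * distance)
    return numpy.array(verts)

# go right 10, turn left 90 degrees and go 5, turn right 45 degrees and go 2
print(travel_vertices([(0, 10), (numpy.pi / 2, 5), (-numpy.pi / 4, 2)]))
```
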
def to_polygons( def to_polygons(
self, self,
num_vertices: int | None = None, poly_num_points: int = None,
max_arclen: float | None = None, poly_max_arclen: float = None,
) -> list['Polygon']: ) -> List['Polygon']:
extensions = self._calculate_cap_extensions() extensions = self._calculate_cap_extensions()
v = remove_colinear_vertices(self.vertices, closed_path=False) v = remove_colinear_vertices(self.vertices, closed_path=False)
@ -292,7 +262,7 @@ class Path(Shape):
if self.width == 0: if self.width == 0:
verts = numpy.vstack((v, v[::-1])) verts = numpy.vstack((v, v[::-1]))
return [Polygon(offset=self.offset, vertices=verts)] return [Polygon(offset=self.offset, vertices=verts, dose=self.dose, layer=self.layer)]
perp = dvdir[:, ::-1] * [[1, -1]] * self.width / 2 perp = dvdir[:, ::-1] * [[1, -1]] * self.width / 2
@ -343,33 +313,31 @@ class Path(Shape):
o1.append(v[-1] - perp[-1]) o1.append(v[-1] - perp[-1])
verts = numpy.vstack((o0, o1[::-1])) verts = numpy.vstack((o0, o1[::-1]))
polys = [Polygon(offset=self.offset, vertices=verts)] polys = [Polygon(offset=self.offset, vertices=verts, dose=self.dose, layer=self.layer)]
if self.cap == PathCap.Circle: if self.cap == PathCap.Circle:
#for vert in v: # not sure if every vertex, or just ends? #for vert in v: # not sure if every vertex, or just ends?
for vert in [v[0], v[-1]]: for vert in [v[0], v[-1]]:
circ = Circle(offset=vert, radius=self.width / 2) circ = Circle(offset=vert, radius=self.width / 2, dose=self.dose, layer=self.layer)
polys += circ.to_polygons(num_vertices=num_vertices, max_arclen=max_arclen) polys += circ.to_polygons(poly_num_points=poly_num_points, poly_max_arclen=poly_max_arclen)
return polys return polys
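
The outline construction above offsets each centerline segment by ±width/2 along its perpendicular (`dvdir[:, ::-1] * [[1, -1]]` is a 90° rotation). A standalone sketch for the degenerate case of a single straight segment, which skips the joint handling the real code performs:

```python
# Standalone sketch: turn a 2-vertex centerline of width `width` into a 4-vertex
# outline by offsetting perpendicular to the segment direction.
import numpy

v = numpy.array([[0.0, 0.0], [10.0, 0.0]])                 # centerline vertices
width = 2.0

dv = numpy.diff(v, axis=0)
dvdir = dv / numpy.sqrt((dv * dv).sum(axis=1))[:, None]    # unit direction per segment
perp = dvdir[:, ::-1] * [[1, -1]] * width / 2              # 90-degree rotation, scaled

outline = numpy.vstack((v + perp, (v - perp)[::-1]))
print(outline)
# [[ 0. -1.] [10. -1.] [10.  1.] [ 0.  1.]]
```
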
def get_bounds_single(self) -> NDArray[numpy.float64]: def get_bounds(self) -> NDArray[numpy.float64]:
if self.cap == PathCap.Circle: if self.cap == PathCap.Circle:
bounds = self.offset + numpy.vstack((numpy.min(self.vertices, axis=0) - self.width / 2, bounds = self.offset + numpy.vstack((numpy.min(self.vertices, axis=0) - self.width / 2,
numpy.max(self.vertices, axis=0) + self.width / 2)) numpy.max(self.vertices, axis=0) + self.width / 2))
elif self.cap in ( elif self.cap in (PathCap.Flush,
PathCap.Flush,
PathCap.Square, PathCap.Square,
PathCap.SquareCustom, PathCap.SquareCustom):
):
bounds = numpy.array([[+inf, +inf], [-inf, -inf]]) bounds = numpy.array([[+inf, +inf], [-inf, -inf]])
polys = self.to_polygons() polys = self.to_polygons()
for poly in polys: for poly in polys:
poly_bounds = poly.get_bounds_single_nonempty() poly_bounds = poly.get_bounds_nonempty()
bounds[0, :] = numpy.minimum(bounds[0, :], poly_bounds[0, :]) bounds[0, :] = numpy.minimum(bounds[0, :], poly_bounds[0, :])
bounds[1, :] = numpy.maximum(bounds[1, :], poly_bounds[1, :]) bounds[1, :] = numpy.maximum(bounds[1, :], poly_bounds[1, :])
else: else:
raise PatternError(f'get_bounds_single() not implemented for endcaps: {self.cap}') raise PatternError(f'get_bounds() not implemented for endcaps: {self.cap}')
return bounds return bounds
@ -378,7 +346,7 @@ class Path(Shape):
self.vertices = numpy.dot(rotation_matrix_2d(theta), self.vertices.T).T self.vertices = numpy.dot(rotation_matrix_2d(theta), self.vertices.T).T
return self return self
def mirror(self, axis: int = 0) -> 'Path': def mirror(self, axis: int) -> 'Path':
self.vertices[:, axis - 1] *= -1 self.vertices[:, axis - 1] *= -1
return self return self
@ -405,18 +373,15 @@ class Path(Shape):
x_min = rotated_vertices[:, 0].argmin() x_min = rotated_vertices[:, 0].argmin()
if not is_scalar(x_min): if not is_scalar(x_min):
y_min = rotated_vertices[x_min, 1].argmin() y_min = rotated_vertices[x_min, 1].argmin()
x_min = cast(Sequence, x_min)[y_min] x_min = x_min[y_min]
reordered_vertices = numpy.roll(rotated_vertices, -x_min, axis=0) reordered_vertices = numpy.roll(rotated_vertices, -x_min, axis=0)
width0 = self.width / norm_value width0 = self.width / norm_value
return ((type(self), reordered_vertices.data.tobytes(), width0, self.cap), return ((type(self), reordered_vertices.data.tobytes(), width0, self.cap, self.layer),
(offset, scale / norm_value, rotation, False), (offset, scale / norm_value, rotation, False, self.dose),
lambda: Path( lambda: Path(reordered_vertices * norm_value, width=self.width * norm_value,
reordered_vertices * norm_value, cap=self.cap, layer=self.layer))
width=self.width * norm_value,
cap=self.cap,
))
def clean_vertices(self) -> 'Path': def clean_vertices(self) -> 'Path':
""" """
@ -429,22 +394,22 @@ class Path(Shape):
return self return self
def remove_duplicate_vertices(self) -> 'Path': def remove_duplicate_vertices(self) -> 'Path':
""" '''
Removes all consecutive duplicate (repeated) vertices. Removes all consecutive duplicate (repeated) vertices.
Returns: Returns:
self self
""" '''
self.vertices = remove_duplicate_vertices(self.vertices, closed_path=False) self.vertices = remove_duplicate_vertices(self.vertices, closed_path=False)
return self return self
def remove_colinear_vertices(self) -> 'Path': def remove_colinear_vertices(self) -> 'Path':
""" '''
Removes consecutive co-linear vertices. Removes consecutive co-linear vertices.
Returns: Returns:
self self
""" '''
self.vertices = remove_colinear_vertices(self.vertices, closed_path=False) self.vertices = remove_colinear_vertices(self.vertices, closed_path=False)
return self return self
@ -452,13 +417,29 @@ class Path(Shape):
if self.cap == PathCap.Square: if self.cap == PathCap.Square:
extensions = numpy.full(2, self.width / 2) extensions = numpy.full(2, self.width / 2)
elif self.cap == PathCap.SquareCustom: elif self.cap == PathCap.SquareCustom:
assert isinstance(self.cap_extensions, numpy.ndarray) assert(isinstance(self.cap_extensions, numpy.ndarray))
extensions = self.cap_extensions extensions = self.cap_extensions
else: else:
# Flush or Circle # Flush or Circle
extensions = numpy.zeros(2) extensions = numpy.zeros(2)
return extensions return extensions
def lock(self) -> 'Path':
self.vertices.flags.writeable = False
if self.cap_extensions is not None:
self.cap_extensions.flags.writeable = False
Shape.lock(self)
return self
def unlock(self) -> 'Path':
Shape.unlock(self)
self.vertices.flags.writeable = True
if self.cap_extensions is not None:
self.cap_extensions.flags.writeable = True
return self
def __repr__(self) -> str: def __repr__(self) -> str:
centroid = self.offset + self.vertices.mean(axis=0) centroid = self.offset + self.vertices.mean(axis=0)
return f'<Path centroid {centroid} v{len(self.vertices)} w{self.width} c{self.cap}>' dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<Path l{self.layer} centroid {centroid} v{len(self.vertices)} w{self.width} c{self.cap}{dose}{locked}>'
@ -1,52 +1,41 @@
from typing import Any, cast from typing import List, Dict, Optional, Sequence, Any
from collections.abc import Sequence
import copy import copy
import functools
import numpy import numpy
from numpy import pi from numpy import pi
from numpy.typing import NDArray, ArrayLike from numpy.typing import NDArray, ArrayLike
from . import Shape, normalized_shape_tuple from . import Shape, normalized_shape_tuple
from ..error import PatternError from .. import PatternError
from ..repetition import Repetition from ..repetition import Repetition
from ..utils import is_scalar, rotation_matrix_2d, annotations_lt, annotations_eq, rep2key from ..utils import is_scalar, rotation_matrix_2d, layer_t, AutoSlots
from ..utils import remove_colinear_vertices, remove_duplicate_vertices, annotations_t from ..utils import remove_colinear_vertices, remove_duplicate_vertices, annotations_t
from ..traits import LockableImpl
@functools.total_ordering class Polygon(Shape, metaclass=AutoSlots):
class Polygon(Shape):
""" """
A polygon, consisting of a bunch of vertices (Nx2 ndarray) which specify an A polygon, consisting of a bunch of vertices (Nx2 ndarray) which specify an
implicitly-closed boundary, and an offset. implicitly-closed boundary, and an offset.
Note that the setter for `Polygon.vertices` creates a copy of the
passed vertex coordinates.
A `normalized_form(...)` is available, but can be quite slow with lots of vertices. A `normalized_form(...)` is available, but can be quite slow with lots of vertices.
""" """
__slots__ = ( __slots__ = ('_vertices',)
'_vertices',
# Inherited
'_offset', '_repetition', '_annotations',
)
_vertices: NDArray[numpy.float64] _vertices: NDArray[numpy.float64]
""" Nx2 ndarray of vertices `[[x0, y0], [x1, y1], ...]` """ """ Nx2 ndarray of vertices `[[x0, y0], [x1, y1], ...]` """
# vertices property # vertices property
@property @property
def vertices(self) -> Any: # mypy#3004 NDArray[numpy.float64]: def vertices(self) -> Any: #TODO mypy#3004 NDArray[numpy.float64]:
""" """
Vertices of the polygon (Nx2 ndarray: `[[x0, y0], [x1, y1], ...]`) Vertices of the polygon (Nx2 ndarray: `[[x0, y0], [x1, y1], ...]`)
When setting, note that a copy of the provided vertices will be made.
""" """
return self._vertices return self._vertices
@vertices.setter @vertices.setter
def vertices(self, val: ArrayLike) -> None: def vertices(self, val: ArrayLike) -> None:
val = numpy.array(val, dtype=float) val = numpy.array(val, dtype=float) # TODO document that these might not be copied
if len(val.shape) < 2 or val.shape[1] != 2: if len(val.shape) < 2 or val.shape[1] != 2:
raise PatternError('Vertices must be an Nx2 array') raise PatternError('Vertices must be an Nx2 array')
if val.shape[0] < 3: if val.shape[0] < 3:
@ -89,68 +78,55 @@ class Polygon(Shape):
*, *,
offset: ArrayLike = (0.0, 0.0), offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0, rotation: float = 0.0,
repetition: Repetition | None = None, mirrored: Sequence[bool] = (False, False),
annotations: annotations_t | None = None, layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
raw: bool = False, raw: bool = False,
) -> None: ) -> None:
LockableImpl.unlock(self)
self.identifier = ()
if raw: if raw:
assert isinstance(vertices, numpy.ndarray) assert(isinstance(vertices, numpy.ndarray))
assert isinstance(offset, numpy.ndarray) assert(isinstance(offset, numpy.ndarray))
self._vertices = vertices self._vertices = vertices
self._offset = offset self._offset = offset
self._repetition = repetition self._repetition = repetition
self._annotations = annotations if annotations is not None else {} self._annotations = annotations if annotations is not None else {}
self._layer = layer
self._dose = dose
else: else:
self.vertices = vertices self.vertices = vertices
self.offset = offset self.offset = offset
self.repetition = repetition self.repetition = repetition
self.annotations = annotations if annotations is not None else {} self.annotations = annotations if annotations is not None else {}
self.layer = layer
self.dose = dose
self.rotate(rotation) self.rotate(rotation)
[self.mirror(a) for a, do in enumerate(mirrored) if do]
self.set_locked(locked)
def __deepcopy__(self, memo: dict | None = None) -> 'Polygon': def __deepcopy__(self, memo: Optional[Dict] = None) -> 'Polygon':
memo = {} if memo is None else memo memo = {} if memo is None else memo
new = copy.copy(self) new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy() new._offset = self._offset.copy()
new._vertices = self._vertices.copy() new._vertices = self._vertices.copy()
new._annotations = copy.deepcopy(self._annotations) new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and numpy.array_equal(self.vertices, other.vertices)
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Polygon, other)
if not numpy.array_equal(self.vertices, other.vertices):
min_len = min(self.vertices.shape[0], other.vertices.shape[0])
eq_mask = self.vertices[:min_len] != other.vertices[:min_len]
eq_lt = self.vertices[:min_len] < other.vertices[:min_len]
eq_lt_masked = eq_lt[eq_mask]
if eq_lt_masked.size > 0:
return eq_lt_masked.flat[0]
return self.vertices.shape[0] < other.vertices.shape[0]
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
@staticmethod @staticmethod
def square( def square(
side_length: float, side_length: float,
*, *,
rotation: float = 0.0, rotation: float = 0.0,
offset: ArrayLike = (0.0, 0.0), offset: ArrayLike = (0.0, 0.0),
repetition: Repetition | None = None, layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
) -> 'Polygon': ) -> 'Polygon':
""" """
Draw a square given side_length, centered on the origin. Draw a square given side_length, centered on the origin.
@ -159,6 +135,8 @@ class Polygon(Shape):
side_length: Length of one side side_length: Length of one side
rotation: Rotation counterclockwise, in radians rotation: Rotation counterclockwise, in radians
offset: Offset, default `(0, 0)` offset: Offset, default `(0, 0)`
layer: Layer, default `0`
dose: Dose, default `1.0`
repetition: `Repetition` object, default `None` repetition: `Repetition` object, default `None`
Returns: Returns:
@ -169,7 +147,8 @@ class Polygon(Shape):
[+1, +1], [+1, +1],
[+1, -1]], dtype=float) [+1, -1]], dtype=float)
vertices = 0.5 * side_length * norm_square vertices = 0.5 * side_length * norm_square
poly = Polygon(vertices, offset=offset, repetition=repetition) poly = Polygon(vertices, offset=offset, layer=layer, dose=dose,
repetition=repetition)
poly.rotate(rotation) poly.rotate(rotation)
return poly return poly
@ -180,7 +159,9 @@ class Polygon(Shape):
*, *,
rotation: float = 0, rotation: float = 0,
offset: ArrayLike = (0.0, 0.0), offset: ArrayLike = (0.0, 0.0),
repetition: Repetition | None = None, layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
) -> 'Polygon': ) -> 'Polygon':
""" """
Draw a rectangle with side lengths lx and ly, centered on the origin. Draw a rectangle with side lengths lx and ly, centered on the origin.
@ -190,6 +171,8 @@ class Polygon(Shape):
ly: Length along y (before rotation) ly: Length along y (before rotation)
rotation: Rotation counterclockwise, in radians rotation: Rotation counterclockwise, in radians
offset: Offset, default `(0, 0)` offset: Offset, default `(0, 0)`
layer: Layer, default `0`
dose: Dose, default `1.0`
repetition: `Repetition` object, default `None` repetition: `Repetition` object, default `None`
Returns: Returns:
@ -199,22 +182,25 @@ class Polygon(Shape):
[-lx, +ly], [-lx, +ly],
[+lx, +ly], [+lx, +ly],
[+lx, -ly]], dtype=float) [+lx, -ly]], dtype=float)
poly = Polygon(vertices, offset=offset, repetition=repetition) poly = Polygon(vertices, offset=offset, layer=layer, dose=dose,
repetition=repetition)
poly.rotate(rotation) poly.rotate(rotation)
return poly return poly
@staticmethod @staticmethod
def rect( def rect(
*, *,
xmin: float | None = None, xmin: Optional[float] = None,
xctr: float | None = None, xctr: Optional[float] = None,
xmax: float | None = None, xmax: Optional[float] = None,
lx: float | None = None, lx: Optional[float] = None,
ymin: float | None = None, ymin: Optional[float] = None,
yctr: float | None = None, yctr: Optional[float] = None,
ymax: float | None = None, ymax: Optional[float] = None,
ly: float | None = None, ly: Optional[float] = None,
repetition: Repetition | None = None, layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
) -> 'Polygon': ) -> 'Polygon':
""" """
Draw a rectangle by specifying side/center positions. Draw a rectangle by specifying side/center positions.
@ -231,6 +217,8 @@ class Polygon(Shape):
yctr: Center y coordinate yctr: Center y coordinate
ymax: Maximum y coordinate ymax: Maximum y coordinate
ly: Length along y direction ly: Length along y direction
layer: Layer, default `0`
dose: Dose, default `1.0`
repetition: `Repetition` object, default `None` repetition: `Repetition` object, default `None`
Returns: Returns:
@ -238,76 +226,79 @@ class Polygon(Shape):
""" """
if lx is None: if lx is None:
if xctr is None: if xctr is None:
assert xmin is not None assert(xmin is not None)
assert xmax is not None assert(xmax is not None)
xctr = 0.5 * (xmax + xmin) xctr = 0.5 * (xmax + xmin)
lx = xmax - xmin lx = xmax - xmin
elif xmax is None: elif xmax is None:
assert xmin is not None assert(xmin is not None)
assert xctr is not None assert(xctr is not None)
lx = 2 * (xctr - xmin) lx = 2 * (xctr - xmin)
elif xmin is None: elif xmin is None:
assert xctr is not None assert(xctr is not None)
assert xmax is not None assert(xmax is not None)
lx = 2 * (xmax - xctr) lx = 2 * (xmax - xctr)
else: else:
raise PatternError('Two of xmin, xctr, xmax, lx must be None!') raise PatternError('Two of xmin, xctr, xmax, lx must be None!')
else: # noqa: PLR5501 else:
if xctr is not None: if xctr is not None:
pass pass
elif xmax is None: elif xmax is None:
assert xmin is not None assert(xmin is not None)
assert lx is not None assert(lx is not None)
xctr = xmin + 0.5 * lx xctr = xmin + 0.5 * lx
elif xmin is None: elif xmin is None:
assert xmax is not None assert(xmax is not None)
assert lx is not None assert(lx is not None)
xctr = xmax - 0.5 * lx xctr = xmax - 0.5 * lx
else: else:
raise PatternError('Two of xmin, xctr, xmax, lx must be None!') raise PatternError('Two of xmin, xctr, xmax, lx must be None!')
if ly is None: if ly is None:
if yctr is None: if yctr is None:
assert ymin is not None assert(ymin is not None)
assert ymax is not None assert(ymax is not None)
yctr = 0.5 * (ymax + ymin) yctr = 0.5 * (ymax + ymin)
ly = ymax - ymin ly = ymax - ymin
elif ymax is None: elif ymax is None:
assert ymin is not None assert(ymin is not None)
assert yctr is not None assert(yctr is not None)
ly = 2 * (yctr - ymin) ly = 2 * (yctr - ymin)
elif ymin is None: elif ymin is None:
assert yctr is not None assert(yctr is not None)
assert ymax is not None assert(ymax is not None)
ly = 2 * (ymax - yctr) ly = 2 * (ymax - yctr)
else: else:
raise PatternError('Two of ymin, yctr, ymax, ly must be None!') raise PatternError('Two of ymin, yctr, ymax, ly must be None!')
else: # noqa: PLR5501 else:
if yctr is not None: if yctr is not None:
pass pass
elif ymax is None: elif ymax is None:
assert ymin is not None assert(ymin is not None)
assert ly is not None assert(ly is not None)
yctr = ymin + 0.5 * ly yctr = ymin + 0.5 * ly
elif ymin is None: elif ymin is None:
assert ly is not None assert(ly is not None)
assert ymax is not None assert(ymax is not None)
yctr = ymax - 0.5 * ly yctr = ymax - 0.5 * ly
else: else:
raise PatternError('Two of ymin, yctr, ymax, ly must be None!') raise PatternError('Two of ymin, yctr, ymax, ly must be None!')
poly = Polygon.rectangle(lx, ly, offset=(xctr, yctr), repetition=repetition) poly = Polygon.rectangle(lx, ly, offset=(xctr, yctr),
layer=layer, dose=dose, repetition=repetition)
return poly return poly
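For orientation, a minimal usage sketch of the rectangle-by-sides constructor above; the method name `Polygon.rect` is assumed here (it falls outside this hunk), and exactly two of each coordinate group may be specified:

```python
from masque.shapes import Polygon

# Hypothetical usage: a 10 x 5 rectangle spanning x in [0, 10], centered at y = 0.
# Two of (xmin, xctr, xmax, lx) and two of (ymin, yctr, ymax, ly) are given; the rest are derived.
rect = Polygon.rect(xmin=0, xmax=10, yctr=0, ly=5)
```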
@staticmethod @staticmethod
def octagon( def octagon(
*, *,
side_length: float | None = None, side_length: Optional[float] = None,
inner_radius: float | None = None, inner_radius: Optional[float] = None,
regular: bool = True, regular: bool = True,
center: ArrayLike = (0.0, 0.0), center: ArrayLike = (0.0, 0.0),
rotation: float = 0.0, rotation: float = 0.0,
repetition: Repetition | None = None, layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
) -> 'Polygon': ) -> 'Polygon':
""" """
Draw an octagon given one of (side length, inradius, circumradius). Draw an octagon given one of (side length, inradius, circumradius).
@ -325,12 +316,17 @@ class Polygon(Shape):
rotation: Rotation counterclockwise, in radians. rotation: Rotation counterclockwise, in radians.
`0` results in four axis-aligned sides (the long sides of the `0` results in four axis-aligned sides (the long sides of the
irregular octagon). irregular octagon).
layer: Layer, default `0`
dose: Dose, default `1.0`
repetition: `Repetition` object, default `None` repetition: `Repetition` object, default `None`
Returns: Returns:
A Polygon object containing the requested octagon A Polygon object containing the requested octagon
""" """
s = (1 + numpy.sqrt(2)) if regular else 2 if regular:
s = 1 + numpy.sqrt(2)
else:
s = 2
norm_oct = numpy.array([ norm_oct = numpy.array([
[-1, -s], [-1, -s],
@ -348,18 +344,19 @@ class Polygon(Shape):
side_length = 2 * inner_radius / s side_length = 2 * inner_radius / s
vertices = 0.5 * side_length * norm_oct vertices = 0.5 * side_length * norm_oct
poly = Polygon(vertices, offset=center, repetition=repetition) poly = Polygon(vertices, offset=center, layer=layer, dose=dose, repetition=repetition)
poly.rotate(rotation) poly.rotate(rotation)
return poly return poly
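A short, hedged sketch of calling the octagon constructor shown above (only one of `side_length` / `inner_radius` should be supplied):

```python
from numpy import pi
from masque.shapes import Polygon

# Hypothetical usage: regular octagon with inradius 5, rotated 22.5 degrees about its center
octagon = Polygon.octagon(inner_radius=5.0, regular=True, rotation=pi / 8)
```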
def to_polygons( def to_polygons(
self, self,
num_vertices: int | None = None, # unused # noqa: ARG002 poly_num_points: int = None, # unused
max_arclen: float | None = None, # unused # noqa: ARG002 poly_max_arclen: float = None, # unused
) -> list['Polygon']: ) -> List['Polygon']:
return [copy.deepcopy(self)] return [copy.deepcopy(self)]
def get_bounds_single(self) -> NDArray[numpy.float64]: # TODO note shape get_bounds doesn't include repetition def get_bounds(self) -> NDArray[numpy.float64]:
return numpy.vstack((self.offset + numpy.min(self.vertices, axis=0), return numpy.vstack((self.offset + numpy.min(self.vertices, axis=0),
self.offset + numpy.max(self.vertices, axis=0))) self.offset + numpy.max(self.vertices, axis=0)))
@ -368,7 +365,7 @@ class Polygon(Shape):
self.vertices = numpy.dot(rotation_matrix_2d(theta), self.vertices.T).T self.vertices = numpy.dot(rotation_matrix_2d(theta), self.vertices.T).T
return self return self
def mirror(self, axis: int = 0) -> 'Polygon': def mirror(self, axis: int) -> 'Polygon':
self.vertices[:, axis - 1] *= -1 self.vertices[:, axis - 1] *= -1
return self return self
@ -379,9 +376,8 @@ class Polygon(Shape):
def normalized_form(self, norm_value: float) -> normalized_shape_tuple: def normalized_form(self, norm_value: float) -> normalized_shape_tuple:
# Note: this function is going to be pretty slow for many-vertexed polygons, relative to # Note: this function is going to be pretty slow for many-vertexed polygons, relative to
# other shapes # other shapes
meanv = self.vertices.mean(axis=0) offset = self.vertices.mean(axis=0) + self.offset
zeroed_vertices = self.vertices - meanv zeroed_vertices = self.vertices - offset
offset = meanv + self.offset
scale = zeroed_vertices.std() scale = zeroed_vertices.std()
normed_vertices = zeroed_vertices / scale normed_vertices = zeroed_vertices / scale
@ -395,14 +391,14 @@ class Polygon(Shape):
x_min = rotated_vertices[:, 0].argmin() x_min = rotated_vertices[:, 0].argmin()
if not is_scalar(x_min): if not is_scalar(x_min):
y_min = rotated_vertices[x_min, 1].argmin() y_min = rotated_vertices[x_min, 1].argmin()
x_min = cast(Sequence, x_min)[y_min] x_min = x_min[y_min]
reordered_vertices = numpy.roll(rotated_vertices, -x_min, axis=0) reordered_vertices = numpy.roll(rotated_vertices, -x_min, axis=0)
# TODO: normalize mirroring? # TODO: normalize mirroring?
return ((type(self), reordered_vertices.data.tobytes()), return ((type(self), reordered_vertices.data.tobytes(), self.layer),
(offset, scale / norm_value, rotation, False), (offset, scale / norm_value, rotation, False, self.dose),
lambda: Polygon(reordered_vertices * norm_value)) lambda: Polygon(reordered_vertices * norm_value, layer=self.layer))
def clean_vertices(self) -> 'Polygon': def clean_vertices(self) -> 'Polygon':
""" """
@ -415,25 +411,37 @@ class Polygon(Shape):
return self return self
def remove_duplicate_vertices(self) -> 'Polygon': def remove_duplicate_vertices(self) -> 'Polygon':
""" '''
Removes all consecutive duplicate (repeated) vertices. Removes all consecutive duplicate (repeated) vertices.
Returns: Returns:
self self
""" '''
self.vertices = remove_duplicate_vertices(self.vertices, closed_path=True) self.vertices = remove_duplicate_vertices(self.vertices, closed_path=True)
return self return self
def remove_colinear_vertices(self) -> 'Polygon': def remove_colinear_vertices(self) -> 'Polygon':
""" '''
Removes consecutive co-linear vertices. Removes consecutive co-linear vertices.
Returns: Returns:
self self
""" '''
self.vertices = remove_colinear_vertices(self.vertices, closed_path=True) self.vertices = remove_colinear_vertices(self.vertices, closed_path=True)
return self return self
def lock(self) -> 'Polygon':
self.vertices.flags.writeable = False
Shape.lock(self)
return self
def unlock(self) -> 'Polygon':
Shape.unlock(self)
self.vertices.flags.writeable = True
return self
def __repr__(self) -> str: def __repr__(self) -> str:
centroid = self.offset + self.vertices.mean(axis=0) centroid = self.offset + self.vertices.mean(axis=0)
return f'<Polygon centroid {centroid} v{len(self.vertices)}>' dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<Polygon l{self.layer} centroid {centroid} v{len(self.vertices)}{dose}{locked}>'
@ -1,62 +1,57 @@
from typing import TYPE_CHECKING, Any from typing import List, Tuple, Callable, TypeVar, Optional, TYPE_CHECKING
from collections.abc import Callable
from abc import ABCMeta, abstractmethod from abc import ABCMeta, abstractmethod
import numpy import numpy
from numpy.typing import NDArray, ArrayLike from numpy.typing import NDArray, ArrayLike
from ..traits import ( from ..traits import (PositionableImpl, LayerableImpl, DoseableImpl,
Rotatable, Mirrorable, Copyable, Scalable, Rotatable, Mirrorable, Copyable, Scalable,
PositionableImpl, PivotableImpl, RepeatableImpl, AnnotatableImpl, PivotableImpl, LockableImpl, RepeatableImpl,
) AnnotatableImpl)
if TYPE_CHECKING: if TYPE_CHECKING:
from . import Polygon from . import Polygon
# Type definitions # Type definitions
normalized_shape_tuple = tuple[ normalized_shape_tuple = Tuple[Tuple,
tuple, Tuple[NDArray[numpy.float64], float, float, bool, float],
tuple[NDArray[numpy.float64], float, float, bool], Callable[[], 'Shape']]
Callable[[], 'Shape'],
]
# ## Module-wide defaults # ## Module-wide defaults
# Default number of points per polygon for shapes # Default number of points per polygon for shapes
DEFAULT_POLY_NUM_VERTICES = 24 DEFAULT_POLY_NUM_POINTS = 24
class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable, T = TypeVar('T', bound='Shape')
PivotableImpl, RepeatableImpl, AnnotatableImpl, metaclass=ABCMeta):
class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable, Copyable, Scalable,
PivotableImpl, RepeatableImpl, LockableImpl, AnnotatableImpl, metaclass=ABCMeta):
""" """
Class specifying functions common to all shapes. Abstract class specifying functions common to all shapes.
""" """
__slots__ = () # Children should use AutoSlots or set slots themselves __slots__ = () # Children should use AutoSlots
#def __copy__(self) -> Self: identifier: Tuple
# cls = self.__class__ """ An arbitrary identifier for the shape, usually empty but used by `Pattern.flatten()` """
# new = cls.__new__(cls)
# for name in self.__slots__: # type: str
# object.__setattr__(new, name, getattr(self, name))
# return new
# def __copy__(self) -> 'Shape':
# Methods (abstract) cls = self.__class__
# new = cls.__new__(cls)
@abstractmethod for name in self.__slots__: # type: str
def __eq__(self, other: Any) -> bool: object.__setattr__(new, name, getattr(self, name))
pass return new
@abstractmethod
def __lt__(self, other: 'Shape') -> bool:
pass
'''
--- Abstract methods
'''
@abstractmethod @abstractmethod
def to_polygons( def to_polygons(
self, self,
num_vertices: int | None = None, num_vertices: Optional[int] = None,
max_arclen: float | None = None, max_arclen: Optional[float] = None,
) -> list['Polygon']: ) -> List['Polygon']:
""" """
Returns a list of polygons which approximate the shape. Returns a list of polygons which approximate the shape.
@ -73,9 +68,9 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
pass pass
@abstractmethod @abstractmethod
def normalized_form(self, norm_value: int) -> normalized_shape_tuple: def normalized_form(self: T, norm_value: int) -> normalized_shape_tuple:
""" """
Writes the shape in a standardized notation, with offset, scale, and rotation Writes the shape in a standardized notation, with offset, scale, rotation, and dose
information separated out from the remaining values. information separated out from the remaining values.
Args: Args:
@ -90,20 +85,20 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
`(intrinsic, extrinsic, constructor)`. These are further broken down as: `(intrinsic, extrinsic, constructor)`. These are further broken down as:
`intrinsic`: A tuple of basic types containing all information about the instance that `intrinsic`: A tuple of basic types containing all information about the instance that
is not contained in 'extrinsic'. Usually, `intrinsic[0] == type(self)`. is not contained in 'extrinsic'. Usually, `intrinsic[0] == type(self)`.
`extrinsic`: `([x_offset, y_offset], scale, rotation, mirror_across_x_axis)` `extrinsic`: `([x_offset, y_offset], scale, rotation, mirror_across_x_axis, dose)`
`constructor`: A callable (no arguments) which returns an instance of `type(self)` with `constructor`: A callable (no arguments) which returns an instance of `type(self)` with
internal state equivalent to `intrinsic`. internal state equivalent to `intrinsic`.
""" """
pass pass
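To illustrate why the intrinsic/extrinsic split is useful, here is a hedged sketch of grouping shapes whose geometry is identical up to the extrinsic transform; `some_shapes` and the chosen `norm_value` are illustrative only:

```python
from collections import defaultdict

# Hypothetical sketch: bucket shapes by intrinsic form so that geometry which differs
# only by offset/scale/rotation (the extrinsic part) is stored once.
groups = defaultdict(list)
canonical = {}
for shape in some_shapes:  # assumed iterable of Shape objects
    intrinsic, extrinsic, constructor = shape.normalized_form(norm_value=1000)
    groups[intrinsic].append(extrinsic)
    canonical.setdefault(intrinsic, constructor)
```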
# '''
# Non-abstract methods ---- Non-abstract methods
# '''
def manhattanize_fast( def manhattanize_fast(
self, self,
grid_x: ArrayLike, grid_x: ArrayLike,
grid_y: ArrayLike, grid_y: ArrayLike,
) -> list['Polygon']: ) -> List['Polygon']:
""" """
Returns a list of polygons with grid-aligned ("Manhattan") edges approximating the shape. Returns a list of polygons with grid-aligned ("Manhattan") edges approximating the shape.
@ -127,7 +122,7 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
polygon_contours = [] polygon_contours = []
for polygon in self.to_polygons(): for polygon in self.to_polygons():
bounds = polygon.get_bounds_single() bounds = polygon.get_bounds()
if bounds is None: if bounds is None:
continue continue
@ -135,7 +130,7 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
vertex_lists = [] vertex_lists = []
p_verts = polygon.vertices + polygon.offset p_verts = polygon.vertices + polygon.offset
for v, v_next in zip(p_verts, numpy.roll(p_verts, -1, axis=0), strict=True): for v, v_next in zip(p_verts, numpy.roll(p_verts, -1, axis=0)):
dv = v_next - v dv = v_next - v
# Find x-index bounds for the line # TODO: fix this and err_xmin/xmax for grids smaller than the line / shape # Find x-index bounds for the line # TODO: fix this and err_xmin/xmax for grids smaller than the line / shape
@ -165,7 +160,7 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
m = dv[1] / dv[0] m = dv[1] / dv[0]
def get_grid_inds(xes: ArrayLike, m: float = m, v: NDArray = v) -> NDArray[numpy.float64]: def get_grid_inds(xes: ArrayLike) -> NDArray[numpy.float64]:
ys = m * (xes - v[0]) + v[1] ys = m * (xes - v[0]) + v[1]
# (inds - 1) is the index of the y-grid line below the edge's intersection with the x-grid # (inds - 1) is the index of the y-grid line below the edge's intersection with the x-grid
@ -180,14 +175,14 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
return inds return inds
# Find the y indices on all x gridlines # Find the y indices on all x gridlines
xs = gx[int(gxi_min):int(gxi_max)] xs = gx[gxi_min:gxi_max]
inds = get_grid_inds(xs) inds = get_grid_inds(xs)
# Find y-intersections for x-midpoints # Find y-intersections for x-midpoints
xs2 = (xs[:-1] + xs[1:]) / 2 xs2 = (xs[:-1] + xs[1:]) / 2
inds2 = get_grid_inds(xs2) inds2 = get_grid_inds(xs2)
xinds = numpy.rint(numpy.arange(gxi_min, gxi_max - 0.99, 1 / 3)).astype(numpy.int64) xinds = numpy.rint(numpy.arange(gxi_min, gxi_max - 0.99, 1 / 3), dtype=numpy.int64, casting='unsafe')
# interleave the results # interleave the results
yinds = xinds.copy() yinds = xinds.copy()
@ -202,7 +197,12 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
vertex_lists.append(vlist) vertex_lists.append(vlist)
polygon_contours.append(numpy.vstack(vertex_lists)) polygon_contours.append(numpy.vstack(vertex_lists))
manhattan_polygons = [Polygon(vertices=contour) for contour in polygon_contours] manhattan_polygons = []
for contour in polygon_contours:
manhattan_polygons.append(Polygon(
vertices=contour,
layer=self.layer,
dose=self.dose))
return manhattan_polygons return manhattan_polygons
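A hedged usage sketch for the fast grid-snapping helper above; the grid values are illustrative:

```python
import numpy
from masque.shapes import Polygon

# Hypothetical usage: stair-step a 10 x 5 rectangle onto a 1-unit grid covering [-50, 50]
shape = Polygon.rectangle(10, 5)
grid = numpy.arange(-50, 51, 1.0)
stairstepped = shape.manhattanize_fast(grid_x=grid, grid_y=grid)  # list of Polygon
```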
@ -210,7 +210,7 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
self, self,
grid_x: ArrayLike, grid_x: ArrayLike,
grid_y: ArrayLike, grid_y: ArrayLike,
) -> list['Polygon']: ) -> List['Polygon']:
""" """
Returns a list of polygons with grid-aligned ("Manhattan") edges approximating the shape. Returns a list of polygons with grid-aligned ("Manhattan") edges approximating the shape.
@ -259,19 +259,18 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
polygon_contours = [] polygon_contours = []
for polygon in self.to_polygons(): for polygon in self.to_polygons():
# Get rid of unused gridlines (anything not within 2 lines of the polygon bounds) # Get rid of unused gridlines (anything not within 2 lines of the polygon bounds)
bounds = polygon.get_bounds_single() bounds = polygon.get_bounds()
if bounds is None: if bounds is None:
continue continue
mins, maxs = bounds mins, maxs = bounds
keep_x = numpy.logical_and(grx > mins[0], grx < maxs[0]) keep_x = numpy.logical_and(grx > mins[0], grx < maxs[0])
keep_y = numpy.logical_and(gry > mins[1], gry < maxs[1]) keep_y = numpy.logical_and(gry > mins[1], gry < maxs[1])
# Flood left & rightwards by 2 cells for k in (keep_x, keep_y):
for kk in (keep_x, keep_y): for s in (1, 2):
for ss in (1, 2): k[s:] += k[:-s]
kk[ss:] += kk[:-ss] k[:-s] += k[s:]
kk[:-ss] += kk[ss:] k = k > 0
kk[:] = kk > 0
gx = grx[keep_x] gx = grx[keep_x]
gy = gry[keep_y] gy = gry[keep_y]
@ -294,10 +293,23 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
for contour in contours: for contour in contours:
# /2 deals with supersampling # /2 deals with supersampling
# +.5 deals with the fact that our 0-edge becomes -.5 in the super-sampled contour output # +.5 deals with the fact that our 0-edge becomes -.5 in the super-sampled contour output
snapped_contour = numpy.rint((contour + .5) / 2).astype(numpy.int64) snapped_contour = numpy.rint((contour + .5) / 2, dtype=numpy.int64, casting='unsafe')
vertices = numpy.hstack((grx[snapped_contour[:, None, 0] + offset_i[0]], vertices = numpy.hstack((grx[snapped_contour[:, None, 0] + offset_i[0]],
gry[snapped_contour[:, None, 1] + offset_i[1]])) gry[snapped_contour[:, None, 1] + offset_i[1]]))
manhattan_polygons.append(Polygon(vertices=vertices)) manhattan_polygons.append(Polygon(
vertices=vertices,
layer=self.layer,
dose=self.dose))
return manhattan_polygons return manhattan_polygons
def lock(self: T) -> T:
PositionableImpl._lock(self)
LockableImpl.lock(self)
return self
def unlock(self: T) -> T:
LockableImpl.unlock(self)
PositionableImpl._unlock(self)
return self
@ -1,37 +1,33 @@
from typing import Self, Any, cast from typing import List, Tuple, Dict, Sequence, Optional, Any
import copy import copy
import functools
import numpy import numpy
from numpy import pi, nan from numpy import pi, inf
from numpy.typing import NDArray, ArrayLike from numpy.typing import NDArray, ArrayLike
from . import Shape, Polygon, normalized_shape_tuple from . import Shape, Polygon, normalized_shape_tuple
from ..error import PatternError from .. import PatternError
from ..repetition import Repetition from ..repetition import Repetition
from ..traits import RotatableImpl from ..traits import RotatableImpl
from ..utils import is_scalar, get_bit, annotations_t, annotations_lt, annotations_eq, rep2key from ..utils import is_scalar, get_bit, normalize_mirror, layer_t, AutoSlots
from ..utils import annotations_t
from ..traits import LockableImpl
# Loaded on use: # Loaded on use:
# from freetype import Face # from freetype import Face
# from matplotlib.path import Path # from matplotlib.path import Path
@functools.total_ordering class Text(RotatableImpl, Shape, metaclass=AutoSlots):
class Text(RotatableImpl, Shape):
""" """
Text (to be printed e.g. as a set of polygons). Text (to be printed e.g. as a set of polygons).
This is distinct from non-printed Label objects. This is distinct from non-printed Label objects.
""" """
__slots__ = ( __slots__ = ('_string', '_height', '_mirrored', 'font_path')
'_string', '_height', '_mirrored', 'font_path',
# Inherited
'_offset', '_repetition', '_annotations', '_rotation',
)
_string: str _string: str
_height: float _height: float
_mirrored: bool _mirrored: NDArray[numpy.bool_]
font_path: str font_path: str
# vertices property # vertices property
@ -54,13 +50,16 @@ class Text(RotatableImpl, Shape):
raise PatternError('Height must be a scalar') raise PatternError('Height must be a scalar')
self._height = val self._height = val
# Mirrored property
@property @property
def mirrored(self) -> bool: # mypy#3004, should be bool def mirrored(self) -> Any: #TODO mypy#3004 NDArray[numpy.bool_]:
return self._mirrored return self._mirrored
@mirrored.setter @mirrored.setter
def mirrored(self, val: bool) -> None: def mirrored(self, val: Sequence[bool]) -> None:
self._mirrored = bool(val) if is_scalar(val):
raise PatternError('Mirrored must be a 2-element list of booleans')
self._mirrored = numpy.array(val, dtype=bool, copy=True)
def __init__( def __init__(
self, self,
@ -70,71 +69,56 @@ class Text(RotatableImpl, Shape):
*, *,
offset: ArrayLike = (0.0, 0.0), offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0, rotation: float = 0.0,
repetition: Repetition | None = None, mirrored: ArrayLike = (False, False),
annotations: annotations_t | None = None, layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
raw: bool = False, raw: bool = False,
) -> None: ) -> None:
LockableImpl.unlock(self)
self.identifier = ()
if raw: if raw:
assert isinstance(offset, numpy.ndarray) assert(isinstance(offset, numpy.ndarray))
assert(isinstance(mirrored, numpy.ndarray))
self._offset = offset self._offset = offset
self._layer = layer
self._dose = dose
self._string = string self._string = string
self._height = height self._height = height
self._rotation = rotation self._rotation = rotation
self._mirrored = mirrored
self._repetition = repetition self._repetition = repetition
self._annotations = annotations if annotations is not None else {} self._annotations = annotations if annotations is not None else {}
else: else:
self.offset = offset self.offset = offset
self.layer = layer
self.dose = dose
self.string = string self.string = string
self.height = height self.height = height
self.rotation = rotation self.rotation = rotation
self.mirrored = mirrored
self.repetition = repetition self.repetition = repetition
self.annotations = annotations if annotations is not None else {} self.annotations = annotations if annotations is not None else {}
self.font_path = font_path self.font_path = font_path
self.set_locked(locked)
def __deepcopy__(self, memo: dict | None = None) -> Self: def __deepcopy__(self, memo: Dict = None) -> 'Text':
memo = {} if memo is None else memo memo = {} if memo is None else memo
new = copy.copy(self) new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy() new._offset = self._offset.copy()
new._mirrored = copy.deepcopy(self._mirrored, memo)
new._annotations = copy.deepcopy(self._annotations) new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and self.string == other.string
and self.height == other.height
and self.font_path == other.font_path
and self.rotation == other.rotation
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Text, other)
if not self.height == other.height:
return self.height < other.height
if not self.string == other.string:
return self.string < other.string
if not self.font_path == other.font_path:
return self.font_path < other.font_path
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.rotation != other.rotation:
return self.rotation < other.rotation
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def to_polygons( def to_polygons(
self, self,
num_vertices: int | None = None, # unused # noqa: ARG002 poly_num_points: Optional[int] = None, # unused
max_arclen: float | None = None, # unused # noqa: ARG002 poly_max_arclen: Optional[float] = None, # unused
) -> list[Polygon]: ) -> List[Polygon]:
all_polygons = [] all_polygons = []
total_advance = 0.0 total_advance = 0.0
for char in self.string: for char in self.string:
@ -142,9 +126,8 @@ class Text(RotatableImpl, Shape):
# Move these polygons to the right of the previous letter # Move these polygons to the right of the previous letter
for xys in raw_polys: for xys in raw_polys:
poly = Polygon(xys) poly = Polygon(xys, dose=self.dose, layer=self.layer)
if self.mirrored: poly.mirror2d(self.mirrored)
poly.mirror()
poly.scale_by(self.height) poly.scale_by(self.height)
poly.offset = self.offset + [total_advance, 0] poly.offset = self.offset + [total_advance, 0]
poly.rotate_around(self.offset, self.rotation) poly.rotate_around(self.offset, self.rotation)
@ -155,53 +138,45 @@ class Text(RotatableImpl, Shape):
return all_polygons return all_polygons
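A minimal sketch of rendering a text shape into polygons; the font path is a placeholder and the argument names are taken from the constructor and docstrings above:

```python
# Hypothetical usage: 10-unit-tall text converted to polygons for later output
label = Text(string='A0', height=10.0, font_path='/path/to/font.ttf')
polys = label.to_polygons()
```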
def mirror(self, axis: int = 0) -> Self: def mirror(self, axis: int) -> 'Text':
self.mirrored = not self.mirrored self.mirrored[axis] = not self.mirrored[axis]
if axis == 1:
self.rotation += pi
return self return self
def scale_by(self, c: float) -> Self: def scale_by(self, c: float) -> 'Text':
self.height *= c self.height *= c
return self return self
def normalized_form(self, norm_value: float) -> normalized_shape_tuple: def normalized_form(self, norm_value: float) -> normalized_shape_tuple:
rotation = self.rotation % (2 * pi) mirror_x, rotation = normalize_mirror(self.mirrored)
return ((type(self), self.string, self.font_path), rotation += self.rotation
(self.offset, self.height / norm_value, rotation, bool(self.mirrored)), rotation %= 2 * pi
lambda: Text( return ((type(self), self.string, self.font_path, self.layer),
string=self.string, (self.offset, self.height / norm_value, rotation, mirror_x, self.dose),
lambda: Text(string=self.string,
height=self.height * norm_value, height=self.height * norm_value,
font_path=self.font_path, font_path=self.font_path,
rotation=rotation, rotation=rotation,
).mirror2d(across_x=self.mirrored), mirrored=(mirror_x, False),
) layer=self.layer))
def get_bounds_single(self) -> NDArray[numpy.float64]: def get_bounds(self) -> NDArray[numpy.float64]:
# rotation makes this a huge pain when using slot.advance and glyph.bbox(), so # rotation makes this a huge pain when using slot.advance and glyph.bbox(), so
# just convert to polygons instead # just convert to polygons instead
bounds = numpy.array([[+inf, +inf], [-inf, -inf]])
polys = self.to_polygons() polys = self.to_polygons()
pbounds = numpy.full((len(polys), 2, 2), nan) for poly in polys:
for pp, poly in enumerate(polys): poly_bounds = poly.get_bounds()
pbounds[pp] = poly.get_bounds_nonempty() bounds[0, :] = numpy.minimum(bounds[0, :], poly_bounds[0, :])
bounds = numpy.vstack(( bounds[1, :] = numpy.maximum(bounds[1, :], poly_bounds[1, :])
numpy.min(pbounds[:, 0, :], axis=0),
numpy.max(pbounds[:, 1, :], axis=0),
))
return bounds return bounds
def __repr__(self) -> str:
rotation = f'{numpy.rad2deg(self.rotation):g}' if self.rotation != 0 else ''
mirrored = ' m' if self.mirrored else ''
return f'<TextShape "{self.string}" o{self.offset} h{self.height:g}{rotation}{mirrored}>'
def get_char_as_polygons( def get_char_as_polygons(
font_path: str, font_path: str,
char: str, char: str,
resolution: float = 48 * 64, resolution: float = 48 * 64,
) -> tuple[list[list[list[float]]], float]: ) -> Tuple[List[List[List[float]]], float]:
from freetype import Face # type: ignore from freetype import Face # type: ignore
from matplotlib.path import Path # type: ignore from matplotlib.path import Path # type: ignore
@ -221,7 +196,7 @@ def get_char_as_polygons(
'advance' distance (distance from the start of this glyph to the start of the next one) 'advance' distance (distance from the start of this glyph to the start of the next one)
""" """
if len(char) != 1: if len(char) != 1:
raise PatternError('get_char_as_polygons called with non-char') raise Exception('get_char_as_polygons called with non-char')
face = Face(font_path) face = Face(font_path)
face.set_char_size(resolution) face.set_char_size(resolution)
@ -230,8 +205,7 @@ def get_char_as_polygons(
outline = slot.outline outline = slot.outline
start = 0 start = 0
all_verts_list = [] all_verts_list, all_codes = [], []
all_codes = []
for end in outline.contours: for end in outline.contours:
points = outline.points[start:end + 1] points = outline.points[start:end + 1]
points.append(points[0]) points.append(points[0])
@ -239,7 +213,7 @@ def get_char_as_polygons(
tags = outline.tags[start:end + 1] tags = outline.tags[start:end + 1]
tags.append(tags[0]) tags.append(tags[0])
segments: list[list[list[float]]] = [] segments: List[List[List[float]]] = []
for j, point in enumerate(points): for j, point in enumerate(points):
# If we already have a segment, add this point to it # If we already have a segment, add this point to it
if j > 0: if j > 0:
@ -284,3 +258,20 @@ def get_char_as_polygons(
polygons = path.to_polygons() polygons = path.to_polygons()
return polygons, advance return polygons, advance
def lock(self) -> 'Text':
self.mirrored.flags.writeable = False
Shape.lock(self)
return self
def unlock(self) -> 'Text':
Shape.unlock(self)
self.mirrored.flags.writeable = True
return self
def __repr__(self) -> str:
rotation = f'{self.rotation*180/pi:g}' if self.rotation != 0 else ''
dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
mirrored = ' m{:d}{:d}'.format(*self.mirrored) if self.mirrored.any() else ''
return f'<TextShape "{self.string}" l{self.layer} o{self.offset} h{self.height:g}{rotation}{mirrored}{dose}{locked}>'
masque/subpattern.py (new file, 248 lines)
@ -0,0 +1,248 @@
"""
SubPattern provides basic support for nesting Pattern objects within each other, by adding
offset, rotation, scaling, and other such properties to the reference.
"""
#TODO more top-level documentation
from typing import Dict, Tuple, Optional, Sequence, TYPE_CHECKING, Any, TypeVar
import copy
import numpy
from numpy import pi
from numpy.typing import NDArray, ArrayLike
from .error import PatternError
from .utils import is_scalar, AutoSlots, annotations_t
from .repetition import Repetition
from .traits import (PositionableImpl, DoseableImpl, RotatableImpl, ScalableImpl,
Mirrorable, PivotableImpl, Copyable, LockableImpl, RepeatableImpl,
AnnotatableImpl)
if TYPE_CHECKING:
from . import Pattern
S = TypeVar('S', bound='SubPattern')
class SubPattern(PositionableImpl, DoseableImpl, RotatableImpl, ScalableImpl, Mirrorable,
PivotableImpl, Copyable, RepeatableImpl, LockableImpl, AnnotatableImpl,
metaclass=AutoSlots):
"""
SubPattern provides basic support for nesting Pattern objects within each other, by adding
offset, rotation, scaling, and associated methods.
"""
__slots__ = ('_pattern',
'_mirrored',
'identifier',
)
_pattern: Optional['Pattern']
""" The `Pattern` being instanced """
_mirrored: NDArray[numpy.bool_]
""" Whether to mirror the instance across the x and/or y axes. """
identifier: Tuple[Any, ...]
""" Arbitrary identifier, used internally by some `masque` functions. """
def __init__(
self,
pattern: Optional['Pattern'],
*,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0,
mirrored: Optional[Sequence[bool]] = None,
dose: float = 1.0,
scale: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
identifier: Tuple[Any, ...] = (),
) -> None:
"""
Args:
pattern: Pattern to reference.
offset: (x, y) offset applied to the referenced pattern. Not affected by rotation etc.
rotation: Rotation (radians, counterclockwise) relative to the referenced pattern's (0, 0).
mirrored: Whether to mirror the referenced pattern across its x and y axes.
dose: Scaling factor applied to the dose.
scale: Scaling factor applied to the pattern's geometry.
repetition: TODO
locked: Whether the `SubPattern` is locked after initialization.
identifier: Arbitrary tuple, used internally by some `masque` functions.
"""
LockableImpl.unlock(self)
self.identifier = identifier
self.pattern = pattern
self.offset = offset
self.rotation = rotation
self.dose = dose
self.scale = scale
if mirrored is None:
mirrored = (False, False)
self.mirrored = mirrored
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.set_locked(locked)
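A hedged sketch of instancing one pattern inside another with this class; `parent` and `child` are assumed `Pattern` objects and the `subpatterns` list attribute is assumed from the surrounding API:

```python
from numpy import pi

# Hypothetical usage: place `child` at (100, 0), rotated 90 degrees, with doubled dose
inst = SubPattern(child, offset=(100, 0), rotation=pi / 2, dose=2.0)
parent.subpatterns.append(inst)
```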
def __copy__(self) -> 'SubPattern':
new = SubPattern(pattern=self.pattern,
offset=self.offset.copy(),
rotation=self.rotation,
dose=self.dose,
scale=self.scale,
mirrored=self.mirrored.copy(),
repetition=copy.deepcopy(self.repetition),
annotations=copy.deepcopy(self.annotations),
locked=self.locked)
return new
def __deepcopy__(self, memo: Dict = None) -> 'SubPattern':
memo = {} if memo is None else memo
new = copy.copy(self)
LockableImpl.unlock(new)
new.pattern = copy.deepcopy(self.pattern, memo)
new.repetition = copy.deepcopy(self.repetition, memo)
new.annotations = copy.deepcopy(self.annotations, memo)
new.set_locked(self.locked)
return new
# pattern property
@property
def pattern(self) -> Optional['Pattern']:
return self._pattern
@pattern.setter
def pattern(self, val: Optional['Pattern']) -> None:
from .pattern import Pattern
if val is not None and not isinstance(val, Pattern):
raise PatternError(f'Provided pattern {val} is not a Pattern object or None!')
self._pattern = val
# Mirrored property
@property
def mirrored(self) -> Any: #TODO mypy#3004 NDArray[numpy.bool_]:
return self._mirrored
@mirrored.setter
def mirrored(self, val: ArrayLike) -> None:
if is_scalar(val):
raise PatternError('Mirrored must be a 2-element list of booleans')
self._mirrored = numpy.array(val, dtype=bool, copy=True)
def as_pattern(self) -> 'Pattern':
"""
Returns:
A copy of self.pattern which has been scaled, rotated, etc. according to this
`SubPattern`'s properties.
"""
assert(self.pattern is not None)
pattern = self.pattern.deepcopy().deepunlock()
if self.scale != 1:
pattern.scale_by(self.scale)
if numpy.any(self.mirrored):
pattern.mirror2d(self.mirrored)
if self.rotation % (2 * pi) != 0:
pattern.rotate_around((0.0, 0.0), self.rotation)
if numpy.any(self.offset):
pattern.translate_elements(self.offset)
if self.dose != 1:
pattern.scale_element_doses(self.dose)
if self.repetition is not None:
combined = type(pattern)(name='__repetition__')
for dd in self.repetition.displacements:
temp_pat = pattern.deepcopy()
temp_pat.translate_elements(dd)
combined.append(temp_pat)
pattern = combined
return pattern
def rotate(self: S, rotation: float) -> S:
self.rotation += rotation
if self.repetition is not None:
self.repetition.rotate(rotation)
return self
def mirror(self: S, axis: int) -> S:
self.mirrored[axis] = not self.mirrored[axis]
self.rotation *= -1
if self.repetition is not None:
self.repetition.mirror(axis)
return self
def get_bounds(self) -> Optional[NDArray[numpy.float64]]:
"""
Return a `numpy.ndarray` containing `[[x_min, y_min], [x_max, y_max]]`, corresponding to the
extent of the `SubPattern` in each dimension.
Returns `None` if the contained `Pattern` is empty.
Returns:
`[[x_min, y_min], [x_max, y_max]]` or `None`
"""
if self.pattern is None:
return None
return self.as_pattern().get_bounds()
def lock(self: S) -> S:
"""
Lock the SubPattern, disallowing changes
Returns:
self
"""
self.mirrored.flags.writeable = False
PositionableImpl._lock(self)
LockableImpl.lock(self)
return self
def unlock(self: S) -> S:
"""
Unlock the SubPattern
Returns:
self
"""
LockableImpl.unlock(self)
PositionableImpl._unlock(self)
self.mirrored.flags.writeable = True
return self
def deeplock(self: S) -> S:
"""
Recursively lock the SubPattern and its contained pattern
Returns:
self
"""
assert(self.pattern is not None)
self.lock()
self.pattern.deeplock()
return self
def deepunlock(self: S) -> S:
"""
Recursively unlock the SubPattern and its contained pattern
This is dangerous unless you have just performed a deepcopy, since
the subpattern and its components may be used in more than one place!
Returns:
self
"""
assert(self.pattern is not None)
self.unlock()
self.pattern.deepunlock()
return self
def __repr__(self) -> str:
name = self.pattern.name if self.pattern is not None else None
rotation = f' r{self.rotation*180/pi:g}' if self.rotation != 0 else ''
scale = f' d{self.scale:g}' if self.scale != 1 else ''
mirrored = ' m{:d}{:d}'.format(*self.mirrored) if self.mirrored.any() else ''
dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<SubPattern "{name}" at {self.offset}{rotation}{scale}{mirrored}{dose}{locked}>'
@ -1,34 +1,13 @@
""" """
Traits (mixins) and default implementations Traits (mixins) and default implementations
Traits and mixins should set `__slots__ = ()` to enable use of `__slots__` in subclasses.
""" """
from .positionable import ( from .positionable import Positionable, PositionableImpl
Positionable as Positionable, from .layerable import Layerable, LayerableImpl
PositionableImpl as PositionableImpl, from .doseable import Doseable, DoseableImpl
Bounded as Bounded, from .rotatable import Rotatable, RotatableImpl, Pivotable, PivotableImpl
) from .repeatable import Repeatable, RepeatableImpl
from .layerable import ( from .scalable import Scalable, ScalableImpl
Layerable as Layerable, from .mirrorable import Mirrorable
LayerableImpl as LayerableImpl, from .copyable import Copyable
) from .lockable import Lockable, LockableImpl
from .rotatable import ( from .annotatable import Annotatable, AnnotatableImpl
Rotatable as Rotatable,
RotatableImpl as RotatableImpl,
Pivotable as Pivotable,
PivotableImpl as PivotableImpl,
)
from .repeatable import (
Repeatable as Repeatable,
RepeatableImpl as RepeatableImpl,
)
from .scalable import (
Scalable as Scalable,
ScalableImpl as ScalableImpl,
)
from .mirrorable import Mirrorable as Mirrorable
from .copyable import Copyable as Copyable
from .annotatable import (
Annotatable as Annotatable,
AnnotatableImpl as AnnotatableImpl,
)
@ -1,3 +1,4 @@
from typing import TypeVar
#from types import MappingProxyType #from types import MappingProxyType
from abc import ABCMeta, abstractmethod from abc import ABCMeta, abstractmethod
@ -5,19 +6,20 @@ from ..utils import annotations_t
from ..error import MasqueError from ..error import MasqueError
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass T = TypeVar('T', bound='Annotatable')
I = TypeVar('I', bound='AnnotatableImpl')
class Annotatable(metaclass=ABCMeta): class Annotatable(metaclass=ABCMeta):
""" """
Trait class for all annotatable entities Abstract class for all annotatable entities
Annotations correspond to GDS/OASIS "properties" Annotations correspond to GDS/OASIS "properties"
""" """
__slots__ = () __slots__ = ()
# '''
# Properties ---- Properties
# '''
@property @property
@abstractmethod @abstractmethod
def annotations(self) -> annotations_t: def annotations(self) -> annotations_t:
@ -31,20 +33,23 @@ class AnnotatableImpl(Annotatable, metaclass=ABCMeta):
""" """
Simple implementation of `Annotatable`. Simple implementation of `Annotatable`.
""" """
__slots__ = _empty_slots __slots__ = ()
_annotations: annotations_t _annotations: annotations_t
""" Dictionary storing annotation name/value pairs """ """ Dictionary storing annotation name/value pairs """
# '''
# Non-abstract properties ---- Non-abstract properties
# '''
@property @property
def annotations(self) -> annotations_t: def annotations(self) -> annotations_t:
return self._annotations return self._annotations
# # TODO: Find a way to make sure the subclass implements Lockable without dealing with diamond inheritance or this extra hasattr
# if hasattr(self, 'is_locked') and self.is_locked():
# return MappingProxyType(self._annotations)
@annotations.setter @annotations.setter
def annotations(self, annotations: annotations_t) -> None: def annotations(self, annotations: annotations_t):
if not isinstance(annotations, dict): if not isinstance(annotations, dict):
raise MasqueError(f'annotations expected dict, got {type(annotations)}') raise MasqueError(f'annotations expected dict, got {type(annotations)}')
self._annotations = annotations self._annotations = annotations
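A tiny hedged example of the setter above; the value types follow the GDS/OASIS property model and are assumptions here:

```python
# Hypothetical usage: attach a named property to any annotatable object
item.annotations = {'note': ['fabricated 2023-01']}
```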
@ -1,17 +1,21 @@
from typing import Self from typing import TypeVar
from abc import ABCMeta
import copy import copy
class Copyable: T = TypeVar('T', bound='Copyable')
class Copyable(metaclass=ABCMeta):
""" """
Trait class which adds .copy() and .deepcopy() Abstract class which adds .copy() and .deepcopy()
""" """
__slots__ = () __slots__ = ()
# '''
# Non-abstract methods ---- Non-abstract methods
# '''
def copy(self) -> Self: def copy(self: T) -> T:
""" """
Return a shallow copy of the object. Return a shallow copy of the object.
@ -20,7 +24,7 @@ class Copyable:
""" """
return copy.copy(self) return copy.copy(self)
def deepcopy(self) -> Self: def deepcopy(self: T) -> T:
""" """
Return a deep copy of the object. Return a deep copy of the object.
masque/traits/doseable.py (new file, 76 lines)
@ -0,0 +1,76 @@
from typing import TypeVar
from abc import ABCMeta, abstractmethod
from ..error import MasqueError
T = TypeVar('T', bound='Doseable')
I = TypeVar('I', bound='DoseableImpl')
class Doseable(metaclass=ABCMeta):
"""
Abstract class for all doseable entities
"""
__slots__ = ()
'''
---- Properties
'''
@property
@abstractmethod
def dose(self) -> float:
"""
Dose (float >= 0)
"""
pass
# @dose.setter
# @abstractmethod
# def dose(self, val: float):
# pass
'''
---- Methods
'''
@abstractmethod
def set_dose(self: T, dose: float) -> T:
"""
Set the dose
Args:
dose: new value for dose
Returns:
self
"""
pass
class DoseableImpl(Doseable, metaclass=ABCMeta):
"""
Simple implementation of Doseable
"""
__slots__ = ()
_dose: float
""" Dose """
'''
---- Non-abstract properties
'''
@property
def dose(self) -> float:
return self._dose
@dose.setter
def dose(self, val: float):
if not val >= 0:
raise MasqueError('Dose must be non-negative')
self._dose = val
'''
---- Non-abstract methods
'''
def set_dose(self: I, dose: float) -> I:
self.dose = dose
return self
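For context, a small hedged sketch of the chaining style these setters enable, assuming an object that mixes in both `DoseableImpl` and `PositionableImpl`:

```python
# Hypothetical usage: adjust dose and position in one chained expression
shape.set_dose(1.5).set_offset((10, 20))
```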
@ -1,21 +1,21 @@
from typing import Self from typing import TypeVar
from abc import ABCMeta, abstractmethod from abc import ABCMeta, abstractmethod
from ..utils import layer_t from ..utils import layer_t
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass T = TypeVar('T', bound='Layerable')
I = TypeVar('I', bound='LayerableImpl')
class Layerable(metaclass=ABCMeta): class Layerable(metaclass=ABCMeta):
""" """
Trait class for all layerable entities Abstract class for all layerable entities
""" """
__slots__ = () __slots__ = ()
'''
# ---- Properties
# Properties '''
#
@property @property
@abstractmethod @abstractmethod
def layer(self) -> layer_t: def layer(self) -> layer_t:
@ -29,11 +29,10 @@ class Layerable(metaclass=ABCMeta):
# def layer(self, val: layer_t): # def layer(self, val: layer_t):
# pass # pass
# '''
# Methods ---- Methods
# '''
@abstractmethod def set_layer(self: T, layer: layer_t) -> T:
def set_layer(self, layer: layer_t) -> Self:
""" """
Set the layer Set the layer
@ -50,25 +49,25 @@ class LayerableImpl(Layerable, metaclass=ABCMeta):
""" """
Simple implementation of Layerable Simple implementation of Layerable
""" """
__slots__ = _empty_slots __slots__ = ()
_layer: layer_t _layer: layer_t
""" Layer number, pair, or name """ """ Layer number, pair, or name """
# '''
# Non-abstract properties ---- Non-abstract properties
# '''
@property @property
def layer(self) -> layer_t: def layer(self) -> layer_t:
return self._layer return self._layer
@layer.setter @layer.setter
def layer(self, val: layer_t) -> None: def layer(self, val: layer_t):
self._layer = val self._layer = val
# '''
# Non-abstract methods ---- Non-abstract methods
# '''
def set_layer(self, layer: layer_t) -> Self: def set_layer(self: I, layer: layer_t) -> I:
self.layer = layer self.layer = layer
return self return self
masque/traits/lockable.py (new file, 103 lines)
@ -0,0 +1,103 @@
from typing import TypeVar, Dict, Tuple, Any
from abc import ABCMeta, abstractmethod
from ..error import PatternLockedError
T = TypeVar('T', bound='Lockable')
I = TypeVar('I', bound='LockableImpl')
class Lockable(metaclass=ABCMeta):
"""
Abstract class for all lockable entities
"""
__slots__ = () # type: Tuple[str, ...]
'''
---- Methods
'''
@abstractmethod
def lock(self: T) -> T:
"""
Lock the object, disallowing further changes
Returns:
self
"""
pass
@abstractmethod
def unlock(self: T) -> T:
"""
Unlock the object, reallowing changes
Returns:
self
"""
pass
@abstractmethod
def is_locked(self) -> bool:
"""
Returns:
True if the object is locked
"""
pass
def set_locked(self: T, locked: bool) -> T:
"""
Locks or unlocks based on the argument.
No action if already in the requested state.
Args:
locked: State to set.
Returns:
self
"""
if locked != self.is_locked():
if locked:
self.lock()
else:
self.unlock()
return self
class LockableImpl(Lockable, metaclass=ABCMeta):
"""
Simple implementation of Lockable
"""
__slots__ = () # type: Tuple[str, ...]
locked: bool
""" If `True`, disallows changes to the object """
'''
---- Non-abstract methods
'''
def __setattr__(self, name, value):
if self.locked and name != 'locked':
raise PatternLockedError()
object.__setattr__(self, name, value)
def __getstate__(self) -> Dict[str, Any]:
if hasattr(self, '__slots__'):
return {key: getattr(self, key) for key in self.__slots__}
else:
return self.__dict__
def __setstate__(self, state: Dict[str, Any]) -> None:
for k, v in state.items():
object.__setattr__(self, k, v)
def lock(self: I) -> I:
object.__setattr__(self, 'locked', True)
return self
def unlock(self: I) -> I:
object.__setattr__(self, 'locked', False)
return self
def is_locked(self) -> bool:
return self.locked
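A brief hedged sketch of the lock/unlock workflow implemented above; `obj` is assumed to be any entity using `LockableImpl`:

```python
from masque.error import PatternLockedError  # assumed import path, matching `..error` above

# Hypothetical usage: freeze an object, show that mutation is rejected, then unfreeze it
obj.lock()
try:
    obj.offset = (1, 2)      # __setattr__ raises while locked
except PatternLockedError:
    pass
obj.set_locked(False)        # equivalent to obj.unlock() here
```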
@ -1,15 +1,22 @@
from typing import Self from typing import TypeVar, Tuple
from abc import ABCMeta, abstractmethod from abc import ABCMeta, abstractmethod
T = TypeVar('T', bound='Mirrorable')
#I = TypeVar('I', bound='MirrorableImpl')
class Mirrorable(metaclass=ABCMeta): class Mirrorable(metaclass=ABCMeta):
""" """
Trait class for all mirrorable entities Abstract class for all mirrorable entities
""" """
__slots__ = () __slots__ = ()
'''
---- Abstract methods
'''
@abstractmethod @abstractmethod
def mirror(self, axis: int = 0) -> Self: def mirror(self: T, axis: int) -> T:
""" """
Mirror the entity across an axis. Mirror the entity across an axis.
@ -21,7 +28,7 @@ class Mirrorable(metaclass=ABCMeta):
""" """
pass pass
def mirror2d(self, across_x: bool = False, across_y: bool = False) -> Self: def mirror2d(self: T, axes: Tuple[bool, bool]) -> T:
""" """
Optionally mirror the entity across both axes Optionally mirror the entity across both axes
@ -31,9 +38,9 @@ class Mirrorable(metaclass=ABCMeta):
Returns: Returns:
self self
""" """
if across_x: if axes[0]:
self.mirror(0) self.mirror(0)
if across_y: if axes[1]:
self.mirror(1) self.mirror(1)
return self return self
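A hedged one-liner showing both calling conventions visible in this hunk; `shape` is assumed to be any `Mirrorable`:

```python
# Newer keyword form (left column): flip across the x axis only
shape.mirror2d(across_x=True)
# Older tuple form (right column) would instead be written as:
# shape.mirror2d((True, False))
```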
@ -44,24 +51,24 @@ class Mirrorable(metaclass=ABCMeta):
# """ # """
# __slots__ = () # __slots__ = ()
# #
# _mirrored: NDArray[numpy.bool] # _mirrored: numpy.ndarray # ndarray[bool]
# """ Whether to mirror the instance across the x and/or y axes. """ # """ Whether to mirror the instance across the x and/or y axes. """
# #
# # # '''
# # Properties # ---- Properties
# # # '''
# # Mirrored property # # Mirrored property
# @property # @property
# def mirrored(self) -> NDArray[numpy.bool]: # def mirrored(self) -> numpy.ndarray: # ndarray[bool]
# """ Whether to mirror across the [x, y] axes, respectively """ # """ Whether to mirror across the [x, y] axes, respectively """
# return self._mirrored # return self._mirrored
# #
# @mirrored.setter # @mirrored.setter
# def mirrored(self, val: Sequence[bool]) -> None: # def mirrored(self, val: Sequence[bool]):
# if is_scalar(val): # if is_scalar(val):
# raise MasqueError('Mirrored must be a 2-element list of booleans') # raise MasqueError('Mirrored must be a 2-element list of booleans')
# self._mirrored = numpy.array(val, dtype=bool) # self._mirrored = numpy.array(val, dtype=bool, copy=True)
# #
# # # '''
# # Methods # ---- Methods
# # # '''
@ -1,4 +1,6 @@
from typing import Self, Any # TODO top-level comment about how traits should set __slots__ = (), and how to use AutoSlots
from typing import TypeVar, Any, Optional
from abc import ABCMeta, abstractmethod from abc import ABCMeta, abstractmethod
import numpy import numpy
@ -7,18 +9,19 @@ from numpy.typing import NDArray, ArrayLike
from ..error import MasqueError from ..error import MasqueError
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass T = TypeVar('T', bound='Positionable')
I = TypeVar('I', bound='PositionableImpl')
class Positionable(metaclass=ABCMeta): class Positionable(metaclass=ABCMeta):
""" """
Trait class for all positionable entities Abstract class for all positionable entities
""" """
__slots__ = () __slots__ = ()
# '''
# Properties ---- Abstract properties
# '''
@property @property
@abstractmethod @abstractmethod
def offset(self) -> NDArray[numpy.float64]: def offset(self) -> NDArray[numpy.float64]:
@ -27,13 +30,13 @@ class Positionable(metaclass=ABCMeta):
""" """
pass pass
@offset.setter # @offset.setter
@abstractmethod # @abstractmethod
def offset(self, val: ArrayLike) -> None: # def offset(self, val: ArrayLike):
pass # pass
@abstractmethod @abstractmethod
def set_offset(self, offset: ArrayLike) -> Self: def set_offset(self: T, offset: ArrayLike) -> T:
""" """
Set the offset Set the offset
@ -46,7 +49,7 @@ class Positionable(metaclass=ABCMeta):
pass pass
@abstractmethod @abstractmethod
def translate(self, offset: ArrayLike) -> Self: def translate(self: T, offset: ArrayLike) -> T:
""" """
Translate the entity by the given offset Translate the entity by the given offset
@ -58,22 +61,41 @@ class Positionable(metaclass=ABCMeta):
""" """
pass pass
@abstractmethod
def get_bounds(self) -> Optional[NDArray[numpy.float64]]:
"""
Returns `[[x_min, y_min], [x_max, y_max]]` which specify a minimal bounding box for the entity.
Returns `None` for an empty entity.
"""
pass
def get_bounds_nonempty(self) -> NDArray[numpy.float64]:
"""
Returns `[[x_min, y_min], [x_max, y_max]]` which specify a minimal bounding box for the entity.
Asserts that the entity is non-empty (i.e., `get_bounds()` does not return None).
This is handy for destructuring like `xy_min, xy_max = entity.get_bounds_nonempty()`
"""
bounds = self.get_bounds()
assert(bounds is not None)
return bounds
class PositionableImpl(Positionable, metaclass=ABCMeta): class PositionableImpl(Positionable, metaclass=ABCMeta):
""" """
Simple implementation of Positionable Simple implementation of Positionable
""" """
__slots__ = _empty_slots __slots__ = ()
_offset: NDArray[numpy.float64] _offset: NDArray[numpy.float64]
""" `[x_offset, y_offset]` """ """ `[x_offset, y_offset]` """
# '''
# Properties ---- Properties
# '''
# offset property # offset property
@property @property
def offset(self) -> Any: # mypy#3004 NDArray[numpy.float64]: def offset(self) -> Any: #TODO mypy#3003 NDArray[numpy.float64]:
""" """
[x, y] offset [x, y] offset
""" """
@ -81,42 +103,40 @@ class PositionableImpl(Positionable, metaclass=ABCMeta):
@offset.setter @offset.setter
def offset(self, val: ArrayLike) -> None: def offset(self, val: ArrayLike) -> None:
if not isinstance(val, numpy.ndarray) or val.dtype != numpy.float64:
val = numpy.array(val, dtype=float) val = numpy.array(val, dtype=float)
if val.size != 2: if val.size != 2:
raise MasqueError('Offset must be convertible to size-2 ndarray') raise MasqueError('Offset must be convertible to size-2 ndarray')
self._offset = val.flatten() self._offset = val.flatten()
# '''
# Methods ---- Methods
# '''
def set_offset(self, offset: ArrayLike) -> Self: def set_offset(self: I, offset: ArrayLike) -> I:
self.offset = offset self.offset = offset
return self return self
def translate(self, offset: ArrayLike) -> Self: def translate(self: I, offset: ArrayLike) -> I:
self._offset += offset # type: ignore # NDArray += ArrayLike should be fine?? self._offset += offset # type: ignore # NDArray += ArrayLike should be fine??
return self return self
def _lock(self: I) -> I:
class Bounded(metaclass=ABCMeta):
@abstractmethod
def get_bounds(self, *args, **kwargs) -> NDArray[numpy.float64] | None:
""" """
Returns `[[x_min, y_min], [x_max, y_max]]` which specify a minimal bounding box for the entity. Lock the entity, disallowing further changes
Returns `None` for an empty entity.
Returns:
self
""" """
pass self._offset.flags.writeable = False
return self
def get_bounds_nonempty(self, *args, **kwargs) -> NDArray[numpy.float64]: def _unlock(self: I) -> I:
""" """
Returns `[[x_min, y_min], [x_max, y_max]]` which specify a minimal bounding box for the entity. Unlock the entity
Asserts that the entity is non-empty (i.e., `get_bounds()` does not return None).
This is handy for destructuring like `xy_min, xy_max = entity.get_bounds_nonempty()` Returns:
self
""" """
bounds = self.get_bounds(*args, **kwargs) self._offset.flags.writeable = True
assert bounds is not None return self
return bounds
@ -1,32 +1,29 @@
from typing import Self, TYPE_CHECKING from typing import TypeVar, Optional, TYPE_CHECKING
from abc import ABCMeta, abstractmethod from abc import ABCMeta, abstractmethod
import numpy
from numpy.typing import NDArray
from ..error import MasqueError from ..error import MasqueError
from .positionable import Bounded
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass
if TYPE_CHECKING: if TYPE_CHECKING:
from ..repetition import Repetition from ..repetition import Repetition
T = TypeVar('T', bound='Repeatable')
I = TypeVar('I', bound='RepeatableImpl')
class Repeatable(metaclass=ABCMeta): class Repeatable(metaclass=ABCMeta):
""" """
Trait class for all repeatable entities Abstract class for all repeatable entities
""" """
__slots__ = () __slots__ = ()
# '''
# Properties ---- Properties
# '''
@property @property
@abstractmethod @abstractmethod
def repetition(self) -> 'Repetition | None': def repetition(self) -> Optional['Repetition']:
""" """
Repetition object, or None (single instance only) Repetition object, or None (single instance only)
""" """
@ -34,14 +31,14 @@ class Repeatable(metaclass=ABCMeta):
# @repetition.setter # @repetition.setter
# @abstractmethod # @abstractmethod
# def repetition(self, repetition: 'Repetition | None') -> None: # def repetition(self, repetition: Optional['Repetition']):
# pass # pass
# '''
# Methods ---- Methods
# '''
@abstractmethod @abstractmethod
def set_repetition(self, repetition: 'Repetition | None') -> Self: def set_repetition(self: T, repetition: Optional['Repetition']) -> T:
""" """
Set the repetition Set the repetition
@ -54,57 +51,32 @@ class Repeatable(metaclass=ABCMeta):
pass pass
class RepeatableImpl(Repeatable, Bounded, metaclass=ABCMeta): class RepeatableImpl(Repeatable, metaclass=ABCMeta):
""" """
Simple implementation of `Repeatable` and extension of `Bounded` to include repetition bounds. Simple implementation of `Repeatable`
""" """
__slots__ = _empty_slots __slots__ = ()
_repetition: 'Repetition | None' _repetition: Optional['Repetition']
""" Repetition object, or None (single instance only) """ """ Repetition object, or None (single instance only) """
@abstractmethod '''
def get_bounds_single(self, *args, **kwargs) -> NDArray[numpy.float64] | None: ---- Non-abstract properties
pass '''
#
# Non-abstract properties
#
@property @property
def repetition(self) -> 'Repetition | None': def repetition(self) -> Optional['Repetition']:
return self._repetition return self._repetition
@repetition.setter @repetition.setter
def repetition(self, repetition: 'Repetition | None') -> None: def repetition(self, repetition: Optional['Repetition']):
from ..repetition import Repetition from ..repetition import Repetition
if repetition is not None and not isinstance(repetition, Repetition): if repetition is not None and not isinstance(repetition, Repetition):
raise MasqueError(f'{repetition} is not a valid Repetition object!') raise MasqueError(f'{repetition} is not a valid Repetition object!')
self._repetition = repetition self._repetition = repetition
# '''
# Non-abstract methods ---- Non-abstract methods
# '''
def set_repetition(self, repetition: 'Repetition | None') -> Self: def set_repetition(self: I, repetition: Optional['Repetition']) -> I:
self.repetition = repetition self.repetition = repetition
return self return self
def get_bounds_single_nonempty(self, *args, **kwargs) -> NDArray[numpy.float64]:
"""
Returns `[[x_min, y_min], [x_max, y_max]]` which specify a minimal bounding box for the entity.
Asserts that the entity is non-empty (i.e., `get_bounds()` does not return None).
This is handy for destructuring like `xy_min, xy_max = entity.get_bounds_nonempty()`
"""
bounds = self.get_bounds_single(*args, **kwargs)
assert bounds is not None
return bounds
def get_bounds(self, *args, **kwargs) -> NDArray[numpy.float64] | None:
bounds = self.get_bounds_single(*args, **kwargs)
if bounds is not None and self.repetition is not None:
rep_bounds = self.repetition.get_bounds()
if rep_bounds is None:
return None
bounds += rep_bounds
return bounds
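A hedged sketch of the effect of the repetition-aware bounds above; `square` and the repetition's displacements are assumed:

```python
# Hypothetical sketch: a unit square at the origin, repeated with displacements spanning x = 0..4
single = square.get_bounds_single()   # e.g. [[0, 0], [1, 1]]
square.repetition = some_repetition   # assumed Repetition whose displacements span [0..4, 0]
full = square.get_bounds()            # -> [[0, 0], [5, 1]]
```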
@ -1,29 +1,31 @@
from typing import Self, cast, Any from typing import TypeVar
from abc import ABCMeta, abstractmethod from abc import ABCMeta, abstractmethod
import numpy import numpy
from numpy import pi from numpy import pi
from numpy.typing import ArrayLike from numpy.typing import ArrayLike, NDArray
from .positionable import Positionable #from .positionable import Positionable
from ..error import MasqueError from ..error import MasqueError
from ..utils import rotation_matrix_2d from ..utils import is_scalar, rotation_matrix_2d
T = TypeVar('T', bound='Rotatable')
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass I = TypeVar('I', bound='RotatableImpl')
P = TypeVar('P', bound='Pivotable')
J = TypeVar('J', bound='PivotableImpl')
class Rotatable(metaclass=ABCMeta): class Rotatable(metaclass=ABCMeta):
""" """
Trait class for all rotatable entities Abstract class for all rotatable entities
""" """
__slots__ = () __slots__ = ()
# '''
# Methods ---- Abstract methods
# '''
@abstractmethod @abstractmethod
def rotate(self, val: float) -> Self: def rotate(self: T, val: float) -> T:
""" """
Rotate the shape around its origin (0, 0), ignoring its offset. Rotate the shape around its origin (0, 0), ignoring its offset.
@ -40,33 +42,33 @@ class RotatableImpl(Rotatable, metaclass=ABCMeta):
""" """
Simple implementation of `Rotatable` Simple implementation of `Rotatable`
""" """
__slots__ = _empty_slots __slots__ = ()
_rotation: float _rotation: float
""" rotation for the object, radians counterclockwise """ """ rotation for the object, radians counterclockwise """
# '''
# Properties ---- Properties
# '''
@property @property
def rotation(self) -> float: def rotation(self) -> float:
""" Rotation, radians counterclockwise """ """ Rotation, radians counterclockwise """
return self._rotation return self._rotation
@rotation.setter @rotation.setter
def rotation(self, val: float) -> None: def rotation(self, val: float):
if not numpy.size(val) == 1: if not numpy.size(val) == 1:
raise MasqueError('Rotation must be a scalar') raise MasqueError('Rotation must be a scalar')
self._rotation = val % (2 * pi) self._rotation = val % (2 * pi)
# '''
# Methods ---- Methods
# '''
def rotate(self, rotation: float) -> Self: def rotate(self: I, rotation: float) -> I:
self.rotation += rotation self.rotation += rotation
return self return self
def set_rotation(self, rotation: float) -> Self: def set_rotation(self: I, rotation: float) -> I:
""" """
Set the rotation to a value Set the rotation to a value
@ -82,13 +84,13 @@ class RotatableImpl(Rotatable, metaclass=ABCMeta):
class Pivotable(metaclass=ABCMeta): class Pivotable(metaclass=ABCMeta):
""" """
Trait class for entities which can be rotated around a point. Abstract class for entities which can be rotated around a point.
This requires that they are `Positionable` but not necessarily `Rotatable` themselves. This requires that they are `Positionable` but not necessarily `Rotatable` themselves.
""" """
__slots__ = () __slots__ = ()
@abstractmethod @abstractmethod
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self: def rotate_around(self: P, pivot: ArrayLike, rotation: float) -> P:
""" """
Rotate the object around a point. Rotate the object around a point.
@ -108,14 +110,11 @@ class PivotableImpl(Pivotable, metaclass=ABCMeta):
""" """
__slots__ = () __slots__ = ()
offset: Any # TODO see if we can get around defining `offset` in PivotableImpl def rotate_around(self: J, pivot: ArrayLike, rotation: float) -> J:
""" `[x_offset, y_offset]` """ pivot = numpy.array(pivot, dtype=float)
self.translate(-pivot)
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self: self.rotate(rotation)
pivot = numpy.asarray(pivot, dtype=float) self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset) #type: ignore #TODO: mypy#3004
cast(Positionable, self).translate(-pivot) self.translate(+pivot)
cast(Rotatable, self).rotate(rotation)
self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset) # type: ignore # mypy#3004
cast(Positionable, self).translate(+pivot)
return self return self
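
The pivot rotation above amounts to "translate so the pivot sits at the origin, rotate, translate back". A self-contained numeric sketch of that math (illustrative only, not repository code):

```python
# Rotate a point 90 degrees counterclockwise around pivot (1, 0):
# (2, 0) -> relative (1, 0) -> rotated (0, 1) -> back to (1, 1).
import numpy

def rotation_matrix_2d(theta: float):
    return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
                        [numpy.sin(theta), +numpy.cos(theta)]])

def rotate_point_around(point, pivot, rotation):
    point = numpy.asarray(point, dtype=float)
    pivot = numpy.asarray(pivot, dtype=float)
    return rotation_matrix_2d(rotation) @ (point - pivot) + pivot

print(rotate_point_around([2, 0], pivot=[1, 0], rotation=numpy.pi / 2))   # ~[1. 1.]
```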

View File

@ -1,24 +1,25 @@
from typing import Self from typing import TypeVar
from abc import ABCMeta, abstractmethod from abc import ABCMeta, abstractmethod
from ..error import MasqueError from ..error import MasqueError
from ..utils import is_scalar from ..utils import is_scalar
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass T = TypeVar('T', bound='Scalable')
I = TypeVar('I', bound='ScalableImpl')
class Scalable(metaclass=ABCMeta): class Scalable(metaclass=ABCMeta):
""" """
Trait class for all scalable entities Abstract class for all scalable entities
""" """
__slots__ = () __slots__ = ()
# '''
# Methods ---- Abstract methods
# '''
@abstractmethod @abstractmethod
def scale_by(self, c: float) -> Self: def scale_by(self: T, c: float) -> T:
""" """
Scale the entity by a factor Scale the entity by a factor
@ -35,34 +36,34 @@ class ScalableImpl(Scalable, metaclass=ABCMeta):
""" """
Simple implementation of Scalable Simple implementation of Scalable
""" """
__slots__ = _empty_slots __slots__ = ()
_scale: float _scale: float
""" scale factor for the entity """ """ scale factor for the entity """
# '''
# Properties ---- Properties
# '''
@property @property
def scale(self) -> float: def scale(self) -> float:
return self._scale return self._scale
@scale.setter @scale.setter
def scale(self, val: float) -> None: def scale(self, val: float):
if not is_scalar(val): if not is_scalar(val):
raise MasqueError('Scale must be a scalar') raise MasqueError('Scale must be a scalar')
if not val > 0: if not val > 0:
raise MasqueError('Scale must be positive') raise MasqueError('Scale must be positive')
self._scale = val self._scale = val
# '''
# Methods ---- Methods
# '''
def scale_by(self, c: float) -> Self: def scale_by(self: I, c: float) -> I:
self.scale *= c self.scale *= c
return self return self
def set_scale(self, scale: float) -> Self: def set_scale(self: I, scale: float) -> I:
""" """
Set the scale to a value Set the scale to a value

165
masque/utils.py Normal file
View File

@ -0,0 +1,165 @@
"""
Various helper functions
"""
from typing import Any, Union, Tuple, Sequence, Dict, List
from abc import ABCMeta
import numpy
from numpy.typing import NDArray, ArrayLike
# Type definitions
layer_t = Union[int, Tuple[int, int], str]
annotations_t = Dict[str, List[Union[int, float, str]]]
def is_scalar(var: Any) -> bool:
"""
Alias for 'not hasattr(var, "__len__")'
Args:
var: Checks if `var` has a length.
"""
return not hasattr(var, "__len__")
def get_bit(bit_string: Any, bit_id: int) -> bool:
"""
Interprets bit number `bit_id` from the right (lsb) of `bit_string` as a boolean
Args:
bit_string: Bit string to test
bit_id: Bit number, 0-indexed from the right (lsb)
Returns:
Boolean value of the requested bit
"""
return bit_string & (1 << bit_id) != 0
def set_bit(bit_string: Any, bit_id: int, value: bool) -> Any:
"""
Returns `bit_string`, with bit number `bit_id` set to boolean `value`.
Args:
bit_string: Bit string to alter
bit_id: Bit number, 0-indexed from right (lsb)
value: Boolean value to set bit to
Returns:
Altered `bit_string`
"""
mask = (1 << bit_id)
bit_string &= ~mask
if value:
bit_string |= mask
return bit_string
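
A quick self-contained usage check for the two bit helpers (definitions copied here so the snippet runs on its own); this is the kind of flag manipulation used for GDSII record bit-fields.

```python
def get_bit(bit_string, bit_id):
    return bit_string & (1 << bit_id) != 0

def set_bit(bit_string, bit_id, value):
    mask = 1 << bit_id
    bit_string &= ~mask
    if value:
        bit_string |= mask
    return bit_string

flags = 0b0100
assert get_bit(flags, 2) and not get_bit(flags, 0)
flags = set_bit(flags, 0, True)    # 0b0101
flags = set_bit(flags, 2, False)   # 0b0001
assert flags == 0b0001
```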
def rotation_matrix_2d(theta: float) -> NDArray[numpy.float64]:
"""
2D rotation matrix for rotating counterclockwise around the origin.
Args:
theta: Angle to rotate, in radians
Returns:
rotation matrix
"""
return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
[numpy.sin(theta), +numpy.cos(theta)]])
def normalize_mirror(mirrored: Sequence[bool]) -> Tuple[bool, float]:
"""
Converts 0-2 mirror operations `(mirror_across_x_axis, mirror_across_y_axis)`
into 0-1 mirror operations and a rotation
Args:
mirrored: `(mirror_across_x_axis, mirror_across_y_axis)`
Returns:
`mirror_across_x_axis` (bool) and
`angle_to_rotate` in radians
"""
mirrored_x, mirrored_y = mirrored
mirror_x = (mirrored_x != mirrored_y) # XOR
angle = numpy.pi if mirrored_y else 0
return mirror_x, angle
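
A small sanity check of the convention used by `normalize_mirror` (definition copied for a standalone run): mirroring across both axes is equivalent to a 180-degree rotation with no mirroring at all.

```python
import numpy

def normalize_mirror(mirrored):
    mirrored_x, mirrored_y = mirrored
    mirror_x = (mirrored_x != mirrored_y)       # XOR
    angle = numpy.pi if mirrored_y else 0
    return mirror_x, angle

assert normalize_mirror((True, True)) == (False, numpy.pi)   # double mirror == 180 deg rotation
assert normalize_mirror((False, True)) == (True, numpy.pi)
assert normalize_mirror((True, False)) == (True, 0)
```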
def remove_duplicate_vertices(vertices: ArrayLike, closed_path: bool = True) -> NDArray[numpy.float64]:
"""
Given a list of vertices, remove any consecutive duplicates.
Args:
vertices: `[[x0, y0], [x1, y1], ...]`
closed_path: If True, `vertices` is interpreted as an implicitly-closed path
(i.e. the last vertex will be removed if it is the same as the first)
Returns:
`vertices` with no consecutive duplicates.
"""
vertices = numpy.array(vertices)
duplicates = (vertices == numpy.roll(vertices, 1, axis=0)).all(axis=1)
if not closed_path:
duplicates[0] = False
return vertices[~duplicates]
def remove_colinear_vertices(vertices: ArrayLike, closed_path: bool = True) -> NDArray[numpy.float64]:
"""
Given a list of vertices, remove any superfluous vertices (i.e.
those which lie along the line formed by their neighbors)
Args:
vertices: Nx2 ndarray of vertices
closed_path: If `True`, the vertices are assumed to represent an implicitly
closed path. If `False`, the path is assumed to be open. Default `True`.
Returns:
`vertices` with colinear (superfluous) vertices removed.
"""
vertices = remove_duplicate_vertices(vertices)
# Check for dx0/dy0 == dx1/dy1
dv = numpy.roll(vertices, -1, axis=0) - vertices # [v1-v0, v2-v1, ...]
dxdy = dv * numpy.roll(dv, 1, axis=0)[:, ::-1] # [[dx0*dy_-1, dy0*dx_-1], [dx1*dy0, dy1*dx0], ...]
dxdy_diff = numpy.abs(numpy.diff(dxdy, axis=1))[:, 0]
err_mult = 2 * numpy.abs(dxdy).sum(axis=1) + 1e-40
slopes_equal = (dxdy_diff / err_mult) < 1e-15
if not closed_path:
slopes_equal[[0, -1]] = False
return vertices[~slopes_equal]
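
Example usage (assuming these helpers are importable from `masque.utils`, as the exports elsewhere in this changeset suggest): redundant midpoints along a square's edges are dropped, leaving only the corners.

```python
from masque.utils import remove_colinear_vertices

square_with_midpoints = [
    [0, 0], [1, 0], [2, 0], [2, 1],
    [2, 2], [1, 2], [0, 2], [0, 1],
]
cleaned = remove_colinear_vertices(square_with_midpoints)
print(cleaned)   # the four corners: (0, 0), (2, 0), (2, 2), (0, 2)
```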
class AutoSlots(ABCMeta):
"""
Metaclass for automatically generating __slots__ based on superclass type annotations.
Superclasses must set `__slots__ = ()` to make this work properly.
This is a workaround for the fact that non-empty `__slots__` can't be used
with multiple inheritance. Since we only use multiple inheritance with abstract
classes, they can have empty `__slots__` and their attribute type annotations
can be used to generate a full `__slots__` for the concrete class.
"""
def __new__(cls, name, bases, dctn):
parents = set()
for base in bases:
parents |= set(base.mro())
slots = tuple(dctn.get('__slots__', tuple()))
for parent in parents:
if not hasattr(parent, '__annotations__'):
continue
slots += tuple(getattr(parent, '__annotations__').keys())
dctn['__slots__'] = slots
return super().__new__(cls, name, bases, dctn)
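
A minimal sketch of the intended `AutoSlots` usage (the trait classes here are illustrative, not the real masque traits): abstract bases keep `__slots__ = ()` but annotate their attributes, and the concrete class gets a full `__slots__` generated from those annotations.

```python
from abc import ABCMeta
from masque.utils import AutoSlots   # assumes the export shown in utils/__init__.py

class HasOffset(metaclass=ABCMeta):
    __slots__ = ()
    _offset: tuple       # annotated but not assigned, so no slot conflict

class HasRotation(metaclass=ABCMeta):
    __slots__ = ()
    _rotation: float

class MyShape(HasOffset, HasRotation, metaclass=AutoSlots):
    pass

print(MyShape.__slots__)   # contains '_offset' and '_rotation'
```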

View File

@ -1,41 +1,15 @@
""" """
Various helper functions, type definitions, etc. Various helper functions, type definitions, etc.
""" """
from .types import ( from .types import layer_t, annotations_t
layer_t as layer_t,
annotations_t as annotations_t,
SupportsBool as SupportsBool,
)
from .array import is_scalar as is_scalar
from .autoslots import AutoSlots as AutoSlots
from .deferreddict import DeferredDict as DeferredDict
from .decorators import oneshot as oneshot
from .bitwise import ( from .array import is_scalar
get_bit as get_bit, from .autoslots import AutoSlots
set_bit as set_bit,
) from .bitwise import get_bit, set_bit
from .vertices import ( from .vertices import (
remove_duplicate_vertices as remove_duplicate_vertices, remove_duplicate_vertices, remove_colinear_vertices, poly_contains_points
remove_colinear_vertices as remove_colinear_vertices,
poly_contains_points as poly_contains_points,
)
from .transform import (
rotation_matrix_2d as rotation_matrix_2d,
normalize_mirror as normalize_mirror,
rotate_offsets_around as rotate_offsets_around,
apply_transforms as apply_transforms,
)
from .comparisons import (
annotation2key as annotation2key,
annotations_lt as annotations_lt,
annotations_eq as annotations_eq,
layer2key as layer2key,
ports_lt as ports_lt,
ports_eq as ports_eq,
rep2key as rep2key,
) )
from .transform import rotation_matrix_2d, normalize_mirror
from . import ports2data as ports2data #from . import pack2d
from . import pack2d as pack2d

View File

@ -12,16 +12,16 @@ class AutoSlots(ABCMeta):
classes, they can have empty `__slots__` and their attribute type annotations classes, they can have empty `__slots__` and their attribute type annotations
can be used to generate a full `__slots__` for the concrete class. can be used to generate a full `__slots__` for the concrete class.
""" """
def __new__(cls, name, bases, dctn): # noqa: ANN001,ANN204 def __new__(cls, name, bases, dctn):
parents = set() parents = set()
for base in bases: for base in bases:
parents |= set(base.mro()) parents |= set(base.mro())
slots = tuple(dctn.get('__slots__', ())) slots = tuple(dctn.get('__slots__', tuple()))
for parent in parents: for parent in parents:
if not hasattr(parent, '__annotations__'): if not hasattr(parent, '__annotations__'):
continue continue
slots += tuple(parent.__annotations__.keys()) slots += tuple(getattr(parent, '__annotations__').keys())
dctn['__slots__'] = slots dctn['__slots__'] = slots
return super().__new__(cls, name, bases, dctn) return super().__new__(cls, name, bases, dctn)

View File

@ -1,106 +0,0 @@
from typing import Any
from .types import annotations_t, layer_t
from ..ports import Port
from ..repetition import Repetition
def annotation2key(aaa: int | float | str) -> tuple[bool, Any]:
return (isinstance(aaa, str), aaa)
def annotations_lt(aa: annotations_t, bb: annotations_t) -> bool:
if aa is None:
return bb is not None
elif bb is None: # noqa: RET505
return False
if len(aa) != len(bb):
return len(aa) < len(bb)
keys_a = tuple(sorted(aa.keys()))
keys_b = tuple(sorted(bb.keys()))
if keys_a != keys_b:
return keys_a < keys_b
for key in keys_a:
va = aa[key]
vb = bb[key]
if len(va) != len(vb):
return len(va) < len(vb)
for aaa, bbb in zip(va, vb, strict=True):
if aaa != bbb:
return annotation2key(aaa) < annotation2key(bbb)
return False
def annotations_eq(aa: annotations_t, bb: annotations_t) -> bool:
if aa is None:
return bb is None
elif bb is None: # noqa: RET505
return False
if len(aa) != len(bb):
return False
keys_a = tuple(sorted(aa.keys()))
keys_b = tuple(sorted(bb.keys()))
if keys_a != keys_b:
return False
for key in keys_a:
va = aa[key]
vb = bb[key]
if len(va) != len(vb):
return False
for aaa, bbb in zip(va, vb, strict=True):
if aaa != bbb:
return False
return True
def layer2key(layer: layer_t) -> tuple[bool, bool, Any]:
is_int = isinstance(layer, int)
is_str = isinstance(layer, str)
layer_tup = (layer,) if (is_str or is_int) else layer
tup = (
is_str,
not is_int,
layer_tup,
)
return tup
def rep2key(repetition: Repetition | None) -> tuple[bool, Repetition | None]:
return (repetition is None, repetition)
def ports_eq(aa: dict[str, Port], bb: dict[str, Port]) -> bool:
if len(aa) != len(bb):
return False
keys = sorted(aa.keys())
if keys != sorted(bb.keys()):
return False
return all(aa[kk] == bb[kk] for kk in keys)
def ports_lt(aa: dict[str, Port], bb: dict[str, Port]) -> bool:
if len(aa) != len(bb):
return len(aa) < len(bb)
aa_keys = tuple(sorted(aa.keys()))
bb_keys = tuple(sorted(bb.keys()))
if aa_keys != bb_keys:
return aa_keys < bb_keys
for key in aa_keys:
pa = aa[key]
pb = bb[key]
if pa != pb:
return pa < pb
return False
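
One practical consequence of `annotation2key` (assuming a masque version whose `masque.utils` exports it, as in the `__init__.py` shown above): mixed int/float/str annotation values can be sorted without raising, with numbers ordering before strings.

```python
from masque.utils import annotation2key

values = ['b', 3, 'a', 1.5]
print(sorted(values, key=annotation2key))   # [1.5, 3, 'a', 'b']
```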

View File

@ -1,21 +0,0 @@
from collections.abc import Callable
from functools import wraps
from ..error import OneShotError
def oneshot(func: Callable) -> Callable:
"""
Raises a OneShotError if the decorated function is called more than once
"""
expired = False
@wraps(func)
def wrapper(*args, **kwargs): # noqa: ANN202
nonlocal expired
if expired:
raise OneShotError(func.__name__)
expired = True
return func(*args, **kwargs)
return wrapper
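
Usage sketch for the `oneshot` decorator, assuming the `masque.utils` and `masque.error` exports shown in this changeset:

```python
from masque.utils import oneshot
from masque.error import OneShotError

@oneshot
def build_masks():
    return 'built'

print(build_masks())        # 'built'
try:
    build_masks()           # second call is rejected
except OneShotError as err:
    print('already called:', err)
```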

View File

@ -1,13 +1,37 @@
""" """
2D bin-packing 2D bin-packing
""" """
from collections.abc import Sequence, Mapping, Callable from typing import Tuple, List, Set, Sequence, Callable
import numpy import numpy
from numpy.typing import NDArray, ArrayLike from numpy.typing import NDArray, ArrayLike
from ..error import MasqueError from ..error import MasqueError
from ..pattern import Pattern from ..pattern import Pattern
from ..subpattern import SubPattern
def pack_patterns(patterns: Sequence[Pattern],
regions: numpy.ndarray,
spacing: Tuple[float, float],
presort: bool = True,
allow_rejects: bool = True,
packer: Callable = maxrects_bssf,
) -> Tuple[Pattern, List[Pattern]]:
half_spacing = numpy.array(spacing) / 2
bounds = [pp.get_bounds() for pp in patterns]
sizes = [bb[1] - bb[0] + spacing if bb is not None else spacing for bb in bounds]
offsets = [half_spacing - bb[0] if bb is not None else (0, 0) for bb in bounds]
locations, reject_inds = packer(sizes, regions, presort=presort, allow_rejects=allow_rejects)
pat = Pattern()
pat.subpatterns = [SubPattern(pp, offset=oo + loc)
for pp, oo, loc in zip(patterns, offsets, locations)]
rejects = [patterns[ii] for ii in reject_inds]
return pat, rejects
def maxrects_bssf( def maxrects_bssf(
@ -15,36 +39,18 @@ def maxrects_bssf(
containers: ArrayLike, containers: ArrayLike,
presort: bool = True, presort: bool = True,
allow_rejects: bool = True, allow_rejects: bool = True,
) -> tuple[NDArray[numpy.float64], set[int]]: ) -> Tuple[NDArray[numpy.float64], Set[int]]:
""" """
Pack rectangles `rects` into regions `containers` using the "maximal rectangles best short side fit" sizes should be Nx2
algorithm (maxrects_bssf) from "A thousand ways to pack the bin", Jukka Jylanki, 2010. regions should be Mx4 (xmin, ymin, xmax, ymax)
This algorithm gives the best results, but is asymptotically slower than `guillotine_bssf_sas`.
Args:
rects: Nx2 array of rectangle sizes `[[x_size0, y_size0], ...]`.
containers: Mx4 array of regions into which `rects` will be placed, specified using their
corner coordinates `[[x_min0, y_min0, x_max0, y_max0], ...]`.
presort: If `True` (default), largest-shortest-side rectangles will be placed
first. Otherwise, they will be placed in the order provided.
allow_rejects: If `False`, `MasqueError` will be raised if any rectangle cannot be placed.
Returns:
`[[x_min0, y_min0], ...]` placement locations for `rects`, with the same ordering.
The second element of the returned tuple is a set of indices of `rects` entries which were rejected; their
corresponding placement locations should be ignored.
Raises:
MasqueError if `allow_rejects` is `False` and some `rects` could not be placed.
""" """
regions = numpy.asarray(containers, dtype=float) regions = numpy.array(containers, copy=False, dtype=float)
rect_sizes = numpy.asarray(rects, dtype=float) rect_sizes = numpy.array(rects, copy=False, dtype=float)
rect_locs = numpy.zeros_like(rect_sizes) rect_locs = numpy.zeros_like(rect_sizes)
rejected_inds = set() rejected_inds = set()
if presort: if presort:
rotated_sizes = numpy.sort(rect_sizes, axis=1) # shortest side first rotated_sizes = numpy.sort(rect_sizes, axis=0) # shortest side first
rect_order = numpy.lexsort(rotated_sizes.T)[::-1] # Descending shortest side rect_order = numpy.lexsort(rotated_sizes.T)[::-1] # Descending shortest side
rect_sizes = rect_sizes[rect_order] rect_sizes = rect_sizes[rect_order]
@ -62,14 +68,14 @@ def maxrects_bssf(
''' Place the rect ''' ''' Place the rect '''
# Best short-side fit (bssf) to pick a region # Best short-side fit (bssf) to pick a region
region_sizes = regions[:, 2:] - regions[:, :2] bssf_scores = ((regions[:, 2:] - regions[:, :2]) - rect_size).min(axis=1).astype(float)
bssf_scores = (region_sizes - rect_size).min(axis=1).astype(float)
bssf_scores[bssf_scores < 0] = numpy.inf # doesn't fit! bssf_scores[bssf_scores < 0] = numpy.inf # doesn't fit!
rr = bssf_scores.argmin() rr = bssf_scores.argmin()
if numpy.isinf(bssf_scores[rr]): if numpy.isinf(bssf_scores[rr]):
if allow_rejects: if allow_rejects:
rejected_inds.add(rect_ind) rejected_inds.add(rect_ind)
continue continue
else:
raise MasqueError(f'Failed to find a suitable location for rectangle {rect_ind}') raise MasqueError(f'Failed to find a suitable location for rectangle {rect_ind}')
# Read out location # Read out location
@ -98,146 +104,62 @@ def maxrects_bssf(
r_top[:, 1] = loc[1] + rect_size[1] r_top[:, 1] = loc[1] + rect_size[1]
regions = numpy.vstack((regions[~intersects], r_lft, r_bot, r_rgt, r_top)) regions = numpy.vstack((regions[~intersects], r_lft, r_bot, r_rgt, r_top))
if presort:
unsort_order = rect_order.argsort()
rect_locs = rect_locs[unsort_order]
rejected_inds = set(unsort_order[list(rejected_inds)])
return rect_locs, rejected_inds return rect_locs, rejected_inds
def guillotine_bssf_sas( def guillotine_bssf_sas(rect_sizes: numpy.ndarray,
rects: ArrayLike, regions: numpy.ndarray,
containers: ArrayLike,
presort: bool = True, presort: bool = True,
allow_rejects: bool = True, allow_rejects: bool = True,
) -> tuple[NDArray[numpy.float64], set[int]]: ) -> Tuple[numpy.ndarray, Set[int]]:
""" """
Pack rectangles `rects` into regions `containers` using the "guillotine best short side fit with sizes should be Nx2
shorter axis split rule" algorithm (guillotine-BSSF-SAS) from "A thousand ways to pack the bin", regions should be Mx4 (xmin, ymin, xmax, ymax)
Jukka Jylanki, 2010. #TODO: test me!
# TODO add rectangle-merge?
This algorithm gives worse results than `maxrects_bssf`, but is asymptotically faster.
# TODO consider adding rectangle-merge?
# TODO guillotine could use some additional testing
Args:
rects: Nx2 array of rectangle sizes `[[x_size0, y_size0], ...]`.
containers: Mx4 array of regions into which `rects` will be placed, specified using their
corner coordinates `[[x_min0, y_min0, x_max0, y_max0], ...]`.
presort: If `True` (default), largest-shortest-side rectangles will be placed
first. Otherwise, they will be placed in the order provided.
allow_rejects: If `False`, `MasqueError` will be raised if any rectangle cannot be placed.
Returns:
`[[x_min0, y_min0], ...]` placement locations for `rects`, with the same ordering.
The second element of the returned tuple is a set of indices of `rects` entries which were rejected; their
corresponding placement locations should be ignored.
Raises:
MasqueError if `allow_rejects` is `False` and some `rects` could not be placed.
""" """
regions = numpy.asarray(containers, dtype=float) rect_sizes = numpy.array(rect_sizes)
rect_sizes = numpy.asarray(rects, dtype=float)
rect_locs = numpy.zeros_like(rect_sizes) rect_locs = numpy.zeros_like(rect_sizes)
rejected_inds = set() rejected_inds = set()
if presort: if presort:
rotated_sizes = numpy.sort(rect_sizes, axis=1) # shortest side first rotated_sizes = numpy.sort(rect_sizes, axis=0) # shortest side first
rect_order = numpy.lexsort(rotated_sizes.T)[::-1] # Descending shortest side rect_order = numpy.lexsort(rotated_sizes.T)[::-1] # Descending shortest side
rect_sizes = rect_sizes[rect_order] rect_sizes = rect_sizes[rect_order]
for rect_ind, rect_size in enumerate(rect_sizes): for rect_ind, rect_size in enumerate(rect_sizes):
''' Place the rect ''' ''' Place the rect '''
# Best short-side fit (bssf) to pick a region # Best short-side fit (bssf) to pick a region
region_sizes = regions[:, 2:] - regions[:, :2] bssf_scores = ((regions[:, 2:] - regions[:, :2]) - rect_size).min(axis=1).astype(float)
bssf_scores = (region_sizes - rect_size).min(axis=1).astype(float)
bssf_scores[bssf_scores < 0] = numpy.inf # doesn't fit! bssf_scores[bssf_scores < 0] = numpy.inf # doesn't fit!
rr = bssf_scores.argmin() rr = bssf_scores.argmin()
if numpy.isinf(bssf_scores[rr]): if numpy.isinf(bssf_scores[rr]):
if allow_rejects: if allow_rejects:
rejected_inds.add(rect_ind) rejected_inds.add(rect_ind)
continue continue
else:
raise MasqueError(f'Failed to find a suitable location for rectangle {rect_ind}') raise MasqueError(f'Failed to find a suitable location for rectangle {rect_ind}')
# Read out location # Read out location
loc = regions[rr, :2] loc = regions[rr, :2]
rect_locs[rect_ind] = loc rect_locs[rect_ind] = loc
region_size = region_sizes[rr] region_size = regions[rr, 2:] - loc
split_horiz = region_size[0] < region_size[1] split_horiz = region_size[0] < region_size[1]
new_region0 = regions[rr].copy() new_region0 = regions[rr].copy()
new_region1 = new_region0.copy() new_region1 = new_region0.copy()
split_vertex = loc + rect_size split_vert = loc + rect_size
if split_horiz: if split_horiz:
new_region0[2] = split_vertex[0] new_region0[2] = split_vert[0]
new_region0[1] = split_vertex[1] new_region0[1] = split_vert[1]
new_region1[0] = split_vertex[0] new_region1[0] = split_vert[0]
else: else:
new_region0[3] = split_vertex[1] new_region0[3] = split_vert[1]
new_region0[0] = split_vertex[0] new_region0[0] = split_vert[0]
new_region1[1] = split_vertex[1] new_region1[1] = split_vert[1]
regions = numpy.vstack((regions[:rr], regions[rr + 1:], regions = numpy.vstack((regions[:rr], regions[rr + 1:],
new_region0, new_region1)) new_region0, new_region1))
if presort:
unsort_order = rect_order.argsort()
rect_locs = rect_locs[unsort_order]
rejected_inds = set(unsort_order[list(rejected_inds)])
return rect_locs, rejected_inds return rect_locs, rejected_inds
def pack_patterns(
library: Mapping[str, Pattern],
patterns: Sequence[str],
containers: ArrayLike,
spacing: tuple[float, float],
presort: bool = True,
allow_rejects: bool = True,
packer: Callable = maxrects_bssf,
) -> tuple[Pattern, list[str]]:
"""
Pick placement locations for `patterns` inside the regions specified by `containers`.
No rotations are performed.
Args:
library: Library from which `Pattern` objects will be drawn.
patterns: Sequence of pattern names which are to be placed.
containers: Mx4 array of regions into which `patterns` will be placed, specified using their
corner coordinates `[[x_min0, y_min0, x_max0, y_max0], ...]`.
spacing: (x, y) spacing between adjacent patterns. Patterns are effectively expanded outwards
by `spacing / 2` prior to placement, so this also affects pattern position relative to
container edges.
presort: If `True` (default), largest-shortest-side rectangles will be placed
first. Otherwise, they will be placed in the order provided.
allow_rejects: If `False`, `MasqueError` will be raised if any rectangle cannot be placed.
packer: Bin-packing method; see the other functions in this module (namely `maxrects_bssf`
and `guillotine_bssf_sas`).
Returns:
A `Pattern` containing one `Ref` for each entry in `patterns`.
A list of "rejected" pattern names, for which a valid placement location could not be found.
Raises:
MasqueError if `allow_rejects` is `False` and some `patterns` could not be placed.
"""
half_spacing = numpy.asarray(spacing, dtype=float) / 2
bounds = [library[pp].get_bounds() for pp in patterns]
sizes = [bb[1] - bb[0] + spacing if bb is not None else spacing for bb in bounds]
offsets = [half_spacing - bb[0] if bb is not None else (0, 0) for bb in bounds]
locations, reject_inds = packer(sizes, containers, presort=presort, allow_rejects=allow_rejects)
pat = Pattern()
for pp, oo, loc in zip(patterns, offsets, locations, strict=True):
pat.ref(pp, offset=oo + loc)
rejects = [patterns[ii] for ii in reject_inds]
return pat, rejects
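
A hypothetical usage sketch for the packers above, assuming the module is importable as `masque.utils.pack2d` (the file path is not shown in this hunk): pack three rectangles into a single 10x10 region.

```python
import numpy
from masque.utils.pack2d import maxrects_bssf   # assumed module path

rects = numpy.array([[4.0, 4.0], [6.0, 3.0], [2.0, 8.0]])    # (x_size, y_size) per rectangle
regions = numpy.array([[0.0, 0.0, 10.0, 10.0]])              # (x_min, y_min, x_max, y_max)

locations, rejected = maxrects_bssf(rects, regions)
for size, loc in zip(rects, locations):
    print(f'rect of size {size} placed at {loc}')
print('rejected indices:', rejected)
```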

View File

@ -1,178 +0,0 @@
"""
Functions for writing port data into Pattern geometry/annotations/labels (`ports_to_data`)
and retrieving it (`data_to_ports`).
These use the format 'name:ptype angle_deg' written into labels, which are placed at
the port locations. This particular approach is just a sensible default; feel free to
write equivalent functions for your own format or alternate storage methods.
"""
from collections.abc import Sequence, Mapping
import logging
from itertools import chain
import numpy
from ..pattern import Pattern
from ..utils import layer_t
from ..ports import Port
from ..error import PatternError
from ..library import ILibraryView, LibraryView
logger = logging.getLogger(__name__)
def ports_to_data(pattern: Pattern, layer: layer_t) -> Pattern:
"""
Place a text label at each port location, specifying the port data in the format
'name:ptype angle_deg'
This can be used to debug port locations or to automatically generate ports
when reading in a GDS file.
NOTE that `pattern` is modified by this function
Args:
pattern: The pattern which is to have its ports labeled. MODIFIED in-place.
layer: The layer on which the labels will be placed.
Returns:
`pattern`
"""
for name, port in pattern.ports.items():
if port.rotation is None:
angle_deg = numpy.inf
else:
angle_deg = numpy.rad2deg(port.rotation)
pattern.label(layer=layer, string=f'{name}:{port.ptype} {angle_deg:g}', offset=port.offset)
return pattern
def data_to_ports(
layers: Sequence[layer_t],
library: Mapping[str, Pattern],
pattern: Pattern, # Pattern is good since we don't want to do library[name] to avoid infinite recursion.
# LazyLibrary protects against library[ref.target] causing a circular lookup.
# For others, maybe check for cycles up front? TODO
name: str | None = None, # Note: name optional, but arg order different from read(postprocess=)
max_depth: int = 0,
skip_subcells: bool = True,
# TODO missing ok?
) -> Pattern:
"""
# TODO fixup documentation in ports2data
# TODO move to utils.file?
Examine `pattern` for labels specifying port info, and use that info
to fill out its `ports` attribute.
Labels are assumed to be placed at the port locations, and have the format
'name:ptype angle_deg'
Args:
layers: Search for labels on all the given layers.
pattern: Pattern object to scan for labels.
max_depth: Maximum hierarchy depth to search. Default 0, i.e. do not
search any subcells; increase this to search deeper into the hierarchy.
skip_subcells: If port labels are found at a given hierarchy level,
do not continue searching at deeper levels. This allows subcells
to contain their own port info without interfering with supercells'
port data.
Default True.
Returns:
The updated `pattern`. Port labels are not removed.
"""
if pattern.ports:
logger.warning(f'Pattern {name if name else pattern} already had ports, skipping data_to_ports')
return pattern
if not isinstance(library, ILibraryView):
library = LibraryView(library)
data_to_ports_flat(layers, pattern, name)
if (skip_subcells and pattern.ports) or max_depth == 0:
return pattern
# Load ports for all subpatterns, and use any we find
found_ports = False
for target in pattern.refs:
if target is None:
continue
pp = data_to_ports(
layers=layers,
library=library,
pattern=library[target],
name=target,
max_depth=max_depth - 1,
skip_subcells=skip_subcells,
)
found_ports |= bool(pp.ports)
if not found_ports:
return pattern
for target, refs in pattern.refs.items():
if target is None:
continue
if not refs:
continue
for ref in refs:
aa = library.abstract(target)
if not aa.ports:
break
aa.apply_ref_transform(ref)
pattern.check_ports(other_names=aa.ports.keys())
pattern.ports.update(aa.ports)
return pattern
def data_to_ports_flat(
layers: Sequence[layer_t],
pattern: Pattern,
cell_name: str | None = None,
) -> Pattern:
"""
Examine `pattern` for labels specifying port info, and use that info
to fill out its `ports` attribute.
Labels are assumed to be placed at the port locations, and have the format
'name:ptype angle_deg'
The pattern is assumed to be flat (have no `refs`) and have no pre-existing ports.
Args:
layers: Search for labels on all the given layers.
pattern: Pattern object to scan for labels.
cell_name: optional, used for warning message only
Returns:
The updated `pattern`. Port labels are not removed.
"""
labels = list(chain.from_iterable(pattern.labels[layer] for layer in layers))
if not labels:
return pattern
pstr = cell_name if cell_name is not None else repr(pattern)
if pattern.ports:
raise PatternError(f'Pattern "{pstr}" has pre-existing ports!')
local_ports = {}
for label in labels:
name, property_string = label.string.split(':')
properties = property_string.split(' ')
ptype = properties[0]
angle_deg = float(properties[1]) if len(ptype) else 0
xy = label.offset
angle = numpy.deg2rad(angle_deg)
if name in local_ports:
logger.warning(f'Duplicate port "{name}" in pattern "{pstr}"')
local_ports[name] = Port(offset=xy, rotation=angle, ptype=ptype)
pattern.ports.update(local_ports)
return pattern
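
For reference, the 'name:ptype angle_deg' label format parsed above can be exercised standalone:

```python
import numpy

label_string = 'in0:optical 180'          # one port label, as written by ports_to_data
name, property_string = label_string.split(':')
ptype, angle_deg = property_string.split(' ')
rotation = numpy.deg2rad(float(angle_deg))
print(name, ptype, rotation)              # in0 optical 3.141592653589793
```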

View File

@ -1,15 +1,12 @@
""" """
Geometric transforms Geometric transforms
""" """
from collections.abc import Sequence from typing import Sequence, Tuple
from functools import lru_cache
import numpy import numpy
from numpy.typing import NDArray, ArrayLike from numpy.typing import NDArray
from numpy import pi
@lru_cache
def rotation_matrix_2d(theta: float) -> NDArray[numpy.float64]: def rotation_matrix_2d(theta: float) -> NDArray[numpy.float64]:
""" """
2D rotation matrix for rotating counterclockwise around the origin. 2D rotation matrix for rotating counterclockwise around the origin.
@ -20,18 +17,11 @@ def rotation_matrix_2d(theta: float) -> NDArray[numpy.float64]:
Returns: Returns:
rotation matrix rotation matrix
""" """
arr = numpy.array([[numpy.cos(theta), -numpy.sin(theta)], return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
[numpy.sin(theta), +numpy.cos(theta)]]) [numpy.sin(theta), +numpy.cos(theta)]])
# If this was a Manhattan rotation, round to remove some inaccuracies in sin & cos
if numpy.isclose(theta % (pi / 2), 0):
arr = numpy.round(arr)
arr.flags.writeable = False def normalize_mirror(mirrored: Sequence[bool]) -> Tuple[bool, float]:
return arr
def normalize_mirror(mirrored: Sequence[bool]) -> tuple[bool, float]:
""" """
Converts 0-2 mirror operations `(mirror_across_x_axis, mirror_across_y_axis)` Converts 0-2 mirror operations `(mirror_across_x_axis, mirror_across_y_axis)`
into 0-1 mirror operations and a rotation into 0-1 mirror operations and a rotation
@ -48,71 +38,3 @@ def normalize_mirror(mirrored: Sequence[bool]) -> tuple[bool, float]:
mirror_x = (mirrored_x != mirrored_y) # XOR mirror_x = (mirrored_x != mirrored_y) # XOR
angle = numpy.pi if mirrored_y else 0 angle = numpy.pi if mirrored_y else 0
return mirror_x, angle return mirror_x, angle
def rotate_offsets_around(
offsets: NDArray[numpy.float64],
pivot: NDArray[numpy.float64],
angle: float,
) -> NDArray[numpy.float64]:
"""
Rotates offsets around a pivot point.
Args:
offsets: Nx2 array, rows are (x, y) offsets
pivot: (x, y) location to rotate around
angle: rotation angle in radians
Returns:
Nx2 ndarray of (x, y) position after the rotation is applied.
"""
offsets -= pivot
offsets[:] = (rotation_matrix_2d(angle) @ offsets.T).T
offsets += pivot
return offsets
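
Quick numeric check of `rotate_offsets_around` (assuming a masque version whose `masque.utils` exports it, per the `__init__.py` shown earlier); note that it modifies `offsets` in place, so a float array must be passed.

```python
import numpy
from masque.utils import rotate_offsets_around

offsets = numpy.array([[2.0, 0.0], [0.0, 0.0]])
rotate_offsets_around(offsets, pivot=numpy.array([1.0, 0.0]), angle=numpy.pi / 2)
print(offsets)   # ~[[1. 1.], [1. -1.]]
```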
def apply_transforms(
outer: ArrayLike,
inner: ArrayLike,
tensor: bool = False,
) -> NDArray[numpy.float64]:
"""
Apply a set of transforms (`outer`) to a second set (`inner`).
This is used to find the "absolute" transform for nested `Ref`s.
The two transforms should be of shape Ox4 and Ix4.
Rows should be of the form `(x_offset, y_offset, rotation_ccw_rad, mirror_across_x)`.
The output will be of the form (O*I)x4 (if `tensor=False`) or OxIx4 (`tensor=True`).
Args:
outer: Transforms for the container refs. Shape Ox4.
inner: Transforms for the contained refs. Shape Ix4.
tensor: If `True`, an OxIx4 array is returned, with `result[oo, ii, :]` corresponding
to the `oo`th `outer` transform applied to the `ii`th inner transform.
If `False` (default), this is concatenated into `(O*I)x4` to allow simple
chaining into additional `apply_transforms()` calls.
Returns:
OxIx4 or (O*I)x4 array. Final dimension is
`(total_x, total_y, total_rotation_ccw_rad, net_mirrored_x)`.
"""
outer = numpy.atleast_2d(outer).astype(float, copy=False)
inner = numpy.atleast_2d(inner).astype(float, copy=False)
# If mirrored, flip y's
xy_mir = numpy.tile(inner[:, :2], (outer.shape[0], 1, 1)) # dims are (outer, inner, xy)
xy_mir[outer[:, 3].astype(bool), :, 1] *= -1
rot_mats = [rotation_matrix_2d(angle) for angle in outer[:, 2]]
xy = numpy.einsum('ort,oit->oir', rot_mats, xy_mir)
tot = numpy.empty((outer.shape[0], inner.shape[0], 4))
tot[:, :, :2] = outer[:, None, :2] + xy
tot[:, :, 2:] = outer[:, None, 2:] + inner[None, :, 2:] # sum rotations and mirrored
tot[:, :, 2] %= 2 * pi # clamp rot
tot[:, :, 3] %= 2 # clamp mirrored
if tensor:
return tot
return numpy.concatenate(tot)
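
A worked example for `apply_transforms` (again assuming a masque version that provides it): an outer ref at (10, 0) rotated 90 degrees, containing an inner ref at (1, 0), yields an absolute placement of (10, 1) with a net 90-degree rotation and no mirroring.

```python
import numpy
from masque.utils import apply_transforms

outer = [[10.0, 0.0, numpy.pi / 2, 0]]    # (x, y, rotation_ccw_rad, mirror_across_x)
inner = [[1.0, 0.0, 0.0, 0]]
print(apply_transforms(outer, inner))     # ~[[10.  1.  1.5708  0.]]
```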

View File

@ -1,13 +1,8 @@
""" """
Type definitions Type definitions
""" """
from typing import Protocol from typing import Union, Tuple, Sequence, Dict, List
layer_t = int | tuple[int, int] | str layer_t = Union[int, Tuple[int, int], str]
annotations_t = dict[str, list[int | float | str]] annotations_t = Dict[str, List[Union[int, float, str]]]
class SupportsBool(Protocol):
def __bool__(self) -> bool:
...

View File

@ -15,9 +15,9 @@ def remove_duplicate_vertices(vertices: ArrayLike, closed_path: bool = True) ->
(i.e. the last vertex will be removed if it is the same as the first) (i.e. the last vertex will be removed if it is the same as the first)
Returns: Returns:
`vertices` with no consecutive duplicates. This may be a view into the original array. `vertices` with no consecutive duplicates.
""" """
vertices = numpy.asarray(vertices) vertices = numpy.array(vertices)
duplicates = (vertices == numpy.roll(vertices, 1, axis=0)).all(axis=1) duplicates = (vertices == numpy.roll(vertices, 1, axis=0)).all(axis=1)
if not closed_path: if not closed_path:
duplicates[0] = False duplicates[0] = False
@ -35,7 +35,7 @@ def remove_colinear_vertices(vertices: ArrayLike, closed_path: bool = True) -> N
closed path. If `False`, the path is assumed to be open. Default `True`. closed path. If `False`, the path is assumed to be open. Default `True`.
Returns: Returns:
`vertices` with colinear (superflous) vertices removed. May be a view into the original array. `vertices` with colinear (superflous) vertices removed.
""" """
vertices = remove_duplicate_vertices(vertices) vertices = remove_duplicate_vertices(vertices)
@ -73,17 +73,17 @@ def poly_contains_points(
Returns: Returns:
ndarray of booleans, [point0_is_in_shape, point1_is_in_shape, ...] ndarray of booleans, [point0_is_in_shape, point1_is_in_shape, ...]
""" """
points = numpy.asarray(points, dtype=float) points = numpy.array(points, copy=False)
vertices = numpy.asarray(vertices, dtype=float) vertices = numpy.array(vertices, copy=False)
if points.size == 0: if points.size == 0:
return numpy.zeros(0, dtype=numpy.int8) return numpy.zeros(0)
min_bounds = numpy.min(vertices, axis=0)[None, :] min_bounds = numpy.min(vertices, axis=0)[None, :]
max_bounds = numpy.max(vertices, axis=0)[None, :] max_bounds = numpy.max(vertices, axis=0)[None, :]
trivially_outside = ((points < min_bounds).any(axis=1) trivially_outside = ((points < min_bounds).any(axis=1)
| (points > max_bounds).any(axis=1)) # noqa: E128 | (points > max_bounds).any(axis=1))
nontrivial = ~trivially_outside nontrivial = ~trivially_outside
if trivially_outside.all(): if trivially_outside.all():
@ -101,10 +101,10 @@ def poly_contains_points(
dv = numpy.roll(verts, -1, axis=0) - verts dv = numpy.roll(verts, -1, axis=0) - verts
is_left = (dv[:, 0] * (ntpts[..., 1] - verts[:, 1]) # >0 if left of dv, <0 if right, 0 if on the line is_left = (dv[:, 0] * (ntpts[..., 1] - verts[:, 1]) # >0 if left of dv, <0 if right, 0 if on the line
- dv[:, 1] * (ntpts[..., 0] - verts[:, 0])) # noqa: E128 - dv[:, 1] * (ntpts[..., 0] - verts[:, 0]))
winding_number = ((upward & (is_left > 0)).sum(axis=0) winding_number = ((upward & (is_left > 0)).sum(axis=0)
- (downward & (is_left < 0)).sum(axis=0)) # noqa: E128 - (downward & (is_left < 0)).sum(axis=0))
nontrivial_inside = winding_number != 0 # filter nontrivial points based on winding number nontrivial_inside = winding_number != 0 # filter nontrivial points based on winding number
if include_boundary: if include_boundary:
@ -113,3 +113,5 @@ def poly_contains_points(
inside = nontrivial.copy() inside = nontrivial.copy()
inside[nontrivial] = nontrivial_inside inside[nontrivial] = nontrivial_inside
return inside return inside

View File

@ -39,11 +39,11 @@ classifiers = [
"Topic :: Scientific/Engineering :: Electronic Design Automation (EDA)", "Topic :: Scientific/Engineering :: Electronic Design Automation (EDA)",
"Topic :: Scientific/Engineering :: Visualization", "Topic :: Scientific/Engineering :: Visualization",
] ]
requires-python = ">=3.11" requires-python = ">=3.8"
dynamic = ["version"] dynamic = ["version"]
dependencies = [ dependencies = [
"numpy>=1.26", "numpy~=1.21",
"klamath~=1.4", "klamath~=1.2",
] ]
@ -52,41 +52,9 @@ path = "masque/__init__.py"
[project.optional-dependencies] [project.optional-dependencies]
oasis = ["fatamorgana~=0.11"] oasis = ["fatamorgana~=0.11"]
dxf = ["ezdxf~=1.0.2"] dxf = ["ezdxf"]
svg = ["svgwrite"] svg = ["svgwrite"]
visualize = ["matplotlib"] visualize = ["matplotlib"]
text = ["matplotlib", "freetype-py"] text = ["matplotlib", "freetype-py"]
python-gdsii = ["python-gdsii"]
[tool.ruff]
exclude = [
".git",
"dist",
]
line-length = 145
indent-width = 4
lint.dummy-variable-rgx = "^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"
lint.select = [
"NPY", "E", "F", "W", "B", "ANN", "UP", "SLOT", "SIM", "LOG",
"C4", "ISC", "PIE", "PT", "RET", "TCH", "PTH", "INT",
"ARG", "PL", "R", "TRY",
"G010", "G101", "G201", "G202",
"Q002", "Q003", "Q004",
]
lint.ignore = [
#"ANN001", # No annotation
"ANN002", # *args
"ANN003", # **kwargs
"ANN401", # Any
"ANN101", # self: Self
"SIM108", # single-line if / else assignment
"RET504", # x=y+z; return x
"PIE790", # unnecessary pass
"ISC003", # non-implicit string concatenation
"C408", # dict(x=y) instead of {'x': y}
"PLR09", # Too many xxx
"PLR2004", # magic number
"PLC0414", # import x as x
"TRY003", # Long exception message
]