diff --git a/MIGRATION.md b/MIGRATION.md deleted file mode 100644 index 818b133..0000000 --- a/MIGRATION.md +++ /dev/null @@ -1,299 +0,0 @@ -# Migration Guide - -This guide covers changes between the git tag `release` and the current tree. -At `release`, `masque.__version__` was `3.3`; the current tree reports `3.4`. - -Most downstream changes are in `masque/builder/*`, but there are a few other -API changes that may require code updates. - -## Routing API: renamed and consolidated - -The routing helpers were consolidated into a single implementation in -`masque/builder/pather.py`. - -The biggest migration point is that the old routing verbs were renamed: - -| Old API | New API | -| --- | --- | -| `Pather.path(...)` | `Pather.trace(...)` | -| `Pather.path_to(...)` | `Pather.trace_to(...)` | -| `Pather.mpath(...)` | `Pather.trace(...)` / `Pather.trace_to(...)` with multiple ports | -| `Pather.pathS(...)` | `Pather.jog(...)` | -| `Pather.pathU(...)` | `Pather.uturn(...)` | -| `Pather.path_into(...)` | `Pather.trace_into(...)` | -| `Pather.path_from(src, dst)` | `Pather.at(src).trace_into(dst)` | -| `RenderPather.path(...)` | `Pather(..., auto_render=False).trace(...)` | -| `RenderPather.path_to(...)` | `Pather(..., auto_render=False).trace_to(...)` | -| `RenderPather.mpath(...)` | `Pather(..., auto_render=False).trace(...)` / `Pather(..., auto_render=False).trace_to(...)` | -| `RenderPather.pathS(...)` | `Pather(..., auto_render=False).jog(...)` | -| `RenderPather.pathU(...)` | `Pather(..., auto_render=False).uturn(...)` | -| `RenderPather.path_into(...)` | `Pather(..., auto_render=False).trace_into(...)` | -| `RenderPather.path_from(src, dst)` | `Pather(..., auto_render=False).at(src).trace_into(dst)` | - -There are also new convenience wrappers: - -- `straight(...)` for `trace_to(..., ccw=None, ...)` -- `ccw(...)` for `trace_to(..., ccw=True, ...)` -- `cw(...)` for `trace_to(..., ccw=False, ...)` -- `jog(...)` for S-bends -- `uturn(...)` for U-bends - -Important: 
`Pather.path()` is no longer the routing API. It now forwards to -`Pattern.path()` and creates a geometric `Path` element. Any old routing code -that still calls `pather.path(...)` must be renamed. - -### Common rewrites - -```python -# old -pather.path('VCC', False, 6_000) -pather.path_to('VCC', None, x=0) -pather.mpath(['GND', 'VCC'], True, xmax=-10_000, spacing=5_000) -pather.pathS('VCC', offset=-2_000, length=8_000) -pather.pathU('VCC', offset=4_000, length=5_000) -pather.path_into('src', 'dst') -pather.path_from('src', 'dst') - -# new -pather.cw('VCC', 6_000) -pather.straight('VCC', x=0) -pather.ccw(['GND', 'VCC'], xmax=-10_000, spacing=5_000) -pather.jog('VCC', offset=-2_000, length=8_000) -pather.uturn('VCC', offset=4_000, length=5_000) -pather.trace_into('src', 'dst') -pather.at('src').trace_into('dst') -``` - -If you prefer the more explicit spelling, `trace(...)` and `trace_to(...)` -remain the underlying primitives: - -```python -pather.trace('VCC', False, 6_000) -pather.trace_to('VCC', None, x=0) -``` - -## `PortPather` and `.at(...)` - -Routing can now be written in a fluent style via `.at(...)`, which returns a -`PortPather`. - -```python -(rpather.at('VCC') - .trace(False, length=6_000) - .trace_to(None, x=0) -) -``` - -This is additive, not required for migration. Existing code can stay with the -non-fluent `Pather` methods after renaming the verbs above. - -Old `PortPather` helper names were also cleaned up: - -| Old API | New API | -| --- | --- | -| `save_copy(...)` | `mark(...)` | -| `rename_to(...)` | `rename(...)` | - -Example: - -```python -# old -pp.save_copy('branch') -pp.rename_to('feed') - -# new -pp.mark('branch') -pp.rename('feed') -``` - -## Imports and module layout - -`Pather` now provides the remaining builder/routing surface in -`masque/builder/pather.py`. The old module files -`masque/builder/builder.py` and `masque/builder/renderpather.py` were removed. 
- -Update imports like this: - -```python -# old -from masque.builder.builder import Builder -from masque.builder.renderpather import RenderPather - -# new -from masque.builder import Pather - -builder = Pather(...) -deferred = Pather(..., auto_render=False) -``` - -Top-level imports from `masque` also continue to work. - -`Pather` now defaults to `auto_render=True`, so plain construction replaces the -old `Builder` behavior. Use `Pather(..., auto_render=False)` where you -previously used `RenderPather`. - -## `BasicTool` was replaced - -`BasicTool` is no longer exported. Use: - -- `SimpleTool` for the simple "one straight generator + one bend cell" case -- `AutoTool` if you need transitions, multiple candidate straights/bends, or - S-bends/U-bends - -### Old `BasicTool` - -```python -from masque.builder.tools import BasicTool - -tool = BasicTool( - straight=(make_straight, 'input', 'output'), - bend=(lib.abstract('bend'), 'input', 'output'), - transitions={ - 'm2wire': (lib.abstract('via'), 'top', 'bottom'), - }, -) -``` - -### New `AutoTool` - -```python -from masque.builder.tools import AutoTool - -tool = AutoTool( - straights=[ - AutoTool.Straight( - ptype='m1wire', - fn=make_straight, - in_port_name='input', - out_port_name='output', - ), - ], - bends=[ - AutoTool.Bend( - abstract=lib.abstract('bend'), - in_port_name='input', - out_port_name='output', - clockwise=True, - ), - ], - sbends=[], - transitions={ - ('m2wire', 'm1wire'): AutoTool.Transition( - lib.abstract('via'), - 'top', - 'bottom', - ), - }, - default_out_ptype='m1wire', -) -``` - -The key differences are: - -- `BasicTool` -> `SimpleTool` or `AutoTool` -- `straight=(fn, in_name, out_name)` -> `straights=[AutoTool.Straight(...)]` -- `bend=(abstract, in_name, out_name)` -> `bends=[AutoTool.Bend(...)]` -- transition keys are now `(external_ptype, internal_ptype)` tuples -- transitions use `AutoTool.Transition(...)` instead of raw tuples - -If your old `BasicTool` usage did not rely on transitions or 
multiple routing -options, `SimpleTool` is the closest replacement. - -## Custom `Tool` subclasses - -If you maintain your own `Tool` subclass, the interface changed: - -- `Tool.path(...)` became `Tool.traceL(...)` -- `Tool.traceS(...)` and `Tool.traceU(...)` were added for native S/U routes -- `planL()` / `planS()` / `planU()` remain the planning hooks used by deferred rendering - -In practice, a minimal old implementation like: - -```python -class MyTool(Tool): - def path(self, ccw, length, **kwargs): - ... -``` - -should now become: - -```python -class MyTool(Tool): - def traceL(self, ccw, length, **kwargs): - ... -``` - -If you do not implement `traceS()` or `traceU()`, the unified pather will -either fall back to the planning hooks or synthesize those routes from simpler -steps where possible. - -## Transform semantics changed - -The other major user-visible change is that `mirror()` and `rotate()` are now -treated more consistently as intrinsic transforms on low-level objects. - -The practical migration rule is: - -- use `mirror()` / `rotate()` when you want to change the object relative to its - own origin -- use `flip_across(...)`, `rotate_around(...)`, or container-level transforms - when you want to move the object in its parent coordinate system - -### Example: `Port` - -Old behavior: - -```python -port.mirror(0) # changed both offset and orientation -``` - -New behavior: - -```python -port.mirror(0) # changes orientation only -port.flip_across(axis=0) # old "mirror in the parent pattern" behavior -``` - -### What to audit - -Check code that calls: - -- `Port.mirror(...)` -- `Ref.rotate(...)` -- `Ref.mirror(...)` -- `Label.rotate_around(...)` / `Label.mirror(...)` - -If that code expected offsets or repetition grids to move automatically, it -needs updating. For whole-pattern transforms, prefer calling `Pattern.mirror()` -or `Pattern.rotate_around(...)` at the container level. 
- -## Other user-facing changes - -### DXF environments - -If you install the DXF extra, the supported `ezdxf` baseline moved from -`~=1.0.2` to `~=1.4`. Any pinned environments should be updated accordingly. - -### New exports - -These are additive, but available now from `masque` and `masque.builder`: - -- `PortPather` -- `SimpleTool` -- `AutoTool` -- `boolean` - -## Minimal migration checklist - -If your code uses the routing stack, do these first: - -1. Replace `path`/`path_to`/`mpath`/`path_into` calls with - `trace`/`trace_to`/multi-port `trace`/`trace_into`. -2. Replace `BasicTool` with `SimpleTool` or `AutoTool`. -3. Fix imports that still reference `masque.builder.builder` or - `masque.builder.renderpather`. -4. Audit any low-level `mirror()` usage, especially on `Port` and `Ref`. - -If your code only uses `Pattern`, `Library`, `place()`, and `plug()` without the -routing helpers, you may not need any changes beyond the transform audit and any -stale imports. diff --git a/README.md b/README.md index 71c37f0..250eef3 100644 --- a/README.md +++ b/README.md @@ -37,55 +37,6 @@ A layout consists of a hierarchy of `Pattern`s stored in a single `Library`. Each `Pattern` can contain `Ref`s pointing at other patterns, `Shape`s, `Label`s, and `Port`s. -Library / Pattern hierarchy: -``` - +-----------------------------------------------------------------------+ - | Library | - | | - | Name: "MyChip" ...> Name: "Transistor" | - | +---------------------------+ : +---------------------------+ | - | | [Pattern] | : | [Pattern] | | - | | | : | | | - | | shapes: {...} | : | shapes: { | | - | | ports: {...} | : | "Si": [, ...] 
| | - | | | : | "M1": [, ...]}| | - | | refs: | : | ports: {G, S, D} | | - | | "Transistor": [Ref, Ref]|..: +---------------------------+ | - | +---------------------------+ | - | | - | # (`refs` keys resolve to Patterns within the Library) | - +-----------------------------------------------------------------------+ -``` - - -Pattern internals: -``` - +---------------------------------------------------------------+ - | [Pattern] | - | | - | shapes: { | - | (1, 0): [Polygon, Circle, ...], # Geometry by layer | - | (2, 0): [Path, ...] | - | "M1" : [Path, ...] | - | "M2" : [Polygon, ...] | - | } | - | | - | refs: { # Key sets target name, Ref sets transform | - | "my_cell": [ | - | Ref(offset=(0,0), rotation=0), | - | Ref(offset=(10,0), rotation=R90, repetition=Grid(...)) | - | ] | - | } | - | | - | ports: { | - | "in": Port(offset=(0,0), rotation=0, ptype="M1"), | - | "out": Port(offset=(10,0), rotation=R180, ptype="wg") | - | } | - | | - +---------------------------------------------------------------+ -``` - - `masque` departs from several "classic" GDSII paradigms: - A `Pattern` object does not store its own name. A name is only assigned when the pattern is placed into a `Library`, which is effectively a name->`Pattern` mapping. @@ -145,7 +96,7 @@ References are accomplished by listing the target's name, not its `Pattern` obje in order to create a reference, but they also need to access the pattern's ports. * One way to provide this data is through an `Abstract`, generated via `Library.abstract()` or through a `Library.abstract_view()`. - * Another way is use `Pather.place()` or `Pather.plug()`, which automatically creates + * Another way is use `Builder.place()` or `Builder.plug()`, which automatically creates an `Abstract` from its internally-referenced `Library`. @@ -193,8 +144,8 @@ my_pattern.ref(new_name, ...) # instantiate the cell # In practice, you may do lots of my_pattern.ref(lib << make_tree(...), ...) 
-# With a `Pather` and `place()`/`plug()` the `lib <<` portion can be implicit: -my_builder = Pather(library=lib, ...) +# With a `Builder` and `place()`/`plug()` the `lib <<` portion can be implicit: +my_builder = Builder(library=lib, ...) ... my_builder.place(make_tree(...)) ``` @@ -277,6 +228,11 @@ my_pattern.ref(_make_my_subpattern(), offset=..., ...) ## TODO +* Rework naming/args for path-related (Builder, PortPather, path/pathL/pathS/pathU, path_to, mpath) * PolyCollection & arrow-based read/write +* pather and renderpather examples, including .at() (PortPather) * Bus-to-bus connections? -* tuple / string layer auto-translation +* Tests tests tests +* Better interface for polygon operations (e.g. with `pyclipper`) + - de-embedding + - boolean ops diff --git a/examples/ellip_grating.py b/examples/ellip_grating.py index 57b170c..a51a27e 100644 --- a/examples/ellip_grating.py +++ b/examples/ellip_grating.py @@ -6,7 +6,7 @@ from masque.file import gdsii from masque import Arc, Pattern -def main() -> None: +def main(): pat = Pattern() layer = (0, 0) pat.shapes[layer].extend([ diff --git a/examples/generate_gds_perf.py b/examples/generate_gds_perf.py deleted file mode 100644 index 6bdd999..0000000 --- a/examples/generate_gds_perf.py +++ /dev/null @@ -1,5 +0,0 @@ -from masque.file.gdsii_perf import main - - -if __name__ == '__main__': - raise SystemExit(main()) diff --git a/examples/nested_poly_test.py b/examples/nested_poly_test.py index 60e0a3e..de51d6a 100644 --- a/examples/nested_poly_test.py +++ b/examples/nested_poly_test.py @@ -1,5 +1,7 @@ +import numpy from pyclipper import ( - Pyclipper, PT_SUBJECT, CT_UNION, PFT_NONZERO, + Pyclipper, PT_CLIP, PT_SUBJECT, CT_UNION, CT_INTERSECTION, PFT_NONZERO, + scale_to_clipper, scale_from_clipper, ) p = Pyclipper() p.AddPaths([ @@ -10,8 +12,8 @@ p.AddPaths([ ], PT_SUBJECT, closed=True) #p.Execute2? #p.Execute? 
-p.Execute(CT_UNION, PFT_NONZERO, PFT_NONZERO) -p.Execute(CT_UNION, PFT_NONZERO, PFT_NONZERO) +p.Execute(PT_UNION, PT_NONZERO, PT_NONZERO) +p.Execute(CT_UNION, PT_NONZERO, PT_NONZERO) p.Execute(CT_UNION, PFT_NONZERO, PFT_NONZERO) p = Pyclipper() diff --git a/examples/profile_gdsii_readers.py b/examples/profile_gdsii_readers.py deleted file mode 100644 index fb7c99e..0000000 --- a/examples/profile_gdsii_readers.py +++ /dev/null @@ -1,131 +0,0 @@ -from __future__ import annotations - -import argparse -import importlib -import json -import time -from pathlib import Path -from typing import Any - -from masque import LibraryError - - -READERS: dict[str, tuple[str, tuple[str, ...]]] = { - 'gdsii': ('masque.file.gdsii', ('readfile',)), - 'gdsii_arrow': ('masque.file.gdsii_arrow', ('readfile', 'arrow_import', 'arrow_convert')), - } - - -def _summarize_library(path: Path, elapsed_s: float, info: dict[str, object], lib: object) -> dict[str, object]: - assert hasattr(lib, '__len__') - assert hasattr(lib, 'tops') - tops = lib.tops() # type: ignore[no-any-return, attr-defined] - try: - unique_top = lib.top() # type: ignore[no-any-return, attr-defined] - except LibraryError: - unique_top = None - - return { - 'path': str(path), - 'elapsed_s': elapsed_s, - 'library_name': info['name'], - 'cell_count': len(lib), # type: ignore[arg-type] - 'topcells': tops, - 'topcell': unique_top, - } - - -def _summarize_arrow_import(path: Path, elapsed_s: float, arrow_arr: Any) -> dict[str, object]: - libarr = arrow_arr[0] - return { - 'path': str(path), - 'elapsed_s': elapsed_s, - 'arrow_rows': len(arrow_arr), - 'library_name': libarr['lib_name'].as_py(), - 'cell_count': len(libarr['cells']), - 'layer_count': len(libarr['layers']), - } - - -def _profile_stage(module: Any, stage: str, path: Path) -> dict[str, object]: - start = time.perf_counter() - - if stage == 'readfile': - lib, info = module.readfile(path) - elapsed_s = time.perf_counter() - start - return _summarize_library(path, elapsed_s, 
info, lib) - - if stage == 'arrow_import': - if hasattr(module, 'readfile_arrow'): - libarr, _info = module.readfile_arrow(path) - elapsed_s = time.perf_counter() - start - return { - 'path': str(path), - 'elapsed_s': elapsed_s, - 'arrow_rows': 1, - 'library_name': libarr['lib_name'].as_py(), - 'cell_count': len(libarr['cells']), - 'layer_count': len(libarr['layers']), - } - - arrow_arr = module._read_to_arrow(path) - elapsed_s = time.perf_counter() - start - return _summarize_arrow_import(path, elapsed_s, arrow_arr) - - if stage == 'arrow_convert': - arrow_arr = module._read_to_arrow(path) - libarr = arrow_arr[0] - start = time.perf_counter() - lib, info = module.read_arrow(libarr) - elapsed_s = time.perf_counter() - start - return _summarize_library(path, elapsed_s, info, lib) - - raise ValueError(f'Unsupported stage {stage!r}') - - -def build_arg_parser() -> argparse.ArgumentParser: - parser = argparse.ArgumentParser(description='Profile GDS readers with a stable end-to-end workload.') - parser.add_argument('--reader', choices=sorted(READERS), required=True) - parser.add_argument('--stage', default='readfile') - parser.add_argument('--path', type=Path, required=True) - parser.add_argument('--warmup', type=int, default=1) - parser.add_argument('--repeat', type=int, default=1) - parser.add_argument('--output-json', type=Path) - return parser - - -def main(argv: list[str] | None = None) -> int: - parser = build_arg_parser() - args = parser.parse_args(argv) - - module_name, stages = READERS[args.reader] - if args.stage not in stages: - parser.error(f'reader {args.reader!r} only supports stages: {", ".join(stages)}') - - module = importlib.import_module(module_name) - path = args.path.expanduser().resolve() - - for _ in range(args.warmup): - _profile_stage(module, args.stage, path) - - runs = [] - for _ in range(args.repeat): - runs.append(_profile_stage(module, args.stage, path)) - - payload = { - 'reader': args.reader, - 'stage': args.stage, - 'warmup': 
args.warmup, - 'repeat': args.repeat, - 'runs': runs, - } - rendered = json.dumps(payload, indent=2, sort_keys=True) - if args.output_json is not None: - args.output_json.parent.mkdir(parents=True, exist_ok=True) - args.output_json.write_text(rendered + '\n') - print(rendered) - return 0 - - -if __name__ == '__main__': - raise SystemExit(main()) diff --git a/examples/test_rep.py b/examples/test_rep.py index d25fb55..f82575d 100644 --- a/examples/test_rep.py +++ b/examples/test_rep.py @@ -11,7 +11,7 @@ from masque.file import gdsii, dxf, oasis -def main() -> None: +def main(): lib = Library() cell_name = 'ellip_grating' diff --git a/examples/tutorial/README.md b/examples/tutorial/README.md index 749915b..7210a93 100644 --- a/examples/tutorial/README.md +++ b/examples/tutorial/README.md @@ -1,12 +1,6 @@ masque Tutorial =============== -These examples are meant to be read roughly in order. - -- Start with `basic_shapes.py` for the core `Pattern` / GDS concepts. -- Then read `devices.py` and `library.py` for hierarchical composition and libraries. -- Read the `pather*` tutorials separately when you want routing helpers. 
- Contents -------- @@ -14,30 +8,24 @@ Contents * Draw basic geometry * Export to GDS - [devices](devices.py) - * Build hierarchical photonic-crystal example devices * Reference other patterns * Add ports to a pattern - * Use `Pather` to snap ports together into a circuit + * Snap ports together to build a circuit * Check for dangling references - [library](library.py) - * Continue from `devices.py` using a lazy library * Create a `LazyLibrary`, which loads / generates patterns only when they are first used * Explore alternate ways of specifying a pattern for `.plug()` and `.place()` * Design a pattern which is meant to plug into an existing pattern (via `.interface()`) - [pather](pather.py) * Use `Pather` to route individual wires and wire bundles - * Use `AutoTool` to generate paths - * Use `AutoTool` to automatically transition between path types -- [renderpather](renderpather.py) - * Use `Pather(auto_render=False)` and `PathTool` to build a layout similar to the one in [pather](pather.py), + * Use `BasicTool` to generate paths + * Use `BasicTool` to automatically transition between path types +- [renderpather](renderpather.py) + * Use `RenderPather` and `PathTool` to build a layout similar to the one in [pather](pather.py), but using `Path` shapes instead of `Polygon`s. -- [port_pather](port_pather.py) - * Use `PortPather` and the `.at()` syntax for more concise routing - * Advanced port manipulation and connections -Additionally, [pcgen](pcgen.py) is a utility module used by `devices.py` for generating -photonic-crystal lattices; it is support code rather than a step-by-step tutorial. +Additionally, [pcgen](pcgen.py) is a utility module for generating photonic crystal lattices. Running @@ -49,6 +37,3 @@ cd examples/tutorial python3 basic_shapes.py klayout -e basic_shapes.gds ``` - -Some tutorials depend on outputs from earlier ones. In particular, `library.py` -expects `circuit.gds`, which is generated by `devices.py`. 
diff --git a/examples/tutorial/basic_shapes.py b/examples/tutorial/basic_shapes.py index d8f7e1e..87baaf0 100644 --- a/examples/tutorial/basic_shapes.py +++ b/examples/tutorial/basic_shapes.py @@ -1,9 +1,12 @@ +from collections.abc import Sequence import numpy from numpy import pi -from masque import layer_t, Pattern, Circle, Arc, Ref -from masque.repetition import Grid +from masque import ( + layer_t, Pattern, Label, Port, + Circle, Arc, Polygon, + ) import masque.file.gdsii @@ -36,45 +39,6 @@ def hole( return pat -def hole_array( - radius: float, - num_x: int = 5, - num_y: int = 3, - pitch: float = 2000, - layer: layer_t = (1, 0), - ) -> Pattern: - """ - Generate an array of circular holes using `Repetition`. - - Args: - radius: Circle radius. - num_x, num_y: Number of holes in x and y. - pitch: Center-to-center spacing. - layer: Layer to draw the holes on. - - Returns: - Pattern containing a grid of holes. - """ - # First, make a pattern for a single hole - hpat = hole(radius, layer) - - # Now, create a pattern that references it multiple times using a Grid - pat = Pattern() - pat.refs['hole'] = [ - Ref( - offset=(0, 0), - repetition=Grid(a_vector=(pitch, 0), a_count=num_x, - b_vector=(0, pitch), b_count=num_y) - )] - - # We can also add transformed references (rotation, mirroring, etc.) 
- pat.refs['hole'].append( - Ref(offset=(0, -pitch), rotation=pi / 4, mirrored=True) - ) - - return pat, hpat - - def triangle( radius: float, layer: layer_t = (1, 0), @@ -96,7 +60,9 @@ def triangle( ]) * radius pat = Pattern() - pat.polygon(layer, vertices=vertices) + pat.shapes[layer].extend([ + Polygon(offset=(0, 0), vertices=vertices), + ]) return pat @@ -145,13 +111,9 @@ def main() -> None: lib['smile'] = smile(1000) lib['triangle'] = triangle(1000) - # Use a Grid to make many holes efficiently - lib['grid'], lib['hole'] = hole_array(1000) - masque.file.gdsii.writefile(lib, 'basic_shapes.gds', **GDS_OPTS) lib['triangle'].visualize() - lib['grid'].visualize(lib) if __name__ == '__main__': diff --git a/examples/tutorial/devices.py b/examples/tutorial/devices.py index 955e786..6b9cfa2 100644 --- a/examples/tutorial/devices.py +++ b/examples/tutorial/devices.py @@ -1,19 +1,11 @@ -""" -Tutorial: building hierarchical devices with `Pattern`, `Port`, and `Pather`. - -This file uses photonic-crystal components as the concrete example, so some of -the geometry-generation code is domain-specific. The tutorial value is in the -Masque patterns around it: creating reusable cells, annotating ports, composing -hierarchy with references, and snapping ports together to build a larger circuit. -""" from collections.abc import Sequence, Mapping import numpy from numpy import pi from masque import ( - layer_t, Pattern, Ref, Pather, Port, Polygon, - Library, + layer_t, Pattern, Ref, Label, Builder, Port, Polygon, + Library, ILibraryView, ) from masque.utils import ports2data from masque.file.gdsii import writefile, check_valid_names @@ -72,9 +64,9 @@ def perturbed_l3( Provided sequence should have same length as `shifts_a`. xy_size: `(x, y)` number of mirror periods in each direction; total size is `2 * n + 1` holes in each direction. Default (10, 10). 
- perturbed_radius: radius of holes perturbed to form an upwards-directed beam + perturbed_radius: radius of holes perturbed to form an upwards-directed beam (multiplicative factor). Default 1.1. - trench_width: Width of the undercut trenches. Default 1200. + trench_width: Width of the undercut trenches. Default 1200. Returns: `Pattern` object representing the L3 design. """ @@ -87,15 +79,14 @@ shifts_a=shifts_a, shifts_r=shifts_r) - # Build the cavity by instancing the supplied `hole` pattern many times. - # Using references keeps the pattern compact even though it contains many holes. + # Build L3 cavity, using references to the provided hole pattern pat = Pattern() pat.refs[hole] += [ Ref(scale=r, offset=(lattice_constant * x, lattice_constant * y)) for x, y, r in xyr] - # Add rectangular undercut aids based on the referenced hole extents. + # Add rectangular undercut aids min_xy, max_xy = pat.get_bounds_nonempty(hole_lib) trench_dx = max_xy[0] - min_xy[0] @@ -104,7 +95,7 @@ Polygon.rect(ymax=min_xy[1], xmin=min_xy[0], lx=trench_dx, ly=trench_width), ] - # Define the interface in Masque terms: two ports at the left/right extents. + # Ports are at outer extents of the device (with y=0) extent = lattice_constant * xy_size[0] pat.ports = dict( input=Port((-extent, 0), rotation=0, ptype='pcwg'), @@ -134,17 +125,17 @@ Returns: `Pattern` object representing the waveguide. """ - # Generate the normalized lattice locations for the line defect. + # Generate hole locations xy = pcgen.waveguide(length=length, num_mirror=mirror_periods) - # Build the pattern by placing repeated references to the same hole cell. + # Build the pattern pat = Pattern() pat.refs[hole] += [ Ref(offset=(lattice_constant * x, lattice_constant * y)) for x, y in xy] - # Publish the device interface as two ports at the outer edges. 
+ # Ports are at outer edges, with y=0 extent = lattice_constant * length / 2 pat.ports = dict( left=Port((-extent, 0), rotation=0, ptype='pcwg'), @@ -173,17 +164,17 @@ `Pattern` object representing the waveguide bend. Ports are named 'left' (input) and 'right' (output). """ - # Generate the normalized lattice locations for the bend. + # Generate hole locations xy = pcgen.wgbend(num_mirror=mirror_periods) - # Build the pattern by instancing the shared hole cell. - pat = Pattern() + # Build the pattern + pat = Pattern() pat.refs[hole] += [ Ref(offset=(lattice_constant * x, lattice_constant * y)) for x, y in xy] - # Publish the bend interface as two ports. + # Figure out port locations. extent = lattice_constant * mirror_periods pat.ports = dict( left=Port((-extent, 0), rotation=0, ptype='pcwg'), @@ -212,17 +203,17 @@ `Pattern` object representing the y-splitter. Ports are named 'in', 'top', and 'bottom'. """ - # Generate the normalized lattice locations for the splitter. + # Generate hole locations xy = pcgen.y_splitter(num_mirror=mirror_periods) - # Build the pattern by instancing the shared hole cell. + # Build pattern pat = Pattern() pat.refs[hole] += [ Ref(offset=(lattice_constant * x, lattice_constant * y)) for x, y in xy] - # Publish the splitter interface as one input and two outputs. + # Determine port locations extent = lattice_constant * mirror_periods pat.ports = { 'in': Port((-extent, 0), rotation=0, ptype='pcwg'), @@ -236,13 +227,13 @@ def main(interactive: bool = True) -> None: - # First make a couple of reusable primitive cells. + # Generate some basic hole patterns shape_lib = { 'smile': basic_shapes.smile(RADIUS), 'hole': basic_shapes.hole(RADIUS), } - # Then build a small library of higher-level devices from those primitives. 
+ # Build some devices a = LATTICE_CONSTANT devices = {} @@ -254,23 +245,22 @@ def main(interactive: bool = True) -> None: devices['ysplit'] = y_splitter(lattice_constant=a, hole='hole', mirror_periods=5) devices['l3cav'] = perturbed_l3(lattice_constant=a, hole='smile', hole_lib=shape_lib, xy_size=(4, 10)) # uses smile :) - # Turn the device mapping into a `Library`. - # That gives us convenience helpers for hierarchy inspection and abstract views. + # Turn our dict of devices into a Library. + # This provides some convenience functions in the future! lib = Library(devices) # # Build a circuit # - # Create a `Pather`, and register the resulting top cell as "my_circuit". - circ = Pather(library=lib, name='my_circuit') + # Create a `Builder`, and add the circuit to our library as "my_circuit". + circ = Builder(library=lib, name='my_circuit') - # Start by placing a waveguide and renaming its ports to match the circuit-level - # names we want to use while assembling the design. + # Start by placing a waveguide. Call its ports "in" and "signal". circ.place('wg10', offset=(0, 0), port_map={'left': 'in', 'right': 'signal'}) - # Extend the signal path by attaching another waveguide. - # Because `wg10` only has one unattached port left after the plug, Masque can - # infer that it should keep the name `signal`. + # Extend the signal path by attaching the "left" port of a waveguide. + # Since there is only one other port ("right") on the waveguide we + # are attaching (wg10), it automatically inherits the name "signal". circ.plug('wg10', {'signal': 'left'}) # We could have done the following instead: @@ -278,8 +268,8 @@ def main(interactive: bool = True) -> None: # lib['my_circuit'] = circ_pat # circ_pat.place(lib.abstract('wg10'), ...) # circ_pat.plug(lib.abstract('wg10'), ...) - # but `Pather` removes some repeated `lib.abstract(...)` boilerplate and keeps - # the assembly code focused on port-level intent. 
+ # but `Builder` lets us omit some of the repetition of `lib.abstract(...)`, and uses similar + # syntax to `Pather` and `RenderPather`, which add wire/waveguide routing functionality. # Attach a y-splitter to the signal path. # Since the y-splitter has 3 ports total, we can't auto-inherit the @@ -291,10 +281,13 @@ circ.plug('wg05', {'signal1': 'left'}) circ.plug('wg05', {'signal2': 'left'}) - # Add a bend to both branches. - # Our bend primitive is defined with a specific orientation, so choosing which - # port to plug determines whether the path turns clockwise or counterclockwise. - # We could also mirror one instance instead of using opposite ports. + # Add a bend to both ports. + # Our bend's ports "left" and "right" refer to the original counterclockwise + # orientation. We want the bends to turn in opposite directions, so we attach + # the "right" port to "signal1" to bend clockwise, and the "left" port + # to "signal2" to bend counterclockwise. + # We could also use `mirrored=(True, False)` to mirror one of the devices + # and then use the same device port on both paths. circ.plug('bend0', {'signal1': 'right'}) circ.plug('bend0', {'signal2': 'left'}) @@ -303,26 +296,29 @@ circ.plug('l3cav', {'signal1': 'input'}) circ.plug('wg10', {'signal1': 'left'}) - # `signal2` gets a single waveguide of equivalent overall length. + # "signal2" just gets a single waveguide of equivalent length circ.plug('wg28', {'signal2': 'left'}) - # Now bend both branches back towards each other. + # Now we bend both waveguides back towards each other circ.plug('bend0', {'signal1': 'right'}) circ.plug('bend0', {'signal2': 'left'}) circ.plug('wg05', {'signal1': 'left'}) circ.plug('wg05', {'signal2': 'left'}) - # To join the branches, attach a second y-junction. - # This succeeds only if both chosen ports agree on the same translation and - # rotation for the inserted device; otherwise Masque raises an exception. 
+ # To join the waveguides, we attach a second y-junction. + # We plug "signal1" into the "bot" port, and "signal2" into the "top" port. + # The remaining port gets named "signal_out". + # This operation would raise an exception if the ports did not line up + # correctly (i.e. they required different rotations or translations of the + # y-junction device). circ.plug('ysplit', {'signal1': 'bot', 'signal2': 'top'}, {'in': 'signal_out'}) # Finally, add some more waveguide to "signal_out". circ.plug('wg10', {'signal_out': 'left'}) - # Bake the top-level port metadata into labels so it survives GDS export. - # These labels appear on the circuit cell; individual child devices keep their - # own port labels in their own cells. + # We can also add text labels for our circuit's ports. + # They will appear at the uppermost hierarchy level, while the individual + # device ports will appear further down, in their respective cells. ports_to_data(circ.pattern) # Check if we forgot to include any patterns... ooops! @@ -334,12 +330,12 @@ lib.add(shape_lib) assert not lib.dangling_refs() - # We can visualize the design directly, though opening the written GDS is often easier. + # We can visualize the design. Usually it's easier to just view the GDS. if interactive: print('Visualizing... this step may be slow') circ.pattern.visualize(lib) - # Write out only the subtree reachable from our top cell. + # Write out to GDS, only keeping patterns referenced by our circuit (including itself) subtree = lib.subtree('my_circuit') # don't include wg90, which we don't use check_valid_names(subtree.keys()) writefile(subtree, 'circuit.gds', **GDS_OPTS) diff --git a/examples/tutorial/library.py b/examples/tutorial/library.py index 1b9a1da..eab8a12 100644 --- a/examples/tutorial/library.py +++ b/examples/tutorial/library.py @@ -1,28 +1,23 @@ -""" -Tutorial: using `LazyLibrary` and `Pather.interface()`. 
- -This example assumes you have already read `devices.py` and generated the -`circuit.gds` file it writes. The goal here is not the photonic-crystal geometry -itself, but rather how Masque lets you mix lazily loaded GDS content with -python-generated devices inside one library. -""" from typing import Any +from collections.abc import Sequence, Callable from pprint import pformat +import numpy +from numpy import pi -from masque import Pather, LazyLibrary +from masque import Pattern, Builder, LazyLibrary from masque.file.gdsii import writefile, load_libraryfile +import pcgen import basic_shapes import devices -from devices import data_to_ports +from devices import ports_to_data, data_to_ports from basic_shapes import GDS_OPTS def main() -> None: - # A `LazyLibrary` delays work until a pattern is actually needed. - # That applies both to GDS cells we load from disk and to python callables - # that generate patterns on demand. + # Define a `LazyLibrary`, which provides lazy evaluation for generating + # patterns and lazy-loading of GDS contents. lib = LazyLibrary() # @@ -32,9 +27,9 @@ def main() -> None: # Scan circuit.gds and prepare to lazy-load its contents gds_lib, _properties = load_libraryfile('circuit.gds', postprocess=data_to_ports) - # Add those cells into our lazy library. - # Nothing is read yet; we are only registering how to fetch and postprocess - # each pattern when it is first requested. + # Add it into the device library by providing a way to read port info + # This maintains the lazy evaluation from above, so no patterns + # are actually read yet. lib.add(gds_lib) print('Patterns loaded from GDS into library:\n' + pformat(list(lib.keys()))) @@ -49,8 +44,8 @@ def main() -> None: hole = 'triangle', ) - # Triangle-based variants. These lambdas are only recipes for building the - # patterns; they do not execute until someone asks for the cell. + # Triangle-based variants. 
These are defined here, but they won't run until they're + # retrieved from the library. lib['tri_wg10'] = lambda: devices.waveguide(length=10, mirror_periods=5, **opts) lib['tri_wg05'] = lambda: devices.waveguide(length=5, mirror_periods=5, **opts) lib['tri_wg28'] = lambda: devices.waveguide(length=28, mirror_periods=5, **opts) @@ -62,22 +57,22 @@ def main() -> None: # Build a mixed waveguide with an L3 cavity in the middle # - # Start a new design by copying the ports from an existing library cell. - # This gives `circ2` the same external interface as `tri_l3cav`. - circ2 = Pather(library=lib, ports='tri_l3cav') + # Immediately start building from an instance of the L3 cavity + circ2 = Builder(library=lib, ports='tri_l3cav') - # First way to specify what we are plugging in: request an explicit abstract. - # This works with `Pattern` methods directly as well as with `Pather`. + # First way to get abstracts is `lib.abstract(name)` + # We can use this syntax directly with `Pattern.plug()` and `Pattern.place()` as well as through `Builder`. circ2.plug(lib.abstract('wg10'), {'input': 'right'}) - # Second way: use an `AbstractView`, which behaves like a mapping of names - # to abstracts. + # Second way to get abstracts is to use an AbstractView + # This also works directly with `Pattern.plug()` / `Pattern.place()`. abstracts = lib.abstract_view() circ2.plug(abstracts['wg10'], {'output': 'left'}) - # Third way: let `Pather` resolve a pattern name through its own library. - # This shorthand is convenient, but it is specific to helpers that already - # carry a library reference. + # Third way to specify an abstract works by automatically getting + # it from the library already within the Builder object. + # This wouldn't work if we only had a `Pattern` (not a `Builder`). + # Just pass the pattern name! 
circ2.plug('tri_wg10', {'input': 'right'})
circ2.plug('tri_wg10', {'output': 'left'})
@@ -86,15 +81,13 @@ def main() -> None:
#
- # Build a second device that is explicitly designed to mate with `circ2`.
+ # Build a device that could plug into our mixed_wg_cav and join the two ports
#
- # `Pather.interface()` makes a new pattern whose ports mirror an existing
- # design's external interface. That is useful when you want to design an
- # adapter, continuation, or mating structure.
- circ3 = Pather.interface(source=circ2)
+ # We'll be designing against an existing device's interface...
+ circ3 = Builder.interface(source=circ2)
- # Continue routing outward from those inherited ports.
+ # ... that lets us continue from where we left off.
circ3.plug('tri_bend0', {'input': 'right'})
circ3.plug('tri_bend0', {'input': 'left'}, mirrored=True) # mirror since no tri y-symmetry
circ3.plug('tri_bend0', {'input': 'right'})
diff --git a/examples/tutorial/pather.py b/examples/tutorial/pather.py
index 5cc5a61..101fbb5 100644
--- a/examples/tutorial/pather.py
+++ b/examples/tutorial/pather.py
@@ -1,9 +1,10 @@
"""
-Manual wire routing tutorial: Pather and AutoTool
+Manual wire routing tutorial: Pather and BasicTool
"""
+from collections.abc import Callable
from numpy import pi
-from masque import Pather, Library, Pattern, Port, layer_t
-from masque.builder.tools import AutoTool, Tool
+from masque import Pather, RenderPather, Library, Pattern, Port, layer_t, map_layers
+from masque.builder.tools import BasicTool, PathTool
from masque.file.gdsii import writefile
from basic_shapes import GDS_OPTS
@@ -106,29 +107,31 @@ def map_layer(layer: layer_t) -> layer_t:
'M2': (20, 0),
'V1': (30, 0),
}
- if isinstance(layer, str):
- return layer_mapping.get(layer, layer)
- return layer
+ return layer_mapping.get(layer, layer)
-def prepare_tools() -> tuple[Library, Tool, Tool]:
- """
- Create some basic library elements and tools for drawing M1 and M2
- """
+#
+# Now we can start building up our
library (collection of static cells) and pathing tools. +# +# If any of the operations below are confusing, you can cross-reference against the `RenderPather` +# tutorial, which handles some things more explicitly (e.g. via placement) and simplifies others +# (e.g. geometry definition). +# +def main() -> None: # Build some patterns (static cells) using the above functions and store them in a library library = Library() library['pad'] = make_pad() library['m1_bend'] = make_bend(layer='M1', ptype='m1wire', width=M1_WIDTH) library['m2_bend'] = make_bend(layer='M2', ptype='m2wire', width=M2_WIDTH) library['v1_via'] = make_via( - layer_top = 'M2', - layer_via = 'V1', - layer_bot = 'M1', - width_top = M2_WIDTH, - width_via = V1_WIDTH, - width_bot = M1_WIDTH, - ptype_bot = 'm1wire', - ptype_top = 'm2wire', + layer_top='M2', + layer_via='V1', + layer_bot='M1', + width_top=M2_WIDTH, + width_via=V1_WIDTH, + width_bot=M1_WIDTH, + ptype_bot='m1wire', + ptype_top='m2wire', ) # @@ -137,79 +140,53 @@ def prepare_tools() -> tuple[Library, Tool, Tool]: # M2_tool will route on M2, using wires with M2_WIDTH # Both tools are able to automatically transition from the other wire type (with a via) # - # Note that while we use AutoTool for this tutorial, you can define your own `Tool` + # Note that while we use BasicTool for this tutorial, you can define your own `Tool` # with arbitrary logic inside -- e.g. with single-use bends, complex transition rules, # transmission line geometry, or other features. 
# - M1_tool = AutoTool( - # First, we need a function which takes in a length and spits out an M1 wire - straights = [ - AutoTool.Straight( - ptype = 'm1wire', - fn = lambda length: make_straight_wire(layer='M1', ptype='m1wire', width=M1_WIDTH, length=length), - in_port_name = 'input', # When we get a pattern from make_straight_wire, use the port named 'input' as the input - out_port_name = 'output', # and use the port named 'output' as the output - ), - ], - bends = [ - AutoTool.Bend( - abstract = library.abstract('m1_bend'), # When we need a bend, we'll reference the pattern we generated earlier - in_port_name = 'input', - out_port_name = 'output', - clockwise = True, - ), - ], + M1_tool = BasicTool( + straight = ( + # First, we need a function which takes in a length and spits out an M1 wire + lambda length: make_straight_wire(layer='M1', ptype='m1wire', width=M1_WIDTH, length=length), + 'input', # When we get a pattern from make_straight_wire, use the port named 'input' as the input + 'output', # and use the port named 'output' as the output + ), + bend = ( + library.abstract('m1_bend'), # When we need a bend, we'll reference the pattern we generated earlier + 'input', # To orient it clockwise, use the port named 'input' as the input + 'output', # and 'output' as the output + ), transitions = { # We can automate transitions for different (normally incompatible) port types - ('m2wire', 'm1wire'): AutoTool.Transition( # For example, when we're attaching to a port with type 'm2wire' + 'm2wire': ( # For example, when we're attaching to a port with type 'm2wire' library.abstract('v1_via'), # we can place a V1 via 'top', # using the port named 'top' as the input (i.e. 
the M2 side of the via) 'bottom', # and using the port named 'bottom' as the output ), }, - sbends = [], default_out_ptype = 'm1wire', # Unless otherwise requested, we'll default to trying to stay on M1 ) - M2_tool = AutoTool( - straights = [ + M2_tool = BasicTool( + straight = ( # Again, we use make_straight_wire, but this time we set parameters for M2 - AutoTool.Straight( - ptype = 'm2wire', - fn = lambda length: make_straight_wire(layer='M2', ptype='m2wire', width=M2_WIDTH, length=length), - in_port_name = 'input', - out_port_name = 'output', - ), - ], - bends = [ - # and we use an M2 bend - AutoTool.Bend( - abstract = library.abstract('m2_bend'), - in_port_name = 'input', - out_port_name = 'output', - ), - ], + lambda length: make_straight_wire(layer='M2', ptype='m2wire', width=M2_WIDTH, length=length), + 'input', + 'output', + ), + bend = ( + library.abstract('m2_bend'), # and we use an M2 bend + 'input', + 'output', + ), transitions = { - ('m1wire', 'm2wire'): AutoTool.Transition( + 'm1wire': ( library.abstract('v1_via'), # We still use the same via, 'bottom', # but the input port is now 'bottom' 'top', # and the output port is now 'top' ), }, - sbends = [], default_out_ptype = 'm2wire', # We default to trying to stay on M2 ) - return library, M1_tool, M2_tool - - -# -# Now we can start building up our library (collection of static cells) and pathing tools. -# -# If any of the operations below are confusing, you can cross-reference against the deferred -# `Pather` tutorial, which handles some things more explicitly (e.g. via placement) and simplifies -# others (e.g. geometry definition). -# -def main() -> None: - library, M1_tool, M2_tool = prepare_tools() # # Create a new pather which writes to `library` and uses `M2_tool` as its default tool. 
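The `BasicTool` setup above boils down to one convention: a straight-segment factory plus named input/output ports, and a transition table keyed by the incoming port type. A standalone sketch of that convention (a hypothetical `MiniTool` class for illustration only, not masque's actual `BasicTool` API):

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class MiniTool:
    # Toy stand-in for a routing tool: it knows how to draw a straight
    # segment, and which adapter (e.g. a via) to insert when the incoming
    # port type doesn't match the type this tool produces.
    ptype: str                                  # port type this tool produces
    straight: Callable[[float], str]            # length -> description of the drawn wire
    transitions: dict[str, str] = field(default_factory=dict)   # incoming ptype -> via to insert

    def segment(self, length: float, in_ptype: str) -> list[str]:
        ops: list[str] = []
        if in_ptype != self.ptype:
            # port types don't match: prepend the transition, as described
            # for the `transitions` entries above
            ops.append(self.transitions[in_ptype])
        ops.append(self.straight(length))
        return ops

m1 = MiniTool(
    ptype='m1wire',
    straight=lambda length: f'M1 wire, length={length}',
    transitions={'m2wire': 'v1_via (top->bottom)'},
)

print(m1.segment(6_000, in_ptype='m1wire'))   # no transition needed
print(m1.segment(6_000, in_ptype='m2wire'))   # via inserted before the wire
```

The same shape-of-data explains why retooling a port only takes effect on the next drawn segment: the transition is emitted as part of the segment, not at retool time.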
@@ -226,25 +203,27 @@ def main() -> None:
# Path VCC forward (in this case south) and turn clockwise 90 degrees (ccw=False)
# The total distance forward (including the bend's forward component) must be 6um
- pather.cw('VCC', 6_000)
+ pather.path('VCC', ccw=False, length=6_000)
- # Now path VCC to x=0. This time, don't include any bend.
+ # Now path VCC to x=0. This time, don't include any bend (ccw=None).
# Note that if we tried y=0 here, we would get an error since the VCC port is facing in the x-direction.
- pather.straight('VCC', x=0)
+ pather.path_to('VCC', ccw=None, x=0)
# Path GND forward by 5um, turning clockwise 90 degrees.
- pather.cw('GND', 5_000)
+ # This time we use shorthand (bool(0) == False) and omit the parameter labels
+ # Note that although ccw=0 is equivalent to ccw=False, ccw=None is not!
+ pather.path('GND', 0, 5_000)
# This time, path GND until it matches the current x-coordinate of VCC. Don't place a bend.
- pather.straight('GND', x=pather['VCC'].offset[0])
+ pather.path_to('GND', None, x=pather['VCC'].offset[0])
# Now, start using M1_tool for GND.
- # Since we have defined an M2-to-M1 transition for Pather, we don't need to place one ourselves.
+ # Since we have defined an M2-to-M1 transition for BasicTool, we don't need to place one ourselves.
# If we wanted to place our via manually, we could add `pather.plug('m1_via', {'GND': 'top'})` here
# and achieve the same result without having to define any transitions in M1_tool.
# Note that even though we have changed the tool used for GND, the via doesn't get placed until
- # the next time we route GND (the `pather.ccw()` call below).
- pather.retool(M1_tool, keys='GND')
+ # the next time we draw a path on GND (the pather.mpath() statement below).
+ pather.retool(M1_tool, keys=['GND'])
# Bundle together GND and VCC, and path the bundle forward and counterclockwise.
# Pick the distance so that the leading/outermost wire (in this case GND) ends up at x=-10_000.
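The remark above that `ccw=0` behaves like `ccw=False` while `ccw=None` does not comes down to an identity check: `None` must be tested with `is` before truthiness is consulted, since `bool(0) == False` but `0 is not None`. A sketch of that convention (a hypothetical `bend_direction` helper, not masque code):

```python
def bend_direction(ccw) -> str:
    # Mimics the ccw convention used by path()/path_to()/mpath():
    # None means "no bend"; otherwise truthy = counterclockwise, falsy = clockwise.
    if ccw is None:            # identity check runs first, so None never falls
        return 'straight'      # through to the truthiness test below
    return 'ccw' if ccw else 'cw'

assert bend_direction(None) == 'straight'
assert bend_direction(True) == 'ccw'
assert bend_direction(False) == 'cw'
assert bend_direction(0) == 'cw'   # bool(0) == False, so ccw=0 acts like ccw=False
```

Testing `ccw is None` (rather than `not ccw`) is what makes the `0` shorthand safe to pass positionally.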
@@ -252,7 +231,7 @@
#
# Since we recently retooled GND, its path starts with a via down to M1 (included in the distance
# calculation), and its straight segment and bend will be drawn using M1 while VCC's are drawn with M2.
- pather.ccw(['GND', 'VCC'], xmax=-10_000, spacing=5_000)
+ pather.mpath(['GND', 'VCC'], ccw=True, xmax=-10_000, spacing=5_000)
# Now use M1_tool as the default tool for all ports/signals.
# Since VCC does not have an explicitly assigned tool, it will now transition down to M1.
@@ -262,37 +241,38 @@
# The total extension (travel distance along the forward direction) for the longest segment (in
# this case the segment being added to GND) should be exactly 50um.
# After turning, the wire pitch should be reduced only 1.2um.
- pather.ccw(['GND', 'VCC'], emax=50_000, spacing=1_200)
+ pather.mpath(['GND', 'VCC'], ccw=True, emax=50_000, spacing=1_200)
# Make a U-turn with the bundle and expand back out to 4.5um wire pitch.
- # Here, emin specifies the travel distance for the shortest segment. For the first call
- # that applies to VCC, and for the second call, that applies to GND; the relative lengths of the
+ # Here, emin specifies the travel distance for the shortest segment. For the first mpath() call
+ # that applies to VCC, and for the second call, that applies to GND; the relative lengths of the
# segments depend on their starting positions and their ordering within the bundle.
- pather.cw(['GND', 'VCC'], emin=1_000, spacing=1_200)
- pather.cw(['GND', 'VCC'], emin=2_000, spacing=4_500)
+ pather.mpath(['GND', 'VCC'], ccw=False, emin=1_000, spacing=1_200)
+ pather.mpath(['GND', 'VCC'], ccw=False, emin=2_000, spacing=4_500)
# Now, set the default tool back to M2_tool. Note that GND remains on M1 since it has been
- # explicitly assigned a tool.
+ # explicitly assigned a tool. We could `del pather.tools['GND']` to force it to use the default.
pather.retool(M2_tool)
# Now path both ports to x=-28_000.
- # With ccw=None, all ports stop at the same coordinate, and so specifying xmin= or xmax= is
+ # When ccw is not None, xmin constrains the trailing/innermost port to stop at the target x coordinate.
+ # However, with ccw=None, all ports stop at the same coordinate, and so specifying xmin= or xmax= is
# equivalent.
- pather.straight(['GND', 'VCC'], xmin=-28_000)
+ pather.mpath(['GND', 'VCC'], None, xmin=-28_000)
# Further extend VCC out to x=-50_000, and specify that we would like to get an output on M1.
# This results in a via at the end of the wire (instead of having one at the start like we got
# when using pather.retool()).
- pather.straight('VCC', x=-50_000, out_ptype='m1wire')
+ pather.path_to('VCC', None, -50_000, out_ptype='m1wire')
# Now extend GND out to x=-50_000, using M2 for a portion of the path.
# We can use `pather.toolctx()` to temporarily retool, instead of calling `retool()` twice.
- with pather.toolctx(M2_tool, keys='GND'):
- pather.straight('GND', x=-40_000)
- pather.straight('GND', x=-50_000)
+ with pather.toolctx(M2_tool, keys=['GND']):
+ pather.path_to('GND', None, -40_000)
+ pather.path_to('GND', None, -50_000)
# Save the pather's pattern into our library
- library['Pather_and_AutoTool'] = pather.pattern
+ library['Pather_and_BasicTool'] = pather.pattern
# Convert from text-based layers to numeric layers for GDS, and output the file
library.map_layers(map_layer)
diff --git a/examples/tutorial/pcgen.py b/examples/tutorial/pcgen.py
index 5c5c31b..023079c 100644
--- a/examples/tutorial/pcgen.py
+++ b/examples/tutorial/pcgen.py
@@ -2,7 +2,7 @@
Routines for creating normalized 2D lattices and common photonic crystal cavity designs.
""" -from collections.abc import Sequence +from collection.abc import Sequence import numpy from numpy.typing import ArrayLike, NDArray @@ -50,7 +50,7 @@ def triangular_lattice( elif origin == 'corner': pass else: - raise ValueError(f'Invalid value for `origin`: {origin}') + raise Exception(f'Invalid value for `origin`: {origin}') return xy[xy[:, 0].argsort(), :] @@ -197,12 +197,12 @@ def ln_defect( `[[x0, y0], [x1, y1], ...]` for all the holes """ if defect_length % 2 != 1: - raise ValueError('defect_length must be odd!') - pp = triangular_lattice([2 * dd + 1 for dd in mirror_dims]) + raise Exception('defect_length must be odd!') + p = triangular_lattice([2 * d + 1 for d in mirror_dims]) half_length = numpy.floor(defect_length / 2) hole_nums = numpy.arange(-half_length, half_length + 1) - holes_to_keep = numpy.isin(pp[:, 0], hole_nums, invert=True) - return pp[numpy.logical_or(holes_to_keep, pp[:, 1] != 0), :] + holes_to_keep = numpy.in1d(p[:, 0], hole_nums, invert=True) + return p[numpy.logical_or(holes_to_keep, p[:, 1] != 0), ] def ln_shift_defect( @@ -248,7 +248,7 @@ def ln_shift_defect( for sign in (-1, 1): x_val = sign * (x_removed + ind + 1) which = numpy.logical_and(xyr[:, 0] == x_val, xyr[:, 1] == 0) - xyr[which, :] = (x_val + numpy.sign(x_val) * shifts_a[ind], 0, shifts_r[ind]) + xyr[which, ] = (x_val + numpy.sign(x_val) * shifts_a[ind], 0, shifts_r[ind]) return xyr @@ -309,7 +309,7 @@ def l3_shift_perturbed_defect( # which holes should be perturbed? 
(xs[[3, 7]], ys[1]) and (xs[[2, 6]], ys[2])
perturbed_holes = ((xs[a], ys[b]) for a, b in ((3, 1), (7, 1), (2, 2), (6, 2)))
- for xy in perturbed_holes:
- which = (numpy.fabs(xyr[:, :2]) == xy).all(axis=1)
- xyr[which, 2] = perturbed_radius
+ perturbed_list = [tuple(xy) for xy in perturbed_holes] # materialize the generator for reuse
+ which = numpy.array([any((numpy.fabs(r[:2]) == xy).all() for xy in perturbed_list) for r in xyr])
+ xyr[which, 2] = perturbed_radius
return xyr
diff --git a/examples/tutorial/port_pather.py b/examples/tutorial/port_pather.py
deleted file mode 100644
index ab942d7..0000000
--- a/examples/tutorial/port_pather.py
+++ /dev/null
@@ -1,169 +0,0 @@
-"""
-PortPather tutorial: Using .at() syntax
-"""
-from masque import Pather, Pattern, Port, R90
-from masque.file.gdsii import writefile
-
-from basic_shapes import GDS_OPTS
-from pather import map_layer, prepare_tools
-
-
-def main() -> None:
- # Reuse the same patterns (pads, bends, vias) and tools as in pather.py
- library, M1_tool, M2_tool = prepare_tools()
-
- # Create a deferred Pather and place some initial pads (same as Pather tutorial)
- rpather = Pather(library, tools=M2_tool, auto_render=False)
-
- rpather.place('pad', offset=(18_000, 30_000), port_map={'wire_port': 'VCC'})
- rpather.place('pad', offset=(18_000, 60_000), port_map={'wire_port': 'GND'})
- rpather.pattern.label(layer='M2', string='VCC', offset=(18e3, 30e3))
- rpather.pattern.label(layer='M2', string='GND', offset=(18e3, 60e3))
-
- #
- # Routing with .at() chaining
- #
- # The .at(port_name) method returns a PortPather object which wraps the Pather
- # and remembers the selected port(s). This allows method chaining.

- # Route VCC: 6um South, then West to x=0.
- # (Note: since the port points North into the pad, trace() moves South by default)
- (rpather.at('VCC')
- .trace(False, length=6_000) # Move South, turn West (Clockwise)
- .trace_to(None, x=0) # Continue West to x=0
- )
-
- # Route GND: 5um South, then West to match VCC's x-coordinate.
- rpather.at('GND').trace(False, length=5_000).trace_to(None, x=rpather['VCC'].x) - - - # - # Tool management and manual plugging - # - # We can use .retool() to change the tool for specific ports. - # We can also use .plug() directly on a PortPather. - - # Manually add a via to GND and switch to M1_tool for subsequent segments - (rpather.at('GND') - .plug('v1_via', 'top') - .retool(M1_tool) # this only retools the 'GND' port - ) - - # We can also pass multiple ports to .at(), and then route them together. - # Here we bundle them, turn South, and retool both to M1 (VCC gets an auto-via). - (rpather.at(['GND', 'VCC']) - .trace(True, xmax=-10_000, spacing=5_000) # Move West to -10k, turn South - .retool(M1_tool) # Retools both GND and VCC - .trace(True, emax=50_000, spacing=1_200) # Turn East, moves 50um extension - .trace(False, emin=1_000, spacing=1_200) # U-turn back South - .trace(False, emin=2_000, spacing=4_500) # U-turn back West - ) - - # Retool VCC back to M2 and move both to x=-28k - rpather.at('VCC').retool(M2_tool) - rpather.at(['GND', 'VCC']).trace(None, xmin=-28_000) - - # Final segments to -50k - rpather.at('VCC').trace_to(None, x=-50_000, out_ptype='m1wire') - with rpather.at('GND').toolctx(M2_tool): - rpather.at('GND').trace_to(None, x=-40_000) - rpather.at('GND').trace_to(None, x=-50_000) - - - # - # Branching with mark and fork - # - # .mark(new_name) creates a port copy and keeps the original selected. - # .fork(new_name) creates a port copy and selects the new one. 
- - # Create a tap on GND - (rpather.at('GND') - .trace(None, length=5_000) # Move GND further West - .mark('GND_TAP') # Mark this location for a later branch - .jog(offset=-10_000, length=10_000) # Continue GND with an S-bend - ) - - # Branch VCC and follow the new branch - (rpather.at('VCC') - .trace(None, length=5_000) - .fork('VCC_BRANCH') # We are now manipulating 'VCC_BRANCH' - .trace(True, length=5_000) # VCC_BRANCH turns South - ) - # The original 'VCC' port remains at x=-55k, y=VCC.y - - - # - # Port set management: add, drop, rename, delete - # - - # Route the GND_TAP we saved earlier. - (rpather.at('GND_TAP') - .retool(M1_tool) - .trace(True, length=10_000) # Turn South - .rename('GND_FEED') # Give it a more descriptive name - .retool(M1_tool) # Re-apply tool to the new name - ) - - # We can manage the active set of ports in a PortPather - pp = rpather.at(['VCC_BRANCH', 'GND_FEED']) - pp.select('GND') # Now tracking 3 ports - pp.deselect('VCC_BRANCH') # Now tracking 2 ports: GND_FEED, GND - pp.trace(None, each=5_000) # Move both 5um forward (length > transition size) - - # We can also delete ports from the pather entirely - rpather.at('VCC').delete() # VCC is gone (we have VCC_BRANCH instead) - - - # - # Advanced Connections: trace_into - # - # trace_into routes FROM the selected port TO a target port. - - # Create a destination component - dest_ports = { - 'in_A': Port((0, 0), rotation=R90, ptype='m2wire'), - 'in_B': Port((5_000, 0), rotation=R90, ptype='m2wire') - } - library['dest'] = Pattern(ports=dest_ports) - # Place dest so that its ports are to the West and South of our current wires. - # Rotating by pi/2 makes the ports face West (pointing East). - rpather.place('dest', offset=(-100_000, -100_000), rotation=R90, port_map={'in_A': 'DEST_A', 'in_B': 'DEST_B'}) - - # Connect GND_FEED to DEST_A - # Since GND_FEED is moving South and DEST_A faces West, a single bend will suffice. 
- rpather.at('GND_FEED').trace_into('DEST_A')
-
- # Connect VCC_BRANCH to DEST_B
- rpather.at('VCC_BRANCH').trace_into('DEST_B')
-
-
- #
- # Direct Port Transformations and Metadata
- #
- (rpather.at('GND')
- .set_ptype('m1wire') # Change metadata
- .translate((1000, 0)) # Shift the port 1um East
- .rotate(R90 / 2) # Rotate it 45 degrees
- .set_rotation(R90) # Force it to face West
- )
-
- # Demonstrate .plugged() to acknowledge a manual connection
- # (Normally used when you place components so their ports perfectly overlap)
- rpather.add_port_pair(offset=(0, 0), names=('TMP1', 'TMP2'))
- rpather.at('TMP1').plugged('TMP2') # Removes both ports
-
-
- #
- # Rendering and Saving
- #
- # Since we deferred auto-rendering, we must call .render() to generate the geometry.
- rpather.render()
-
- library['PortPather_Tutorial'] = rpather.pattern
- library.map_layers(map_layer)
- writefile(library, 'port_pather.gds', **GDS_OPTS)
- print("Tutorial complete. Output written to port_pather.gds")
-
-
-if __name__ == '__main__':
- main()
diff --git a/examples/tutorial/renderpather.py b/examples/tutorial/renderpather.py
index 4b43b19..cb002f3 100644
--- a/examples/tutorial/renderpather.py
+++ b/examples/tutorial/renderpather.py
@@ -1,7 +1,8 @@
"""
-Manual wire routing tutorial: deferred Pather and PathTool
+Manual wire routing tutorial: RenderPather and PathTool
"""
-from masque import Pather, Library
+from collections.abc import Callable
+from masque import RenderPather, Library, Pattern, Port, layer_t, map_layers
from masque.builder.tools import PathTool
from masque.file.gdsii import writefile
@@ -11,9 +12,9 @@ from pather import M1_WIDTH, V1_WIDTH, M2_WIDTH, map_layer, make_pad, make_via
def main() -> None:
#
- # To illustrate deferred routing with `Pather`, we use `PathTool` instead
- # of `AutoTool`. `PathTool` lacks some sophistication (e.g.
no automatic transitions)
- # but when used with `Pather(auto_render=False)`, it can consolidate multiple routing steps into
+ # To illustrate the advantages of using `RenderPather`, we use `PathTool` instead
+ # of `BasicTool`. `PathTool` lacks some sophistication (e.g. no automatic transitions)
+ # but when used with `RenderPather`, it can consolidate multiple routing steps into
# a single `Path` shape.
#
# We'll try to nearly replicate the layout from the `Pather` tutorial; see `pather.py`
@@ -24,68 +25,66 @@ def main() -> None:
library = Library()
library['pad'] = make_pad()
library['v1_via'] = make_via(
- layer_top = 'M2',
- layer_via = 'V1',
- layer_bot = 'M1',
- width_top = M2_WIDTH,
- width_via = V1_WIDTH,
- width_bot = M1_WIDTH,
- ptype_bot = 'm1wire',
- ptype_top = 'm2wire',
+ layer_top='M2',
+ layer_via='V1',
+ layer_bot='M1',
+ width_top=M2_WIDTH,
+ width_via=V1_WIDTH,
+ width_bot=M1_WIDTH,
+ ptype_bot='m1wire',
+ ptype_top='m2wire',
)
- # `PathTool` is more limited than `AutoTool`. It only generates one type of shape
+ # `PathTool` is more limited than `BasicTool`. It only generates one type of shape
# (`Path`), so it only needs to know what layer to draw on, what width to draw with,
# and what port type to present.
M1_ptool = PathTool(layer='M1', width=M1_WIDTH, ptype='m1wire')
M2_ptool = PathTool(layer='M2', width=M2_WIDTH, ptype='m2wire')
- rpather = Pather(tools=M2_ptool, library=library, auto_render=False)
+ rpather = RenderPather(tools=M2_ptool, library=library)
- # As in the pather tutorial, we make some pads and labels...
+ # As in the pather tutorial, we make some pads and labels...
rpather.place('pad', offset=(18_000, 30_000), port_map={'wire_port': 'VCC'})
rpather.place('pad', offset=(18_000, 60_000), port_map={'wire_port': 'GND'})
rpather.pattern.label(layer='M2', string='VCC', offset=(18e3, 30e3))
rpather.pattern.label(layer='M2', string='GND', offset=(18e3, 60e3))
# ...and start routing the signals.
- rpather.cw('VCC', 6_000)
- rpather.straight('VCC', x=0)
- rpather.cw('GND', 5_000)
- rpather.straight('GND', x=rpather.pattern['VCC'].x)
+ rpather.path('VCC', ccw=False, length=6_000)
+ rpather.path_to('VCC', ccw=None, x=0)
+ rpather.path('GND', 0, 5_000)
+ rpather.path_to('GND', None, x=rpather['VCC'].offset[0])
# `PathTool` doesn't know how to transition between metal layers, so we have to
# `plug` the via into the GND wire ourselves.
rpather.plug('v1_via', {'GND': 'top'})
- rpather.retool(M1_ptool, keys='GND')
- rpather.ccw(['GND', 'VCC'], xmax=-10_000, spacing=5_000)
+ rpather.retool(M1_ptool, keys=['GND'])
+ rpather.mpath(['GND', 'VCC'], ccw=True, xmax=-10_000, spacing=5_000)
# Same thing on the VCC wire when it goes down to M1.
rpather.plug('v1_via', {'VCC': 'top'})
rpather.retool(M1_ptool)
- rpather.ccw(['GND', 'VCC'], emax=50_000, spacing=1_200)
- rpather.cw(['GND', 'VCC'], emin=1_000, spacing=1_200)
- rpather.cw(['GND', 'VCC'], emin=2_000, spacing=4_500)
+ rpather.mpath(['GND', 'VCC'], ccw=True, emax=50_000, spacing=1_200)
+ rpather.mpath(['GND', 'VCC'], ccw=False, emin=1_000, spacing=1_200)
+ rpather.mpath(['GND', 'VCC'], ccw=False, emin=2_000, spacing=4_500)
# And again when VCC goes back up to M2.
rpather.plug('v1_via', {'VCC': 'bottom'})
rpather.retool(M2_ptool)
- rpather.straight(['GND', 'VCC'], xmin=-28_000)
+ rpather.mpath(['GND', 'VCC'], None, xmin=-28_000)
# Finally, since PathTool has no conception of transitions, we can't
# just ask it to transition to an 'm1wire' port at the end of the final VCC segment.
# Instead, we have to calculate the via size ourselves, and adjust the final position
# to account for it.
- v1pat = library['v1_via'] - via_size = abs(v1pat.ports['top'].x - v1pat.ports['bottom'].x) - - # alternatively, via_size = v1pat.ports['top'].measure_travel(v1pat.ports['bottom'])[0][0] - # would take into account the port orientations if we didn't already know they're along x - rpather.straight('VCC', x=-50_000 + via_size) + via_size = abs( + library['v1_via'].ports['top'].offset[0] + - library['v1_via'].ports['bottom'].offset[0] + ) + rpather.path_to('VCC', None, -50_000 + via_size) rpather.plug('v1_via', {'VCC': 'top'}) - # Render the path we defined rpather.render() - library['Deferred_Pather_and_PathTool'] = rpather.pattern + library['RenderPather_and_PathTool'] = rpather.pattern # Convert from text-based layers to numeric layers for GDS, and output the file diff --git a/masque/__init__.py b/masque/__init__.py index b6cc5e9..4ad7e69 100644 --- a/masque/__init__.py +++ b/masque/__init__.py @@ -42,7 +42,6 @@ from .error import ( from .shapes import ( Shape as Shape, Polygon as Polygon, - RectCollection as RectCollection, Path as Path, Circle as Circle, Arc as Arc, @@ -56,7 +55,6 @@ from .pattern import ( map_targets as map_targets, chain_elements as chain_elements, ) -from .utils.boolean import boolean as boolean from .library import ( ILibraryView as ILibraryView, @@ -74,8 +72,10 @@ from .ports import ( ) from .abstract import Abstract as Abstract from .builder import ( + Builder as Builder, Tool as Tool, Pather as Pather, + RenderPather as RenderPather, RenderStep as RenderStep, SimpleTool as SimpleTool, AutoTool as AutoTool, diff --git a/masque/abstract.py b/masque/abstract.py index d23d7c7..248c8a5 100644 --- a/masque/abstract.py +++ b/masque/abstract.py @@ -8,20 +8,22 @@ from numpy.typing import ArrayLike from .ref import Ref from .ports import PortList, Port from .utils import rotation_matrix_2d -from .traits import Mirrorable + +#if TYPE_CHECKING: +# from .builder import Builder, Tool +# from .library import ILibrary logger = logging.getLogger(__name__) 
-class Abstract(PortList, Mirrorable):
+class Abstract(PortList):
"""
An `Abstract` is a container for a name and associated ports.
When snapping a sub-component to an existing pattern, only the name
(not contained in a `Pattern` object) and port info is needed, and not
the geometry itself.
"""
- # Alternate design option: do we want to store a Ref instead of just a name? then we can translate/rotate/mirror...
__slots__ = ('name', '_ports')
name: str
@@ -46,6 +48,8 @@ class Abstract(PortList):
self.name = name
self.ports = copy.deepcopy(ports)
+ # TODO do we want to store a Ref instead of just a name? then we can translate/rotate/mirror...
+
def __repr__(self) -> str:
s = f' Self:
"""
- Rotate the Abstract around a pivot point.
+ Rotate the Abstract around the given location.
Args:
pivot: (x, y) location to rotate around
@@ -128,18 +132,50 @@ class Abstract(PortList, Mirrorable):
port.rotate(rotation)
return self
- def mirror(self, axis: int = 0) -> Self:
+ def mirror_port_offsets(self, across_axis: int = 0) -> Self:
"""
- Mirror the Abstract across an axis through its origin.
+ Mirror the offsets of all ports across an axis
Args:
- axis: Axis to mirror across (0: x-axis, 1: y-axis).
+ across_axis: Axis to mirror across
+ (0: mirror across x axis, 1: mirror across y axis)
Returns:
self
"""
for port in self.ports.values():
- port.flip_across(axis=axis)
+ port.offset[across_axis - 1] *= -1
+ return self
+
+ def mirror_ports(self, across_axis: int = 0) -> Self:
+ """
+ Mirror each port's rotation across an axis, relative to its
+ offset
+
+ Args:
+ across_axis: Axis to mirror across
+ (0: mirror across x axis, 1: mirror across y axis)
+
+ Returns:
+ self
+ """
+ for port in self.ports.values():
+ port.mirror(across_axis)
+ return self
+
+ def mirror(self, across_axis: int = 0) -> Self:
+ """
+ Mirror the Abstract across an axis
+
+ Args:
+ across_axis: Axis to mirror across
+ (0: mirror across x axis, 1: mirror across y axis)
+
+ Returns:
+ self
+ """
+ self.mirror_ports(across_axis)
+ self.mirror_port_offsets(across_axis)
return self
def apply_ref_transform(self, ref: Ref) -> Self:
@@ -157,8 +193,6 @@ class Abstract(PortList, Mirrorable):
self.mirror()
self.rotate_ports(ref.rotation)
self.rotate_port_offsets(ref.rotation)
- if ref.scale != 1:
- self.scale_by(ref.scale)
self.translate_ports(ref.offset)
return self
@@ -176,8 +210,6 @@
# TODO test undo_ref_transform
"""
self.translate_ports(-ref.offset)
- if ref.scale != 1:
- self.scale_by(1 / ref.scale)
self.rotate_port_offsets(-ref.rotation)
self.rotate_ports(-ref.rotation)
if ref.mirrored:
diff --git a/masque/builder/__init__.py b/masque/builder/__init__.py
index a8f4cc0..2fd00a4 100644
--- a/masque/builder/__init__.py
+++ b/masque/builder/__init__.py
@@ -1,7 +1,7 @@
-from .pather import (
- Pather as Pather,
- PortPather as PortPather,
-)
+from .builder import Builder as Builder
+from .pather import Pather as Pather
+from .renderpather import RenderPather as RenderPather
+from .pather_mixin import PortPather as PortPather
from .utils import ell as ell
from .tools import (
Tool as Tool,
@@ -9,5 +9,4 @@
SimpleTool as SimpleTool,
AutoTool as
AutoTool, PathTool as PathTool, -) -from .logging import logged_op as logged_op + ) diff --git a/masque/builder/builder.py b/masque/builder/builder.py new file mode 100644 index 0000000..ee1d277 --- /dev/null +++ b/masque/builder/builder.py @@ -0,0 +1,447 @@ +""" +Simplified Pattern assembly (`Builder`) +""" +from typing import Self +from collections.abc import Iterable, Sequence, Mapping +import copy +import logging +from functools import wraps + +from numpy.typing import ArrayLike + +from ..pattern import Pattern +from ..library import ILibrary, TreeView +from ..error import BuildError +from ..ports import PortList, Port +from ..abstract import Abstract + + +logger = logging.getLogger(__name__) + + +class Builder(PortList): + """ + A `Builder` is a helper object used for snapping together multiple + lower-level patterns at their `Port`s. + + The `Builder` mostly just holds context, in the form of a `Library`, + in addition to its underlying pattern. This simplifies some calls + to `plug` and `place`, by making the library implicit. + + `Builder` can also be `set_dead()`, at which point further calls to `plug()` + and `place()` are ignored (intended for debugging). + + + Examples: Creating a Builder + =========================== + - `Builder(library, ports={'A': port_a, 'C': port_c}, name='mypat')` makes + an empty pattern, adds the given ports, and places it into `library` + under the name `'mypat'`. + + - `Builder(library)` makes an empty pattern with no ports. The pattern + is not added into `library` and must later be added with e.g. + `library['mypat'] = builder.pattern` + + - `Builder(library, pattern=pattern, name='mypat')` uses an existing + pattern (including its ports) and sets `library['mypat'] = pattern`. + + - `Builder.interface(other_pat, port_map=['A', 'B'], library=library)` + makes a new (empty) pattern, copies over ports 'A' and 'B' from + `other_pat`, and creates additional ports 'in_A' and 'in_B' facing + in the opposite directions. 
This can be used to build a device which + can plug into `other_pat` (using the 'in_*' ports) but which does not + itself include `other_pat` as a subcomponent. + + - `Builder.interface(other_builder, ...)` does the same thing as + `Builder.interface(other_builder.pattern, ...)` but also uses + `other_builder.library` as its library by default. + + + Examples: Adding to a pattern + ============================= + - `my_device.plug(subdevice, {'A': 'C', 'B': 'B'}, map_out={'D': 'myport'})` + instantiates `subdevice` into `my_device`, plugging ports 'A' and 'B' + of `my_device` into ports 'C' and 'B' of `subdevice`. The connected ports + are removed and any unconnected ports from `subdevice` are added to + `my_device`. Port 'D' of `subdevice` (unconnected) is renamed to 'myport'. + + - `my_device.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport' + of `my_device`. If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out` + argument is provided, and the `thru` argument is not explicitly + set to `False`, the unconnected port of `wire` is automatically renamed to + 'myport'. This allows easy extension of existing ports without changing + their names or having to provide `map_out` each time `plug` is called. + + - `my_device.place(pad, offset=(10, 10), rotation=pi / 2, port_map={'A': 'gnd'})` + instantiates `pad` at the specified (x, y) offset and with the specified + rotation, adding its ports to those of `my_device`. Port 'A' of `pad` is + renamed to 'gnd' so that further routing can use this signal or net name + rather than the port name on the original `pad` device.
+ """ + __slots__ = ('pattern', 'library', '_dead') + + pattern: Pattern + """ Layout of this device """ + + library: ILibrary + """ + Library from which patterns should be referenced + """ + + _dead: bool + """ If True, plug()/place() are skipped (for debugging)""" + + @property + def ports(self) -> dict[str, Port]: + return self.pattern.ports + + @ports.setter + def ports(self, value: dict[str, Port]) -> None: + self.pattern.ports = value + + def __init__( + self, + library: ILibrary, + *, + pattern: Pattern | None = None, + ports: str | Mapping[str, Port] | None = None, + name: str | None = None, + ) -> None: + """ + Args: + library: The library from which referenced patterns will be taken + pattern: The pattern which will be modified by subsequent operations. + If `None` (default), a new pattern is created. + ports: Allows specifying the initial set of ports, if `pattern` does + not already have any ports (or is not provided). May be a string, + in which case it is interpreted as a name in `library`. + Default `None` (no ports). + name: If specified, `library[name]` is set to `self.pattern`. + """ + self._dead = False + self.library = library + if pattern is not None: + self.pattern = pattern + else: + self.pattern = Pattern() + + if ports is not None: + if self.pattern.ports: + raise BuildError('Ports supplied for pattern with pre-existing ports!') + if isinstance(ports, str): + ports = library.abstract(ports).ports + + self.pattern.ports.update(copy.deepcopy(dict(ports))) + + if name is not None: + library[name] = self.pattern + + @classmethod + def interface( + cls: type['Builder'], + source: PortList | Mapping[str, Port] | str, + *, + library: ILibrary | None = None, + in_prefix: str = 'in_', + out_prefix: str = '', + port_map: dict[str, str] | Sequence[str] | None = None, + name: str | None = None, + ) -> 'Builder': + """ + Wrapper for `Pattern.interface()`, which returns a Builder instead. + + Args: + source: A collection of ports (e.g. 
Pattern, Builder, or dict) + from which to create the interface. May be a pattern name if + `library` is provided. + library: Library from which existing patterns should be referenced, + and to which the new one should be added (if named). If not provided, + `source.library` must exist and will be used. + in_prefix: Prepended to port names for newly-created ports with + reversed directions compared to the current device. + out_prefix: Prepended to port names for ports which are directly + copied from the current device. + port_map: Specification for ports to copy into the new device: + - If `None`, all ports are copied. + - If a sequence, only the listed ports are copied + - If a mapping, the listed ports (keys) are copied and + renamed (to the values). + + Returns: + The new builder, with an empty pattern and 2x as many ports as + listed in port_map. + + Raises: + `PortError` if `port_map` contains port names not present in the + current device. + `PortError` if applying the prefixes results in duplicate port + names. 
+ """ + if library is None: + if hasattr(source, 'library') and isinstance(source.library, ILibrary): + library = source.library + else: + raise BuildError('No library was given, and `source.library` does not have one either.') + + if isinstance(source, str): + source = library.abstract(source).ports + + pat = Pattern.interface(source, in_prefix=in_prefix, out_prefix=out_prefix, port_map=port_map) + new = Builder(library=library, pattern=pat, name=name) + return new + + @wraps(Pattern.label) + def label(self, *args, **kwargs) -> Self: + self.pattern.label(*args, **kwargs) + return self + + @wraps(Pattern.ref) + def ref(self, *args, **kwargs) -> Self: + self.pattern.ref(*args, **kwargs) + return self + + @wraps(Pattern.polygon) + def polygon(self, *args, **kwargs) -> Self: + self.pattern.polygon(*args, **kwargs) + return self + + @wraps(Pattern.rect) + def rect(self, *args, **kwargs) -> Self: + self.pattern.rect(*args, **kwargs) + return self + + # Note: We're a superclass of `Pather`, where path() means something different... + #@wraps(Pattern.path) + #def path(self, *args, **kwargs) -> Self: + # self.pattern.path(*args, **kwargs) + # return self + + def plug( + self, + other: Abstract | str | Pattern | TreeView, + map_in: dict[str, str], + map_out: dict[str, str | None] | None = None, + *, + mirrored: bool = False, + thru: bool | str = True, + set_rotation: bool | None = None, + append: bool = False, + ok_connections: Iterable[tuple[str, str]] = (), + ) -> Self: + """ + Wrapper around `Pattern.plug` which allows a string for `other`. + + The `Builder`'s library is used to dereference the string (or `Abstract`, if + one is passed with `append=True`). If a `TreeView` is passed, it is first + added into `self.library`. + + Args: + other: An `Abstract`, string, `Pattern`, or `TreeView` describing the + device to be instatiated. 
If it is a `TreeView`, it is first + added into `self.library`, after which the topcell is plugged; + an equivalent statement is `self.plug(self.library << other, ...)`. + map_in: dict of `{'self_port': 'other_port'}` mappings, specifying + port connections between the two devices. + map_out: dict of `{'old_name': 'new_name'}` mappings, specifying + new names for ports in `other`. + mirrored: Enables mirroring `other` across the x axis prior to + connecting any ports. + thru: If map_in specifies only a single port, `thru` provides a mechanism + to avoid repeating the port name. E.g., for `map_in={'myport': 'A'}`, + - If True (default), and `other` has only two ports total, and map_out + doesn't specify a name for the other port, its name is set to the key + in `map_in`, i.e. 'myport'. + - If a string, `map_out[thru]` is set to the key in `map_in` (i.e. 'myport'). + An error is raised if that entry already exists. + + This makes it easy to extend a pattern with simple 2-port devices + (e.g. wires) without providing `map_out` each time `plug` is + called. See "Examples" above for more info. Default `True`. + set_rotation: If the necessary rotation cannot be determined from + the ports being connected (i.e. all pairs have at least one + port with `rotation=None`), `set_rotation` must be provided + to indicate how much `other` should be rotated. Otherwise, + `set_rotation` must remain `None`. + append: If `True`, `other` is appended instead of being referenced. + Note that this does not flatten `other`, so its refs will still + be refs (now inside `self`). + ok_connections: Set of "allowed" ptype combinations. Identical + ptypes are always allowed to connect, as is `'unk'` with + any other ptype. Non-allowed ptype connections will emit a + warning. Order is ignored, i.e. `(a, b)` is equivalent to + `(b, a)`. + + Returns: + self + + Raises: + `PortError` if any ports specified in `map_in` or `map_out` do not + exist in `self.ports` or `other_names`.
+ `PortError` if there are any duplicate names after `map_in` and `map_out` + are applied. + `PortError` if the specified port mapping is not achievable (the ports + do not line up) + """ + if self._dead: + logger.error('Skipping plug() since device is dead') + return self + + if not isinstance(other, str | Abstract | Pattern): + # We got a Tree; add it into self.library and grab an Abstract for it + other = self.library << other + + if isinstance(other, str): + other = self.library.abstract(other) + if append and isinstance(other, Abstract): + other = self.library[other.name] + + self.pattern.plug( + other = other, + map_in = map_in, + map_out = map_out, + mirrored = mirrored, + thru = thru, + set_rotation = set_rotation, + append = append, + ok_connections = ok_connections, + ) + return self + + def place( + self, + other: Abstract | str | Pattern | TreeView, + *, + offset: ArrayLike = (0, 0), + rotation: float = 0, + pivot: ArrayLike = (0, 0), + mirrored: bool = False, + port_map: dict[str, str | None] | None = None, + skip_port_check: bool = False, + append: bool = False, + ) -> Self: + """ + Wrapper around `Pattern.place` which allows a string or `TreeView` for `other`. + + The `Builder`'s library is used to dereference the string (or `Abstract`, if + one is passed with `append=True`). If a `TreeView` is passed, it is first + added into `self.library`. + + Args: + other: An `Abstract`, string, `Pattern`, or `TreeView` describing the + device to be instantiated. If it is a `TreeView`, it is first + added into `self.library`, after which the topcell is placed; + an equivalent statement is `self.place(self.library << other, ...)`. + offset: Offset at which to place the instance. Default (0, 0). + rotation: Rotation applied to the instance before placement. Default 0. + pivot: Rotation is applied around this pivot point (default (0, 0)). + Rotation is applied prior to translation (`offset`). + mirrored: Whether the instance should be mirrored across the x axis.
+ Mirroring is applied before translation and rotation. + port_map: dict of `{'old_name': 'new_name'}` mappings, specifying + new names for ports in the instantiated device. New names can be + `None`, which will delete those ports. + skip_port_check: Can be used to skip the internal call to `check_ports`, + in case it has already been performed elsewhere. + append: If `True`, `other` is appended instead of being referenced. + Note that this does not flatten `other`, so its refs will still + be refs (now inside `self`). + + Returns: + self + + Raises: + `PortError` if any ports specified in `map_in` or `map_out` do not + exist in `self.ports` or `other.ports`. + `PortError` if there are any duplicate names after `map_in` and `map_out` + are applied. + """ + if self._dead: + logger.error('Skipping place() since device is dead') + return self + + if not isinstance(other, str | Abstract | Pattern): + # We got a Tree; add it into self.library and grab an Abstract for it + other = self.library << other + + if isinstance(other, str): + other = self.library.abstract(other) + if append and isinstance(other, Abstract): + other = self.library[other.name] + + self.pattern.place( + other = other, + offset = offset, + rotation = rotation, + pivot = pivot, + mirrored = mirrored, + port_map = port_map, + skip_port_check = skip_port_check, + append = append, + ) + return self + + def translate(self, offset: ArrayLike) -> Self: + """ + Translate the pattern and all ports. + + Args: + offset: (x, y) distance to translate by + + Returns: + self + """ + self.pattern.translate_elements(offset) + return self + + def rotate_around(self, pivot: ArrayLike, angle: float) -> Self: + """ + Rotate the pattern and all ports. 
+ + Args: + angle: angle (radians, counterclockwise) to rotate by + pivot: location to rotate around + + Returns: + self + """ + self.pattern.rotate_around(pivot, angle) + for port in self.ports.values(): + port.rotate_around(pivot, angle) + return self + + def mirror(self, axis: int = 0) -> Self: + """ + Mirror the pattern and all ports across the specified axis. + + Args: + axis: Axis to mirror across (x=0, y=1) + + Returns: + self + """ + self.pattern.mirror(axis) + return self + + def set_dead(self) -> Self: + """ + Disallows further changes through `plug()` or `place()`. + This is meant for debugging: + ``` + dev.plug(a, ...) + dev.set_dead() # added for debug purposes + dev.plug(b, ...) # usually raises an error, but now skipped + dev.plug(c, ...) # also skipped + dev.pattern.visualize() # shows the device as of the set_dead() call + ``` + + Returns: + self + """ + self._dead = True + return self + + def __repr__(self) -> str: + s = f'' + return s + + diff --git a/masque/builder/logging.py b/masque/builder/logging.py deleted file mode 100644 index b4a113b..0000000 --- a/masque/builder/logging.py +++ /dev/null @@ -1,120 +0,0 @@ -""" -Logging and operation decorators for Pather -""" -from typing import TYPE_CHECKING, Any -from collections.abc import Iterator, Sequence, Callable -import logging -from functools import wraps -import inspect -import numpy -from contextlib import contextmanager - -if TYPE_CHECKING: - from .pather import Pather - -logger = logging.getLogger(__name__) - - -def _format_log_args(**kwargs) -> str: - arg_strs = [] - for k, v in kwargs.items(): - if isinstance(v, str | int | float | bool | None): - arg_strs.append(f"{k}={v}") - elif isinstance(v, numpy.ndarray): - arg_strs.append(f"{k}={v.tolist()}") - elif isinstance(v, list | tuple) and len(v) <= 10: - arg_strs.append(f"{k}={v}") - else: - arg_strs.append(f"{k}=...") - return ", ".join(arg_strs) - - -class PatherLogger: - """ - Encapsulates state for Pather diagnostic logging. 
- """ - debug: bool - indent: int - depth: int - - def __init__(self, debug: bool = False) -> None: - self.debug = debug - self.indent = 0 - self.depth = 0 - - def _log(self, module_name: str, msg: str) -> None: - if self.debug and self.depth <= 1: - log_obj = logging.getLogger(module_name) - log_obj.info(' ' * self.indent + msg) - - @contextmanager - def log_operation( - self, - pather: 'Pather', - op: str, - portspec: str | Sequence[str] | None = None, - **kwargs: Any, - ) -> Iterator[None]: - if not self.debug or self.depth > 0: - self.depth += 1 - try: - yield - finally: - self.depth -= 1 - return - - target = f"({portspec})" if portspec else "" - module_name = pather.__class__.__module__ - self._log(module_name, f"Operation: {op}{target} {_format_log_args(**kwargs)}") - - before_ports = {name: port.copy() for name, port in pather.ports.items()} - self.depth += 1 - self.indent += 1 - - try: - yield - finally: - after_ports = pather.ports - for name in sorted(after_ports.keys()): - if name not in before_ports or after_ports[name] != before_ports[name]: - self._log(module_name, f"Port {name}: {pather.ports[name].describe()}") - for name in sorted(before_ports.keys()): - if name not in after_ports: - self._log(module_name, f"Port {name}: removed") - - self.indent -= 1 - self.depth -= 1 - - -def logged_op( - portspec_getter: Callable[[dict[str, Any]], str | Sequence[str] | None] | None = None, - ) -> Callable[[Callable[..., Any]], Callable[..., Any]]: - """ - Decorator to wrap Pather methods with logging. 
- """ - def decorator(func: Callable[..., Any]) -> Callable[..., Any]: - sig = inspect.signature(func) - - @wraps(func) - def wrapper(self: 'Pather', *args: Any, **kwargs: Any) -> Any: - logger_obj = getattr(self, '_logger', None) - if logger_obj is None or not logger_obj.debug: - return func(self, *args, **kwargs) - - bound = sig.bind(self, *args, **kwargs) - bound.apply_defaults() - all_args = bound.arguments - # remove 'self' from logged args - logged_args = {k: v for k, v in all_args.items() if k != 'self'} - - ps = portspec_getter(all_args) if portspec_getter else None - - # Remove portspec from logged_args if it's there to avoid duplicate arg to log_operation - logged_args.pop('portspec', None) - - with logger_obj.log_operation(self, func.__name__, ps, **logged_args): - if getattr(self, '_dead', False) and func.__name__ in ('plug', 'place'): - logger.warning(f"Skipping geometry for {func.__name__}() since device is dead") - return func(self, *args, **kwargs) - return wrapper - return decorator diff --git a/masque/builder/pather.py b/masque/builder/pather.py index b4c264d..9af473d 100644 --- a/masque/builder/pather.py +++ b/masque/builder/pather.py @@ -1,112 +1,124 @@ """ -Unified Pattern assembly and routing (`Pather`) +Manual wire/waveguide routing (`Pather`) """ -from typing import Self, Literal, Any, overload -from collections.abc import Iterator, Iterable, Mapping, MutableMapping, Sequence, Callable +from typing import Self +from collections.abc import Sequence, Mapping, MutableMapping import copy import logging -from collections import defaultdict -from functools import wraps from pprint import pformat -from itertools import chain -from contextlib import contextmanager - -import numpy -from numpy import pi -from numpy.typing import ArrayLike from ..pattern import Pattern -from ..library import ILibrary, TreeView, SINGLE_USE_PREFIX -from ..error import BuildError, PortError +from ..library import ILibrary +from ..error import BuildError from ..ports 
import PortList, Port -from ..abstract import Abstract from ..utils import SupportsBool -from .tools import Tool, RenderStep -from .utils import ell -from .logging import logged_op, PatherLogger +from .tools import Tool +from .pather_mixin import PatherMixin +from .builder import Builder logger = logging.getLogger(__name__) -class Pather(PortList): +class Pather(Builder, PatherMixin): """ - A `Pather` is a helper object used for snapping together multiple - lower-level patterns at their `Port`s, and for routing single-use - patterns (e.g. wires or waveguides) between them. + An extension of `Builder` which provides functionality for routing and attaching + single-use patterns (e.g. wires or waveguides) and bundles / buses of such patterns. - The `Pather` holds context in the form of a `Library`, its underlying - pattern, and a set of `Tool`s for generating routing segments. + `Pather` is mostly concerned with calculating how long each wire should be. It calls + out to `Tool.path` functions provided by subclasses of `Tool` to build the actual patterns. + `Tool`s are assigned on a per-port basis and stored in `.tools`; a key of `None` represents + a "default" `Tool` used for all ports which do not have a port-specific `Tool` assigned. - Routing operations (`trace`, `jog`, `uturn`, etc.) are rendered - incrementally by default. Set `auto_render=False` to defer geometry - generation until an explicit call to `render()`. Examples: Creating a Pather =========================== - - `Pather(library, tools=my_tool)` makes an empty pattern with no ports. - The default routing tool for all ports is set to `my_tool`. + - `Pather(library, tools=my_tool)` makes an empty pattern with no ports. The pattern + is not added into `library` and must later be added with e.g. + `library['mypat'] = pather.pattern`. + The default wire/waveguide generating tool for all ports is set to `my_tool`. 
+ + - `Pather(library, ports={'in': Port(...), 'out': ...}, name='mypat', tools=my_tool)` + makes an empty pattern, adds the given ports, and places it into `library` + under the name `'mypat'`. The default wire/waveguide generating tool + for all ports is set to `my_tool` + + - `Pather(..., tools={'in': top_metal_40um, 'out': bottom_metal_1um, None: my_tool})` + assigns specific tools to individual ports, and `my_tool` as a default for ports + which are not specified. + + - `Pather.interface(other_pat, port_map=['A', 'B'], library=library, tools=my_tool)` + makes a new (empty) pattern, copies over ports 'A' and 'B' from + `other_pat`, and creates additional ports 'in_A' and 'in_B' facing + in the opposite directions. This can be used to build a device which + can plug into `other_pat` (using the 'in_*' ports) but which does not + itself include `other_pat` as a subcomponent. + + - `Pather.interface(other_pather, ...)` does the same thing as + `Builder.interface(other_builder.pattern, ...)` but also uses + `other_builder.library` as its library by default. - - `Pather(library, name='mypat')` makes an empty pattern and adds it to - `library` under the name `'mypat'`. Examples: Adding to a pattern ============================= - - `pather.plug(subdevice, {'A': 'C'})` instantiates `subdevice` and - connects port 'A' of the current pattern to port 'C' of `subdevice`. + - `pather.path('my_port', ccw=True, distance)` creates a "wire" for which the output + port is `distance` units away along the axis of `'my_port'` and rotated 90 degrees + counterclockwise (since `ccw=True`) relative to `'my_port'`. The wire is `plug`ged + into the existing `'my_port'`, causing the port to move to the wire's output. - - `pather.trace('my_port', ccw=True, length=100)` plans a 100-unit bend - starting at 'my_port'. Geometry is added immediately by default. - Set `auto_render=False` to defer and call `pather.render()` later. 
+ There is no formal guarantee about how far off-axis the output will be located; + there may be a significant width to the bend that is used to accomplish the 90 degree + turn. However, an error is raised if `distance` is too small to fit the bend. + + - `pather.path('my_port', ccw=None, distance)` creates a straight wire with a length + of `distance` and `plug`s it into `'my_port'`. + + - `pather.path_to('my_port', ccw=False, position)` creates a wire which starts at + `'my_port'` and has its output at the specified `position`, pointing 90 degrees + clockwise relative to the input. Again, the off-axis position or distance to the + output is not specified, so `position` takes the form of a single coordinate. To + ease debugging, position may be specified as `x=position` or `y=position` and an + error will be raised if the wrong coordinate is given. + + - `pather.mpath(['A', 'B', 'C'], ..., spacing=spacing)` is a superset of `path` + and `path_to` which can act on multiple ports simultaneously. Each port's wire is + generated using its own `Tool` (or the default tool if left unspecified). + The output ports are spaced out by `spacing` along the input ports' axis, unless + `ccw=None` is specified (i.e. no bends) in which case they all end at the same + destination coordinate. + + - `pather.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport' + of `pather.pattern`. If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out`, + argument is provided, and the `inherit_name` argument is not explicitly + set to `False`, the unconnected port of `wire` is automatically renamed to + 'myport'. This allows easy extension of existing ports without changing + their names or having to provide `map_out` each time `plug` is called. + + - `pather.place(pad, offset=(10, 10), rotation=pi / 2, port_map={'A': 'gnd'})` + instantiates `pad` at the specified (x, y) offset and with the specified + rotation, adding its ports to those of `pather.pattern`. 
Port 'A' of `pad` is + renamed to 'gnd' so that further routing can use this signal or net name + rather than the port name on the original `pad` device. + + - `pather.retool(tool)` or `pather.retool(tool, ['in', 'out', None])` can change + which tool is used for the given ports (or as the default tool). Useful + when placing vias or using multiple waveguide types along a route. """ - __slots__ = ( - 'pattern', 'library', 'tools', 'paths', - '_dead', '_logger', '_auto_render', '_auto_render_append' - ) - - pattern: Pattern - """ Layout of this device """ + __slots__ = ('tools',) library: ILibrary - """ Library from which patterns should be referenced """ + """ + Library from which existing patterns should be referenced, and to which + new ones should be added + """ tools: dict[str | None, Tool] """ - Tool objects used to dynamically generate new routing segments. - A key of `None` indicates the default `Tool`. + Tool objects are used to dynamically generate new single-use `Pattern`s + (e.g wires or waveguides) to be plugged into this device. A key of `None` + indicates the default `Tool`. """ - paths: defaultdict[str, list[RenderStep]] - """ Per-port list of planned operations, to be used by `render()` """ - - _dead: bool - """ If True, geometry generation is skipped (for debugging) """ - - _logger: PatherLogger - """ Handles diagnostic logging of operations """ - - _auto_render: bool - """ If True, routing operations call render() immediately """ - - PROBE_LENGTH: float = 1e6 - """ Large length used when probing tools for their lateral displacement """ - - _POSITION_KEYS: tuple[str, ...] = ('p', 'x', 'y', 'pos', 'position') - """ Single-port position bounds accepted by `trace_to()` and `jog()` """ - - _BUNDLE_BOUND_KEYS: tuple[str, ...] 
= ( - 'emin', 'emax', 'pmin', 'pmax', 'xmin', 'xmax', 'ymin', 'ymax', 'min_past_furthest', - ) - """ Bounds accepted by `trace()` / `trace_to()` when solving bundle extensions """ - - @property - def ports(self) -> dict[str, Port]: - return self.pattern.ports - - @ports.setter - def ports(self, value: dict[str, Port]) -> None: - self.pattern.ports = value - def __init__( self, library: ILibrary, @@ -115,38 +127,40 @@ class Pather(PortList): ports: str | Mapping[str, Port] | None = None, tools: Tool | MutableMapping[str | None, Tool] | None = None, name: str | None = None, - debug: bool = False, - auto_render: bool = True, - auto_render_append: bool = True, ) -> None: """ Args: - library: The library for pattern references and generated segments. - pattern: The pattern to modify. If `None`, a new one is created. - ports: Initial set of ports. May be a string (name in `library`) - or a port mapping. - tools: Tool(s) to use for routing segments. + library: The library from which referenced patterns will be taken, + and where new patterns (e.g. generated by the `tools`) will be placed. + pattern: The pattern which will be modified by subsequent operations. + If `None` (default), a new pattern is created. + ports: Allows specifying the initial set of ports, if `pattern` does + not already have any ports (or is not provided). May be a string, + in which case it is interpreted as a name in `library`. + Default `None` (no ports). + tools: A mapping of {port: tool} which specifies what `Tool` should be used + to generate waveguide or wire segments when `path`/`path_to`/`mpath` + are called. Relies on `Tool.path` implementations. name: If specified, `library[name]` is set to `self.pattern`. - debug: If True, enables detailed logging. - auto_render: If True, enables immediate rendering of routing steps. - auto_render_append: If `auto_render` is True, determines whether - to append geometry or add a reference. 
""" self._dead = False - self._logger = PatherLogger(debug=debug) - self._auto_render = auto_render - self._auto_render_append = auto_render_append self.library = library - self.pattern = pattern if pattern is not None else Pattern() - self.paths = defaultdict(list) + if pattern is not None: + self.pattern = pattern + else: + self.pattern = Pattern() if ports is not None: if self.pattern.ports: raise BuildError('Ports supplied for pattern with pre-existing ports!') if isinstance(ports, str): ports = library.abstract(ports).ports + self.pattern.ports.update(copy.deepcopy(dict(ports))) + if name is not None: + library[name] = self.pattern + if tools is None: self.tools = {} elif isinstance(tools, Tool): @@ -154,977 +168,29 @@ class Pather(PortList): else: self.tools = dict(tools) - if name is not None: - library[name] = self.pattern - - def __del__(self) -> None: - if any(self.paths.values()): - logger.warning(f'Pather {self} had unrendered paths', stack_info=True) - - def __repr__(self) -> str: - s = f'' - return s - - # - # Core Pattern Operations (Immediate) - # - def _prepare_break(self, name: str | None) -> tuple[str, RenderStep] | None: - """ Snapshot one batch-breaking step for a name with deferred geometry. """ - if self._dead or name is None: - return None - - steps = self.paths.get(name) - if not steps: - return None - - port = self.ports.get(name, steps[-1].end_port) - return name, RenderStep('P', None, port.copy(), port.copy(), None) - - def _prepare_breaks(self, names: Iterable[str | None]) -> list[tuple[str, RenderStep]]: - """ Snapshot break markers to be committed after a successful mutation. """ - prepared: list[tuple[str, RenderStep]] = [] - for n in names: - step = self._prepare_break(n) - if step is not None: - prepared.append(step) - return prepared - - def _commit_breaks(self, prepared: Iterable[tuple[str, RenderStep]]) -> None: - """ Append previously prepared break markers. 
""" - for name, step in prepared: - self.paths[name].append(step) - - @logged_op(lambda args: list(args['map_in'].keys())) - def plug( - self, - other: Abstract | str | Pattern | TreeView, - map_in: dict[str, str], - map_out: dict[str, str | None] | None = None, - **kwargs, - ) -> Self: - other = self.library.resolve(other, append=kwargs.get('append', False)) - - prepared_breaks: list[tuple[str, RenderStep]] = [] - if not self._dead: - other_ports = other.ports - affected = set(map_in.keys()) - plugged = set(map_in.values()) - for name in other_ports: - if name not in plugged: - new_name = (map_out or {}).get(name, name) - if new_name is not None: - affected.add(new_name) - prepared_breaks = self._prepare_breaks(affected) - - self.pattern.plug(other=other, map_in=map_in, map_out=map_out, skip_geometry=self._dead, **kwargs) - self._commit_breaks(prepared_breaks) - return self - - @logged_op() - def place( - self, - other: Abstract | str | Pattern | TreeView, - port_map: dict[str, str | None] | None = None, - **kwargs, - ) -> Self: - other = self.library.resolve(other, append=kwargs.get('append', False)) - - prepared_breaks: list[tuple[str, RenderStep]] = [] - if not self._dead: - other_ports = other.ports - affected = set() - for name in other_ports: - new_name = (port_map or {}).get(name, name) - if new_name is not None: - affected.add(new_name) - prepared_breaks = self._prepare_breaks(affected) - - self.pattern.place(other=other, port_map=port_map, skip_geometry=self._dead, **kwargs) - self._commit_breaks(prepared_breaks) - return self - - @logged_op(lambda args: list(args['connections'].keys())) - def plugged(self, connections: dict[str, str]) -> Self: - prepared_breaks = self._prepare_breaks(chain(connections.keys(), connections.values())) - self.pattern.plugged(connections) - self._commit_breaks(prepared_breaks) - return self - - @logged_op(lambda args: list(args['mapping'].keys())) - def rename_ports(self, mapping: dict[str, str | None], overwrite: bool = 
False) -> Self: - winners = self.pattern._rename_ports_impl( - mapping, - overwrite=overwrite or self._dead, - allow_collisions=self._dead, - ) - - moved_steps = {kk: self.paths.pop(kk) for kk in mapping if kk in self.paths} - for kk, steps in moved_steps.items(): - vv = mapping[kk] - # Preserve deferred geometry even if the live port is deleted. - # `render()` can still materialize the saved steps using their stored start/end ports. - # Current semantics intentionally keep deleted ports' queued steps under the old key, - # so if a new live port later reuses that name it does not retarget the old geometry; - # the old and new routes merely share a render bucket until `render()` consumes them. - target = kk if vv is None else vv - if self._dead and vv is not None and winners.get(vv) != kk: - target = kk - self.paths[target].extend(steps) - return self - - def set_dead(self) -> Self: - self._dead = True - return self - - # - # Pattern Wrappers - # - @wraps(Pattern.label) - def label(self, *args, **kwargs) -> Self: - self.pattern.label(*args, **kwargs) - return self - - @wraps(Pattern.ref) - def ref(self, *args, **kwargs) -> Self: - self.pattern.ref(*args, **kwargs) - return self - - @wraps(Pattern.polygon) - def polygon(self, *args, **kwargs) -> Self: - self.pattern.polygon(*args, **kwargs) - return self - - @wraps(Pattern.rect) - def rect(self, *args, **kwargs) -> Self: - self.pattern.rect(*args, **kwargs) - return self - - @wraps(Pattern.path) - def path(self, *args, **kwargs) -> Self: - self.pattern.path(*args, **kwargs) - return self - - @logged_op(lambda args: list(args['self'].ports.keys())) - def translate(self, offset: ArrayLike) -> Self: - offset_arr = numpy.asarray(offset) - self.pattern.translate_elements(offset_arr) - for steps in self.paths.values(): - for i, step in enumerate(steps): - steps[i] = step.transformed(offset_arr, 0, numpy.zeros(2)) - return self - - @logged_op(lambda args: list(args['self'].ports.keys())) - def rotate_around(self, pivot: 
ArrayLike, angle: float) -> Self: - pivot_arr = numpy.asarray(pivot) - self.pattern.rotate_around(pivot_arr, angle) - for steps in self.paths.values(): - for i, step in enumerate(steps): - steps[i] = step.transformed(numpy.zeros(2), angle, pivot_arr) - return self - - @logged_op(lambda args: list(args['self'].ports.keys())) - def mirror(self, axis: int = 0) -> Self: - self.pattern.mirror(axis) - for steps in self.paths.values(): - for i, step in enumerate(steps): - steps[i] = step.mirrored(axis) - return self - - @logged_op(lambda args: args['name']) - def mkport(self, name: str, value: Port) -> Self: - super().mkport(name, value) - return self - - # - # Routing Logic (Deferred / Incremental) - # - def _apply_step( - self, - opcode: Literal['L', 'S', 'U'], - portspec: str, - out_port: Port, - data: Any, - tool: Tool, - plug_into: str | None = None, - ) -> None: - """ Common logic for applying a planned step to a port. """ - port = self.pattern[portspec] - port_rot = port.rotation - assert port_rot is not None - - out_port.rotate_around((0, 0), pi + port_rot) - out_port.translate(port.offset) - - if not self._dead: - step = RenderStep(opcode, tool, port.copy(), out_port.copy(), data) - self.paths[portspec].append(step) - - self.pattern.ports[portspec] = out_port.copy() - - if plug_into is not None: - self.plugged({portspec: plug_into}) - - if self._auto_render: - self.render(append=self._auto_render_append) - - def _transform_relative_port(self, start_port: Port, out_port: Port) -> Port: - """ Transform a tool-planned output port into layout coordinates without mutating state. 
""" - port_rot = start_port.rotation - assert port_rot is not None - - transformed = out_port.copy() - transformed.rotate_around((0, 0), pi + port_rot) - transformed.translate(start_port.offset) - return transformed - - def _resolved_position_bound( - self, - portspec: str, - bounds: Mapping[str, Any], + @classmethod + def from_builder( + cls: type['Pather'], + builder: Builder, *, - allow_length: bool, - ) -> tuple[str, Any, float] | None: + tools: Tool | MutableMapping[str | None, Tool] | None = None, + ) -> 'Pather': """ - Resolve a single positional bound for a single port into a travel length. + Construct a `Pather` by adding tools to a `Builder`. + + Args: + builder: Builder to turn into a Pather + tools: Tools for the `Pather` + + Returns: + A new Pather object, using `builder.library` and `builder.pattern`. """ - present = [(key, bounds[key]) for key in self._POSITION_KEYS if bounds.get(key) is not None] - if not present: - return None - if len(present) > 1: - keys = ', '.join(key for key, _value in present) - raise BuildError(f'Provide exactly one positional bound; got {keys}') - if not allow_length and bounds.get('length') is not None: - raise BuildError('length cannot be combined with a positional bound') + new = Pather(library=builder.library, tools=tools, pattern=builder.pattern) + return new - key, value = present[0] - port = self.pattern[portspec] - assert port.rotation is not None - is_horiz = numpy.isclose(port.rotation % pi, 0) - if is_horiz: - if key == 'y': - raise BuildError('Port is horizontal') - target = Port((value, port.offset[1]), rotation=None) - else: - if key == 'x': - raise BuildError('Port is vertical') - target = Port((port.offset[0], value), rotation=None) - (travel, _jog), _ = port.measure_travel(target) - return key, value, -float(travel) - - @staticmethod - def _format_route_key_list(keys: Sequence[str]) -> str: - return ', '.join(keys) - - @staticmethod - def _present_keys(bounds: Mapping[str, Any], keys: Sequence[str]) -> 
list[str]: - return [key for key in keys if bounds.get(key) is not None] - - def _present_bundle_bounds(self, bounds: Mapping[str, Any]) -> list[str]: - return self._present_keys(bounds, self._BUNDLE_BOUND_KEYS) - - def _validate_trace_args( - self, - portspec: Sequence[str], - *, - length: float | None, - spacing: float | ArrayLike | None, - bounds: Mapping[str, Any], - ) -> None: - bundle_bounds = self._present_bundle_bounds(bounds) - if len(bundle_bounds) > 1: - args = self._format_route_key_list(bundle_bounds) - raise BuildError(f'Provide exactly one bundle bound for trace(); got {args}') - - invalid_with_length = self._present_keys(bounds, ('each', 'set_rotation')) + bundle_bounds - invalid_with_each = self._present_keys(bounds, ('set_rotation',)) + bundle_bounds - - if length is not None: - if len(portspec) > 1: - raise BuildError('length only allowed with a single port') - if spacing is not None: - invalid_with_length.append('spacing') - if invalid_with_length: - args = self._format_route_key_list(invalid_with_length) - raise BuildError(f'length cannot be combined with other routing bounds: {args}') - return - - if bounds.get('each') is not None: - if spacing is not None: - invalid_with_each.append('spacing') - if invalid_with_each: - args = self._format_route_key_list(invalid_with_each) - raise BuildError(f'each cannot be combined with other routing bounds: {args}') - return - - if not bundle_bounds: - raise BuildError('No bound type specified for trace()') - - def _validate_trace_to_positional_args( - self, - *, - spacing: float | ArrayLike | None, - bounds: Mapping[str, Any], - ) -> None: - invalid = self._present_keys(bounds, ('each', 'set_rotation')) + self._present_bundle_bounds(bounds) - if spacing is not None: - invalid.append('spacing') - if invalid: - args = self._format_route_key_list(invalid) - raise BuildError(f'Positional bounds cannot be combined with other routing bounds: {args}') - - def _validate_jog_args(self, *, length: float | None, 
bounds: Mapping[str, Any]) -> None: - invalid = self._present_keys(bounds, ('each', 'set_rotation')) + self._present_bundle_bounds(bounds) - if length is not None: - invalid = self._present_keys(bounds, self._POSITION_KEYS) + invalid - if invalid: - args = self._format_route_key_list(invalid) - raise BuildError(f'length cannot be combined with other routing bounds in jog(): {args}') - return - - if invalid: - args = self._format_route_key_list(invalid) - raise BuildError(f'Unsupported routing bounds for jog(): {args}') - - def _validate_uturn_args(self, bounds: Mapping[str, Any]) -> None: - invalid = self._present_keys(bounds, self._POSITION_KEYS + ('each', 'set_rotation')) + self._present_bundle_bounds(bounds) - if invalid: - args = self._format_route_key_list(invalid) - raise BuildError(f'Unsupported routing bounds for uturn(): {args}') - - def _validate_fallback_endpoint( - self, - portspec: str, - actual_end: Port, - *, - length: float, - jog: float, - out_rotation: float, - requested_out_ptype: str | None, - route_name: str, - ) -> None: - """ - Ensure a synthesized fallback route still satisfies the public routing contract. - """ - start_port = self.pattern[portspec] - expected_local = Port((length, jog), rotation=out_rotation, ptype=actual_end.ptype) - expected_end = self._transform_relative_port(start_port, expected_local) - - offsets_match = bool(numpy.allclose(actual_end.offset, expected_end.offset)) - rotations_match = ( - actual_end.rotation is not None - and expected_end.rotation is not None - and bool(numpy.isclose(actual_end.rotation, expected_end.rotation)) - ) - ptype_matches = requested_out_ptype is None or actual_end.ptype == requested_out_ptype - - if offsets_match and rotations_match and ptype_matches: - return - - raise BuildError( - f'{route_name} fallback via two planL() steps is unsupported for this tool/kwargs combination. 
' - f'Expected offset={tuple(expected_end.offset)}, rotation={expected_end.rotation}, ' - f'ptype={requested_out_ptype or actual_end.ptype}; got offset={tuple(actual_end.offset)}, ' - f'rotation={actual_end.rotation}, ptype={actual_end.ptype}' - ) - - def _apply_validated_double_l( - self, - portspec: str, - tool: Tool, - first: tuple[Port, Any], - second: tuple[Port, Any], - *, - length: float, - jog: float, - out_rotation: float, - requested_out_ptype: str | None, - route_name: str, - plug_into: str | None, - ) -> None: - out_port0, data0 = first - out_port1, data1 = second - staged_port0 = self._transform_relative_port(self.pattern[portspec], out_port0) - staged_port1 = self._transform_relative_port(staged_port0, out_port1) - self._validate_fallback_endpoint( - portspec, - staged_port1, - length = length, - jog = jog, - out_rotation = out_rotation, - requested_out_ptype = requested_out_ptype, - route_name = route_name, - ) - self._apply_step('L', portspec, out_port0, data0, tool) - self._apply_step('L', portspec, out_port1, data1, tool, plug_into) - - def _plan_s_fallback( - self, - tool: Tool, - portspec: str, - in_ptype: str, - length: float, - jog: float, - **kwargs: Any, - ) -> tuple[tuple[Port, Any], tuple[Port, Any]]: - ccw0 = jog > 0 - R1 = self._get_tool_R(tool, ccw0, in_ptype, **kwargs) - R2 = self._get_tool_R(tool, not ccw0, in_ptype, **kwargs) - L1, L2 = length - R2, abs(jog) - R1 - if L1 < 0 or L2 < 0: - raise BuildError(f"Jog {jog} or length {length} too small for double-L fallback") - - first = tool.planL(ccw0, L1, in_ptype = in_ptype, **(kwargs | {'out_ptype': None})) - second = tool.planL(not ccw0, L2, in_ptype = first[0].ptype, **kwargs) - return first, second - - def _plan_u_fallback( - self, - tool: Tool, - in_ptype: str, - length: float, - jog: float, - **kwargs: Any, - ) -> tuple[tuple[Port, Any], tuple[Port, Any]]: - ccw = jog > 0 - R = self._get_tool_R(tool, ccw, in_ptype, **kwargs) - L1, L2 = length + R, abs(jog) - R - first = 
tool.planL(ccw, L1, in_ptype = in_ptype, **(kwargs | {'out_ptype': None})) - second = tool.planL(ccw, L2, in_ptype = first[0].ptype, **kwargs) - return first, second - - def _run_route_transaction(self, callback: Callable[[], None]) -> None: - """ Run a routing mutation atomically, rendering once at the end if auto-render is enabled. """ - saved_ports = copy.deepcopy(self.pattern.ports) - saved_paths = defaultdict(list, copy.deepcopy(dict(self.paths))) - saved_auto_render = self._auto_render - self._auto_render = False - try: - callback() - except Exception: - self.pattern.ports = saved_ports - self.paths = saved_paths - raise - finally: - self._auto_render = saved_auto_render - if saved_auto_render and any(self.paths.values()): - self.render(append = self._auto_render_append) - - def _execute_route_op(self, op_name: str, kwargs: dict[str, Any]) -> None: - if op_name == 'trace_to': - self.trace_to(**kwargs) - elif op_name == 'jog': - self.jog(**kwargs) - elif op_name == 'uturn': - self.uturn(**kwargs) - elif op_name == 'rename_ports': - self.rename_ports(**kwargs) - else: - raise BuildError(f'Unrecognized routing op {op_name}') - - def _execute_route_ops(self, ops: Sequence[tuple[str, dict[str, Any]]]) -> None: - for op_name, op_kwargs in ops: - self._execute_route_op(op_name, op_kwargs) - - def _merge_trace_into_op_kwargs( - self, - op_name: str, - user_kwargs: Mapping[str, Any], - **reserved: Any, - ) -> dict[str, Any]: - """ Merge tool kwargs with internally computed op kwargs, rejecting collisions. 
""" - collisions = sorted(set(user_kwargs) & set(reserved)) - if collisions: - args = ', '.join(collisions) - raise BuildError(f'trace_into() kwargs cannot override {op_name}() arguments: {args}') - return {**user_kwargs, **reserved} - - def _plan_trace_into( - self, - portspec_src: str, - portspec_dst: str, - *, - out_ptype: str | None, - plug_destination: bool, - thru: str | None, - **kwargs: Any, - ) -> list[tuple[str, dict[str, Any]]]: - port_src, port_dst = self.pattern[portspec_src], self.pattern[portspec_dst] - if out_ptype is None: - out_ptype = port_dst.ptype - if port_src.rotation is None or port_dst.rotation is None: - raise PortError('Ports must have rotation') - - src_horiz = numpy.isclose(port_src.rotation % pi, 0) - dst_horiz = numpy.isclose(port_dst.rotation % pi, 0) - xd, yd = port_dst.offset - angle = (port_dst.rotation - port_src.rotation) % (2 * pi) - dst_args = {'out_ptype': out_ptype} - if plug_destination: - dst_args['plug_into'] = portspec_dst - - ops: list[tuple[str, dict[str, Any]]] = [] - if src_horiz and not dst_horiz: - ops.append(('trace_to', self._merge_trace_into_op_kwargs( - 'trace_to', - kwargs, - portspec = portspec_src, - ccw = angle > pi, - x = xd, - ))) - ops.append(('trace_to', self._merge_trace_into_op_kwargs( - 'trace_to', - kwargs, - portspec = portspec_src, - ccw = None, - y = yd, - **dst_args, - ))) - elif dst_horiz and not src_horiz: - ops.append(('trace_to', self._merge_trace_into_op_kwargs( - 'trace_to', - kwargs, - portspec = portspec_src, - ccw = angle > pi, - y = yd, - ))) - ops.append(('trace_to', self._merge_trace_into_op_kwargs( - 'trace_to', - kwargs, - portspec = portspec_src, - ccw = None, - x = xd, - **dst_args, - ))) - elif numpy.isclose(angle, pi): - (travel, jog), _ = port_src.measure_travel(port_dst) - if numpy.isclose(jog, 0): - ops.append(( - 'trace_to', - self._merge_trace_into_op_kwargs( - 'trace_to', - kwargs, - portspec = portspec_src, - ccw = None, - x = xd if src_horiz else None, - y = yd if not 
src_horiz else None, - **dst_args, - ), - )) - else: - ops.append(('jog', self._merge_trace_into_op_kwargs( - 'jog', - kwargs, - portspec = portspec_src, - offset = -jog, - length = -travel, - **dst_args, - ))) - elif numpy.isclose(angle, 0): - (travel, jog), _ = port_src.measure_travel(port_dst) - ops.append(('uturn', self._merge_trace_into_op_kwargs( - 'uturn', - kwargs, - portspec = portspec_src, - offset = -jog, - length = -travel, - **dst_args, - ))) - else: - raise BuildError(f"Cannot route relative angle {angle}") - - if thru: - ops.append(('rename_ports', {'mapping': {thru: portspec_src}})) - return ops - - def _get_tool_R(self, tool: Tool, ccw: SupportsBool, in_ptype: str | None, **kwargs) -> float: - """ Probe a tool to find the lateral displacement (radius) of its bend. """ - kwargs_no_out = kwargs | {'out_ptype': None} - probe_len = kwargs.get('probe_length', self.PROBE_LENGTH) - try: - out_port, _ = tool.planL(ccw, probe_len, in_ptype=in_ptype, **kwargs_no_out) - return abs(out_port.y) - except (BuildError, NotImplementedError): - # Fallback for tools without planL: use traceL and measure the result - port_names = ('A', 'B') - tree = tool.traceL(ccw, probe_len, in_ptype=in_ptype, port_names=port_names, **kwargs_no_out) - pat = tree.top_pattern() - (_, R), _ = pat[port_names[0]].measure_travel(pat[port_names[1]]) - return abs(R) - - def _apply_dead_fallback( - self, - portspec: str, - length: float, - jog: float, - ccw: SupportsBool | None, - in_ptype: str, - plug_into: str | None = None, - *, - out_rot: float | None = None, - out_ptype: str | None = None, - ) -> None: - if out_rot is None: - if ccw is None: - out_rot = pi - elif bool(ccw): - out_rot = -pi / 2 - else: - out_rot = pi / 2 - logger.warning(f"Tool planning failed for dead pather. 
Using dummy extension for {portspec}.") - port = self.pattern[portspec] - port_rot = port.rotation - assert port_rot is not None - out_port = Port((length, jog), rotation=out_rot, ptype=out_ptype or in_ptype) - out_port.rotate_around((0, 0), pi + port_rot) - out_port.translate(port.offset) - self.pattern.ports[portspec] = out_port - if plug_into is not None: - self.plugged({portspec: plug_into}) - - @logged_op(lambda args: args['portspec']) - def _traceL(self, portspec: str, ccw: SupportsBool | None, length: float, *, plug_into: str | None = None, **kwargs: Any) -> Self: - tool = self.tools.get(portspec, self.tools.get(None)) - if tool is None: - raise BuildError(f'No tool assigned for port {portspec}') - in_ptype = self.pattern[portspec].ptype - try: - out_port, data = tool.planL(ccw, length, in_ptype=in_ptype, **kwargs) - except (BuildError, NotImplementedError): - if not self._dead: - raise - self._apply_dead_fallback( - portspec, - length, - 0, - ccw, - in_ptype, - plug_into, - out_ptype=kwargs.get('out_ptype'), - ) - return self - if out_port is not None: - self._apply_step('L', portspec, out_port, data, tool, plug_into) - return self - - @logged_op(lambda args: args['portspec']) - def _traceS(self, portspec: str, length: float, jog: float, *, plug_into: str | None = None, **kwargs: Any) -> Self: - tool = self.tools.get(portspec, self.tools.get(None)) - if tool is None: - raise BuildError(f'No tool assigned for port {portspec}') - in_ptype = self.pattern[portspec].ptype - try: - out_port, data = tool.planS(length, jog, in_ptype=in_ptype, **kwargs) - except (BuildError, NotImplementedError): - try: - first, second = self._plan_s_fallback(tool, portspec, in_ptype, length, jog, **kwargs) - except (BuildError, NotImplementedError): - if not self._dead: - raise - self._apply_dead_fallback( - portspec, - length, - jog, - None, - in_ptype, - plug_into, - out_rot=pi, - out_ptype=kwargs.get('out_ptype'), - ) - return self - - self._apply_validated_double_l( - portspec, 
- tool, - first, - second, - length = length, - jog = jog, - out_rotation = pi, - requested_out_ptype = kwargs.get('out_ptype'), - route_name = 'S-bend', - plug_into = plug_into, - ) - return self - if out_port is not None: - self._apply_step('S', portspec, out_port, data, tool, plug_into) - return self - - @logged_op(lambda args: args['portspec']) - def _traceU(self, portspec: str, jog: float, *, length: float = 0, plug_into: str | None = None, **kwargs: Any) -> Self: - tool = self.tools.get(portspec, self.tools.get(None)) - if tool is None: - raise BuildError(f'No tool assigned for port {portspec}') - in_ptype = self.pattern[portspec].ptype - try: - out_port, data = tool.planU(jog, length=length, in_ptype=in_ptype, **kwargs) - except (BuildError, NotImplementedError): - try: - first, second = self._plan_u_fallback(tool, in_ptype, length, jog, **kwargs) - except (BuildError, NotImplementedError): - if not self._dead: - raise - self._apply_dead_fallback( - portspec, - length, - jog, - None, - in_ptype, - plug_into, - out_rot=0, - out_ptype=kwargs.get('out_ptype'), - ) - return self - - self._apply_validated_double_l( - portspec, - tool, - first, - second, - length = length, - jog = jog, - out_rotation = 0, - requested_out_ptype = kwargs.get('out_ptype'), - route_name = 'U-turn', - plug_into = plug_into, - ) - return self - if out_port is not None: - self._apply_step('U', portspec, out_port, data, tool, plug_into) - return self - - # - # High-level Routing Methods - # - def trace( - self, - portspec: str | Sequence[str], - ccw: SupportsBool | None, - length: float | None = None, - *, - spacing: float | ArrayLike | None = None, - **bounds: Any, - ) -> Self: - """ - Route one or more ports using straight segments or single 90-degree bends. - - Provide exactly one routing mode: - - `length` for a single port, - - `each` to extend each selected port independently by the same amount, or - - one bundle bound such as `xmin`, `emax`, or `min_past_furthest`. 
- - `spacing` and `set_rotation` are only valid when using a bundle bound. - """ - with self._logger.log_operation(self, 'trace', portspec, ccw=ccw, length=length, spacing=spacing, **bounds): - if isinstance(portspec, str): - portspec = [portspec] - self._validate_trace_args(portspec, length=length, spacing=spacing, bounds=bounds) - if length is not None: - return self._traceL(portspec[0], ccw, length, **bounds) - if bounds.get('each') is not None: - each = bounds.pop('each') - for p in portspec: - self._traceL(p, ccw, each, **bounds) - return self - # Bundle routing - bt = self._present_bundle_bounds(bounds)[0] - bval = bounds.pop(bt) - set_rot = bounds.pop('set_rotation', None) - exts = ell(self.pattern[tuple(portspec)], ccw, spacing=spacing, bound=bval, bound_type=bt, set_rotation=set_rot) - for p, length_val in exts.items(): - self._traceL(p, ccw, length_val, **bounds) - return self - - def trace_to( - self, - portspec: str | Sequence[str], - ccw: SupportsBool | None, - *, - spacing: float | ArrayLike | None = None, - **bounds: Any, - ) -> Self: - """ - Route until a single positional bound is reached, or delegate to `trace()` for length/bundle bounds. - - Exactly one of `p`, `pos`, `position`, `x`, or `y` may be used as a positional - bound. Positional bounds are only valid for a single port and may not be combined - with `length`, `spacing`, `each`, or bundle-bound keywords such as `xmin`/`emax`. 
- """ - with self._logger.log_operation(self, 'trace_to', portspec, ccw=ccw, spacing=spacing, **bounds): - if isinstance(portspec, str): - portspec = [portspec] - if len(portspec) == 1: - resolved = self._resolved_position_bound(portspec[0], bounds, allow_length=False) - else: - resolved = None - pos_count = sum(bounds.get(key) is not None for key in self._POSITION_KEYS) - if pos_count: - raise BuildError('Position bounds only allowed with a single port') - if resolved is not None: - if len(portspec) > 1: - raise BuildError('Position bounds only allowed with a single port') - self._validate_trace_to_positional_args(spacing=spacing, bounds=bounds) - _key, _value, length = resolved - other_bounds = {bk: bv for bk, bv in bounds.items() if bk not in self._POSITION_KEYS and bk != 'length'} - return self._traceL(portspec[0], ccw, length, **other_bounds) - return self.trace(portspec, ccw, spacing=spacing, **bounds) - - def straight(self, portspec: str | Sequence[str], length: float | None = None, **bounds) -> Self: - return self.trace_to(portspec, None, length=length, **bounds) - - def bend(self, portspec: str | Sequence[str], ccw: SupportsBool, length: float | None = None, **bounds) -> Self: - return self.trace_to(portspec, ccw, length=length, **bounds) - - def ccw(self, portspec: str | Sequence[str], length: float | None = None, **bounds) -> Self: - return self.bend(portspec, True, length, **bounds) - - def cw(self, portspec: str | Sequence[str], length: float | None = None, **bounds) -> Self: - return self.bend(portspec, False, length, **bounds) - - def jog(self, portspec: str | Sequence[str], offset: float, length: float | None = None, **bounds: Any) -> Self: - """ - Route an S-bend. - - `length` is the along-travel displacement. If omitted, exactly one positional - bound (`p`, `pos`, `position`, `x`, or `y`) must be provided for a single port, - and the required travel distance is derived from that bound. 
When `length` is - provided, no other routing-bound keywords are accepted. - """ - with self._logger.log_operation(self, 'jog', portspec, offset=offset, length=length, **bounds): - if isinstance(portspec, str): - portspec = [portspec] - self._validate_jog_args(length=length, bounds=bounds) - other_bounds = dict(bounds) - if length is None: - if len(portspec) != 1: - raise BuildError('Positional length solving for jog() is only allowed with a single port') - resolved = self._resolved_position_bound(portspec[0], bounds, allow_length=True) - if resolved is None: - raise BuildError('jog() requires either length=... or exactly one positional bound') - _key, _value, length = resolved - other_bounds = {bk: bv for bk, bv in bounds.items() if bk not in self._POSITION_KEYS} - for p in portspec: - self._traceS(p, length, offset, **other_bounds) - return self - - def uturn(self, portspec: str | Sequence[str], offset: float, length: float | None = None, **bounds: Any) -> Self: - """ - Route a U-turn. - - `length` is the along-travel displacement to the final port. If omitted, it defaults - to 0. Positional and bundle-bound keywords are not supported for this operation. - """ - with self._logger.log_operation(self, 'uturn', portspec, offset=offset, length=length, **bounds): - if isinstance(portspec, str): - portspec = [portspec] - self._validate_uturn_args(bounds) - for p in portspec: - self._traceU(p, offset, length=length if length else 0, **bounds) - return self - - def trace_into( - self, - portspec_src: str, - portspec_dst: str, - *, - out_ptype: str | None = None, - plug_destination: bool = True, - thru: str | None = None, - **kwargs: Any, - ) -> Self: - """ - Route one port into another using the shortest supported combination of trace primitives. - - If `plug_destination` is `True`, the destination port is consumed by the final step. - If `thru` is provided, that port is renamed to the source name after the route is complete. 
- The operation is transactional for live port state and deferred routing steps. - """ - with self._logger.log_operation( - self, - 'trace_into', - [portspec_src, portspec_dst], - out_ptype=out_ptype, - plug_destination=plug_destination, - thru=thru, - **kwargs, - ): - ops = self._plan_trace_into( - portspec_src, - portspec_dst, - out_ptype = out_ptype, - plug_destination = plug_destination, - thru = thru, - **kwargs, - ) - self._run_route_transaction(lambda: self._execute_route_ops(ops)) - return self - - # - # Rendering - # - def render(self, append: bool = True) -> Self: - """ Generate geometry for all planned paths. """ - with self._logger.log_operation(self, 'render', None, append=append): - tool_port_names = ('A', 'B') - pat = Pattern() - - def validate_tree(portspec: str, batch: list[RenderStep], tree: ILibrary) -> None: - missing = sorted( - name - for name in tree.dangling_refs(tree.top()) - if isinstance(name, str) and name.startswith(SINGLE_USE_PREFIX) - ) - if not missing: - return - - tool_name = type(batch[0].tool).__name__ - raise BuildError( - f'Tool {tool_name}.render() returned missing single-use refs for {portspec}: {missing}' - ) - - def render_batch(portspec: str, batch: list[RenderStep], append: bool) -> None: - assert batch[0].tool is not None - tree = batch[0].tool.render(batch, port_names=tool_port_names) - validate_tree(portspec, batch, tree) - name = self.library << tree - if portspec in pat.ports: - del pat.ports[portspec] - pat.ports[portspec] = batch[0].start_port.copy() - if append: - pat.plug(self.library[name], {portspec: tool_port_names[0]}, append=True) - del self.library[name] - else: - pat.plug(self.library.abstract(name), {portspec: tool_port_names[0]}, append=False) - if portspec not in pat.ports and tool_port_names[1] in pat.ports: - pat.rename_ports({tool_port_names[1]: portspec}, overwrite=True) - - for portspec, steps in self.paths.items(): - if not steps: - continue - batch: list[RenderStep] = [] - for step in steps: - 
appendable = step.opcode in ('L', 'S', 'U') - same_tool = batch and step.tool == batch[0].tool - if batch and (not appendable or not same_tool or not batch[-1].is_continuous_with(step)): - render_batch(portspec, batch, append) - batch = [] - if appendable: - batch.append(step) - elif step.opcode == 'P' and portspec in pat.ports: - del pat.ports[portspec] - if batch: - render_batch(portspec, batch, append) - - self.paths.clear() - pat.ports.clear() - self.pattern.append(pat) - return self - - # - # Utilities - # @classmethod def interface( - cls, + cls: type['Pather'], source: PortList | Mapping[str, Port] | str, *, library: ILibrary | None = None, @@ -1133,241 +199,177 @@ class Pather(PortList): out_prefix: str = '', port_map: dict[str, str] | Sequence[str] | None = None, name: str | None = None, - **kwargs: Any, - ) -> Self: + ) -> 'Pather': + """ + Wrapper for `Pattern.interface()`, which returns a Pather instead. + + Args: + source: A collection of ports (e.g. Pattern, Builder, or dict) + from which to create the interface. May be a pattern name if + `library` is provided. + library: Library from which existing patterns should be referenced, + and to which the new one should be added (if named). If not provided, + `source.library` must exist and will be used. + tools: `Tool`s which will be used by the pather for generating new wires + or waveguides (via `path`/`path_to`/`mpath`). + in_prefix: Prepended to port names for newly-created ports with + reversed directions compared to the current device. + out_prefix: Prepended to port names for ports which are directly + copied from the current device. + port_map: Specification for ports to copy into the new device: + - If `None`, all ports are copied. + - If a sequence, only the listed ports are copied + - If a mapping, the listed ports (keys) are copied and + renamed (to the values). + + Returns: + The new pather, with an empty pattern and 2x as many ports as + listed in port_map. 
+ + Raises: + `PortError` if `port_map` contains port names not present in the + current device. + `PortError` if applying the prefixes results in duplicate port + names. + """ if library is None: if hasattr(source, 'library') and isinstance(source.library, ILibrary): library = source.library else: - raise BuildError('No library provided') + raise BuildError('No library provided (and not present in `source.library`)') + if tools is None and hasattr(source, 'tools') and isinstance(source.tools, dict): tools = source.tools + if isinstance(source, str): source = library.abstract(source).ports + pat = Pattern.interface(source, in_prefix=in_prefix, out_prefix=out_prefix, port_map=port_map) - return cls(library=library, pattern=pat, name=name, tools=tools, **kwargs) + new = Pather(library=library, pattern=pat, name=name, tools=tools) + return new - def retool(self, tool: Tool, keys: str | Sequence[str | None] | None = None) -> Self: - if keys is None or isinstance(keys, str): - self.tools[keys] = tool + def __repr__(self) -> str: + s = f'' + return s + + + def path( + self, + portspec: str, + ccw: SupportsBool | None, + length: float, + *, + plug_into: str | None = None, + **kwargs, + ) -> Self: + """ + Create a "wire"/"waveguide" and `plug` it into the port `portspec`, with the aim + of traveling exactly `length` distance. + + The wire will travel `length` distance along the port's axis, and an unspecified + (tool-dependent) distance in the perpendicular direction. The output port will + be rotated (or not) based on the `ccw` parameter. + + Args: + portspec: The name of the port into which the wire will be plugged. + ccw: If `None`, the output should be along the same axis as the input. + Otherwise, cast to bool and turn counterclockwise if True + and clockwise otherwise. + length: The total distance from input to output, along the input's axis only. + (There may be a tool-dependent offset along the other axis.) 
+ plug_into: If not None, attempts to plug the wire's output port into the provided + port on `self`. + + Returns: + self + + Raises: + BuildError if `length` is too small to fit the bend (if a bend is present). + LibraryError if no valid name could be picked for the pattern. + """ + if self._dead: + logger.error('Skipping path() since device is dead') + return self + + tool_port_names = ('A', 'B') + + tool = self.tools.get(portspec, self.tools[None]) + in_ptype = self.pattern[portspec].ptype + tree = tool.path(ccw, length, in_ptype=in_ptype, port_names=tool_port_names, **kwargs) + tname = self.library << tree + if plug_into is not None: + output = {plug_into: tool_port_names[1]} else: - for k in keys: - self.tools[k] = tool + output = {} + self.plug(tname, {portspec: tool_port_names[0], **output}) return self - @contextmanager - def toolctx(self, tool: Tool, keys: str | Sequence[str | None] | None = None) -> Iterator[Self]: - if keys is None or isinstance(keys, str): - keys = [keys] - saved = {k: self.tools.get(k) for k in keys} + def pathS( + self, + portspec: str, + length: float, + jog: float, + *, + plug_into: str | None = None, + **kwargs, + ) -> Self: + """ + Create an S-shaped "wire"/"waveguide" and `plug` it into the port `portspec`, with the aim + of traveling exactly `length` distance with an offset `jog` along the other axis (+ve jog is + left of direction of travel). + + The output port will have the same orientation as the source port (`portspec`). + + This function attempts to use `tool.pathS()`, but falls back to `tool.path()` if the former + raises a NotImplementedError. + + Args: + portspec: The name of the port into which the wire will be plugged. + length: The total manhattan distance from input to output, along the input's axis only. + (There may be a tool-dependent offset along the other axis.) + jog: Total manhattan distance perpendicular to the direction of travel. + Positive values are to the left of the direction of travel. 
+ plug_into: If not None, attempts to plug the wire's output port into the provided + port on `self`. + + Returns: + self + + Raises: + BuildError if `length` is too small to fit the s-bend (for nonzero jog). + LibraryError if no valid name could be picked for the pattern. + """ + if self._dead: + logger.error('Skipping pathS() since device is dead') + return self + + tool_port_names = ('A', 'B') + + tool = self.tools.get(portspec, self.tools[None]) + in_ptype = self.pattern[portspec].ptype try: - yield self.retool(tool, keys) - finally: - for k, t in saved.items(): - if t is None: - self.tools.pop(k, None) - else: - self.tools[k] = t + tree = tool.pathS(length, jog, in_ptype=in_ptype, port_names=tool_port_names, **kwargs) + except NotImplementedError: + # Fall back to drawing two L-bends + ccw0 = jog > 0 + kwargs_no_out = kwargs | {'out_ptype': None} + t_tree0 = tool.path(ccw0, length / 2, port_names=tool_port_names, in_ptype=in_ptype, **kwargs_no_out) + t_pat0 = t_tree0.top_pattern() + (_, jog0), _ = t_pat0[tool_port_names[0]].measure_travel(t_pat0[tool_port_names[1]]) + t_tree1 = tool.path(not ccw0, abs(jog - jog0), port_names=tool_port_names, in_ptype=t_pat0[tool_port_names[1]].ptype, **kwargs) + t_pat1 = t_tree1.top_pattern() + (_, jog1), _ = t_pat1[tool_port_names[0]].measure_travel(t_pat1[tool_port_names[1]]) - def flatten(self) -> Self: - self.pattern.flatten(self.library) - return self + kwargs_plug = kwargs | {'plug_into': plug_into} + self.path(portspec, ccw0, length - abs(jog1), **kwargs_no_out) + self.path(portspec, not ccw0, abs(jog - jog0), **kwargs_plug) + return self - def at(self, portspec: str | Iterable[str]) -> 'PortPather': - return PortPather(portspec, self) - - -class PortPather: - """ Port state manager for fluent pathing.
""" - def __init__(self, ports: str | Iterable[str], pather: Pather) -> None: - self.ports = [ports] if isinstance(ports, str) else list(ports) - self.pather = pather - - def retool(self, tool: Tool) -> Self: - self.pather.retool(tool, self.ports) - return self - - @contextmanager - def toolctx(self, tool: Tool) -> Iterator[Self]: - with self.pather.toolctx(tool, keys=self.ports): - yield self - - def trace(self, ccw: SupportsBool | None, length: float | None = None, **kw: Any) -> Self: - self.pather.trace(self.ports, ccw, length, **kw) - return self - - def trace_to(self, ccw: SupportsBool | None, **kw: Any) -> Self: - self.pather.trace_to(self.ports, ccw, **kw) - return self - - def straight(self, length: float | None = None, **kw: Any) -> Self: - return self.trace_to(None, length=length, **kw) - - def bend(self, ccw: SupportsBool, length: float | None = None, **kw: Any) -> Self: - return self.trace_to(ccw, length=length, **kw) - - def ccw(self, length: float | None = None, **kw: Any) -> Self: - return self.bend(True, length, **kw) - - def cw(self, length: float | None = None, **kw: Any) -> Self: - return self.bend(False, length, **kw) - - def jog(self, offset: float, length: float | None = None, **kw: Any) -> Self: - self.pather.jog(self.ports, offset, length, **kw) - return self - - def uturn(self, offset: float, length: float | None = None, **kw: Any) -> Self: - self.pather.uturn(self.ports, offset, length, **kw) - return self - - def trace_into(self, target_port: str, **kwargs) -> Self: - if len(self.ports) > 1: - raise BuildError(f'Unable use implicit trace_into() with {len(self.ports)} (>1) ports.') - self.pather.trace_into(self.ports[0], target_port, **kwargs) - return self - - def plug(self, other: Abstract | str, other_port: str, **kwargs) -> Self: - if len(self.ports) > 1: - raise BuildError(f'Unable use implicit plug() with {len(self.ports)} ports.' 
- 'Use the pather or pattern directly to plug multiple ports.') - self.pather.plug(other, {self.ports[0]: other_port}, **kwargs) - return self - - def plugged(self, other_port: str | Mapping[str, str]) -> Self: - if isinstance(other_port, Mapping): - self.pather.plugged(dict(other_port)) - elif len(self.ports) > 1: - raise BuildError(f'Unable use implicit plugged() with {len(self.ports)} (>1) ports.') + tname = self.library << tree + if plug_into is not None: + output = {plug_into: tool_port_names[1]} else: - self.pather.plugged({self.ports[0]: other_port}) + output = {} + self.plug(tname, {portspec: tool_port_names[0], **output}) return self - # - # Delegate to port - # - # These mutate only the selected live port state. They do not rewrite already planned - # RenderSteps, so deferred geometry remains as previously planned and only future routing - # starts from the updated port. - def set_ptype(self, ptype: str) -> Self: - for port in self.ports: - self.pather.pattern[port].set_ptype(ptype) - return self - - def translate(self, *args, **kwargs) -> Self: - for port in self.ports: - self.pather.pattern[port].translate(*args, **kwargs) - return self - - def mirror(self, *args, **kwargs) -> Self: - for port in self.ports: - self.pather.pattern[port].mirror(*args, **kwargs) - return self - - def rotate(self, rotation: float) -> Self: - for port in self.ports: - self.pather.pattern[port].rotate(rotation) - return self - - def set_rotation(self, rotation: float | None) -> Self: - for port in self.ports: - self.pather.pattern[port].set_rotation(rotation) - return self - - def rename(self, name: str | Mapping[str, str | None]) -> Self: - """ Rename active ports. 
""" - name_map: dict[str, str | None] - if isinstance(name, str): - if len(self.ports) > 1: - raise BuildError('Use a mapping to rename >1 port') - name_map = {self.ports[0]: name} - else: - name_map = dict(name) - self.pather.rename_ports(name_map) - self.ports = list(dict.fromkeys(mm for mm in [name_map.get(pp, pp) for pp in self.ports] if mm is not None)) - return self - - def select(self, ports: str | Iterable[str]) -> Self: - """ Add ports to the selection. """ - if isinstance(ports, str): - ports = [ports] - for port in ports: - if port not in self.ports: - self.ports.append(port) - return self - - def deselect(self, ports: str | Iterable[str]) -> Self: - """ Remove ports from the selection. """ - if isinstance(ports, str): - ports = [ports] - ports_set = set(ports) - self.ports = [pp for pp in self.ports if pp not in ports_set] - return self - - def _normalize_copy_map(self, name: str | Mapping[str, str], action: str) -> dict[str, str]: - if isinstance(name, str): - if len(self.ports) > 1: - raise BuildError(f'Use a mapping to {action} >1 port') - name_map = {self.ports[0]: name} - else: - name_map = dict(name) - - missing_selected = set(name_map) - set(self.ports) - if missing_selected: - raise PortError(f'Can only {action} selected ports: {missing_selected}') - - missing_pattern = set(name_map) - set(self.pather.pattern.ports) - if missing_pattern: - raise PortError(f'Ports to {action} were not found: {missing_pattern}') - - if not self.pather._dead: - targets = list(name_map.values()) - duplicate_targets = {vv for vv in targets if targets.count(vv) > 1} - if duplicate_targets: - raise PortError(f'{action.capitalize()} targets would collide: {duplicate_targets}') - - overwritten = { - dst for src, dst in name_map.items() - if dst in self.pather.pattern.ports and dst != src - } - if overwritten: - raise PortError(f'{action.capitalize()} would overwrite existing ports: {overwritten}') - - return name_map - - def mark(self, name: str | Mapping[str, str]) -> 
Self: - """ Bookmark current port(s). """ - name_map = self._normalize_copy_map(name, 'mark') - source_ports = {src: self.pather.pattern[src].copy() for src in name_map} - for src, dst in name_map.items(): - self.pather.pattern.ports[dst] = source_ports[src].copy() - return self - - def fork(self, name: str | Mapping[str, str]) -> Self: - """ Split and follow new name. """ - name_map = self._normalize_copy_map(name, 'fork') - source_ports = {src: self.pather.pattern[src].copy() for src in name_map} - for src, dst in name_map.items(): - self.pather.pattern.ports[dst] = source_ports[src].copy() - self.ports = [(dst if pp == src else pp) for pp in self.ports] - self.ports = list(dict.fromkeys(self.ports)) - return self - - def drop(self) -> Self: - """ Remove selected ports from the pattern and the PortPather. """ - self.pather.rename_ports({pp: None for pp in self.ports}) - self.ports = [] - return self - - @overload - def delete(self, name: None) -> None: ... - - @overload - def delete(self, name: str) -> Self: ... 
- - def delete(self, name: str | None = None) -> Self | None: - if name is None: - self.drop() - return None - self.pather.rename_ports({name: None}) - self.ports = [pp for pp in self.ports if pp != name] - return self diff --git a/masque/builder/pather_mixin.py b/masque/builder/pather_mixin.py new file mode 100644 index 0000000..1655329 --- /dev/null +++ b/masque/builder/pather_mixin.py @@ -0,0 +1,677 @@ +from typing import Self, overload +from collections.abc import Sequence, Iterator, Iterable +import logging +from contextlib import contextmanager +from abc import abstractmethod, ABCMeta + +import numpy +from numpy import pi +from numpy.typing import ArrayLike + +from ..pattern import Pattern +from ..library import ILibrary, TreeView +from ..error import PortError, BuildError +from ..utils import SupportsBool +from ..abstract import Abstract +from .tools import Tool +from .utils import ell +from ..ports import PortList + + +logger = logging.getLogger(__name__) + + +class PatherMixin(PortList, metaclass=ABCMeta): + pattern: Pattern + """ Layout of this device """ + + library: ILibrary + """ Library from which patterns should be referenced """ + + _dead: bool + """ If True, plug()/place() are skipped (for debugging) """ + + tools: dict[str | None, Tool] + """ + Tool objects are used to dynamically generate new single-use Devices + (e.g wires or waveguides) to be plugged into this device. 
+ """ + + @abstractmethod + def path( + self, + portspec: str, + ccw: SupportsBool | None, + length: float, + *, + plug_into: str | None = None, + **kwargs, + ) -> Self: + pass + + @abstractmethod + def pathS( + self, + portspec: str, + length: float, + jog: float, + *, + plug_into: str | None = None, + **kwargs, + ) -> Self: + pass + + @abstractmethod + def plug( + self, + other: Abstract | str | Pattern | TreeView, + map_in: dict[str, str], + map_out: dict[str, str | None] | None = None, + *, + mirrored: bool = False, + thru: bool | str = True, + set_rotation: bool | None = None, + append: bool = False, + ok_connections: Iterable[tuple[str, str]] = (), + ) -> Self: + pass + + def retool( + self, + tool: Tool, + keys: str | Sequence[str | None] | None = None, + ) -> Self: + """ + Update the `Tool` which will be used when generating `Pattern`s for the ports + given by `keys`. + + Args: + tool: The new `Tool` to use for the given ports. + keys: Which ports the tool should apply to. `None` indicates the default tool, + used when there is no matching entry in `self.tools` for the port in question. + + Returns: + self + """ + if keys is None or isinstance(keys, str): + self.tools[keys] = tool + else: + for key in keys: + self.tools[key] = tool + return self + + @contextmanager + def toolctx( + self, + tool: Tool, + keys: str | Sequence[str | None] | None = None, + ) -> Iterator[Self]: + """ + Context manager for temporarily `retool`-ing and reverting the `retool` + upon exiting the context. + + Args: + tool: The new `Tool` to use for the given ports. + keys: Which ports the tool should apply to. `None` indicates the default tool, + used when there is no matching entry in `self.tools` for the port in question. 
+ + Returns: + self + """ + if keys is None or isinstance(keys, str): + keys = [keys] + saved_tools = {kk: self.tools.get(kk, None) for kk in keys} # If not in self.tools, save `None` + try: + yield self.retool(tool=tool, keys=keys) + finally: + for kk, tt in saved_tools.items(): + if tt is None: + # delete if present + self.tools.pop(kk, None) + else: + self.tools[kk] = tt + + def path_to( + self, + portspec: str, + ccw: SupportsBool | None, + position: float | None = None, + *, + x: float | None = None, + y: float | None = None, + plug_into: str | None = None, + **kwargs, + ) -> Self: + """ + Build a "wire"/"waveguide" extending from the port `portspec`, with the aim + of ending exactly at a target position. + + The wire will travel so that the output port will be placed at exactly the target + position along the input port's axis. There can be an unspecified (tool-dependent) + offset in the perpendicular direction. The output port will be rotated (or not) + based on the `ccw` parameter. + + If using `RenderPather`, `RenderPather.render` must be called after all paths have been fully planned. + + Args: + portspec: The name of the port into which the wire will be plugged. + ccw: If `None`, the output should be along the same axis as the input. + Otherwise, cast to bool and turn counterclockwise if True + and clockwise otherwise. + position: The final port position, along the input's axis only. + (There may be a tool-dependent offset along the other axis.) + Only one of `position`, `x`, and `y` may be specified. + x: The final port position along the x axis. + `portspec` must refer to a horizontal port if `x` is passed, otherwise a + BuildError will be raised. + y: The final port position along the y axis. + `portspec` must refer to a vertical port if `y` is passed, otherwise a + BuildError will be raised. + plug_into: If not None, attempts to plug the wire's output port into the provided + port on `self`. 
+ + Returns: + self + + Raises: + BuildError if `position`, `x`, or `y` is too close to fit the bend (if a bend + is present). + BuildError if `x` or `y` is specified but does not match the axis of `portspec`. + BuildError if more than one of `x`, `y`, and `position` is specified. + """ + if self._dead: + logger.error('Skipping path_to() since device is dead') + return self + + pos_count = sum(vv is not None for vv in (position, x, y)) + if pos_count > 1: + raise BuildError('Only one of `position`, `x`, and `y` may be specified at once') + if pos_count < 1: + raise BuildError('One of `position`, `x`, and `y` must be specified') + + port = self.pattern[portspec] + if port.rotation is None: + raise PortError(f'Port {portspec} has no rotation and cannot be used for path_to()') + + if not numpy.isclose(port.rotation % (pi / 2), 0): + raise BuildError('path_to was asked to route from non-manhattan port') + + is_horizontal = numpy.isclose(port.rotation % pi, 0) + if is_horizontal: + if y is not None: + raise BuildError('Asked to path to y-coordinate, but port is horizontal') + if position is None: + position = x + else: + if x is not None: + raise BuildError('Asked to path to x-coordinate, but port is vertical') + if position is None: + position = y + + x0, y0 = port.offset + if is_horizontal: + if numpy.sign(numpy.cos(port.rotation)) == numpy.sign(position - x0): + raise BuildError(f'path_to routing to behind source port: x0={x0:g} to {position:g}') + length = numpy.abs(position - x0) + else: + if numpy.sign(numpy.sin(port.rotation)) == numpy.sign(position - y0): + raise BuildError(f'path_to routing to behind source port: y0={y0:g} to {position:g}') + length = numpy.abs(position - y0) + + return self.path( + portspec, + ccw, + length, + plug_into = plug_into, + **kwargs, + ) + + def path_into( + self, + portspec_src: str, + portspec_dst: str, + *, + out_ptype: str | None = None, + plug_destination: bool = True, + thru: str | None = None, + **kwargs, + ) -> Self: + """ + 
Create a "wire"/"waveguide" traveling between the ports `portspec_src` and + `portspec_dst`, and `plug` it into both (or just the source port). + + Only unambiguous scenarios are allowed: + - Straight connector between facing ports + - Single 90 degree bend + - Jog between facing ports + (jog is done as late as possible, i.e. only 2 L-shaped segments are used) + + By default, the destination's `ptype` will be used as the `out_ptype` for the + wire, and the `portspec_dst` will be plugged (i.e. removed). + + If using `RenderPather`, `RenderPather.render` must be called after all paths have been fully planned. + + Args: + portspec_src: The name of the starting port into which the wire will be plugged. + portspec_dst: The name of the destination port. + out_ptype: Passed to the pathing tool in order to specify the desired port type + to be generated at the destination end. If `None` (default), the destination + port's `ptype` will be used. + plug_destination: If `True` (default), the wire's output port is plugged into + `portspec_dst` (consuming both ports). If `False`, only the source port is plugged. + thru: If not `None`, the port by this name will be renamed to `portspec_src`. + This can be used when routing a signal through a pre-placed 2-port device. + + Returns: + self + + Raises: + PortError if either port does not have a specified rotation.
+ BuildError if an invalid port configuration is encountered: + - Non-manhattan ports + - U-bend + - Destination too close to (or behind) source + """ + if self._dead: + logger.error('Skipping path_into() since device is dead') + return self + + port_src = self.pattern[portspec_src] + port_dst = self.pattern[portspec_dst] + + if out_ptype is None: + out_ptype = port_dst.ptype + + if port_src.rotation is None: + raise PortError(f'Port {portspec_src} has no rotation and cannot be used for path_into()') + if port_dst.rotation is None: + raise PortError(f'Port {portspec_dst} has no rotation and cannot be used for path_into()') + + if not numpy.isclose(port_src.rotation % (pi / 2), 0): + raise BuildError('path_into was asked to route from non-manhattan port') + if not numpy.isclose(port_dst.rotation % (pi / 2), 0): + raise BuildError('path_into was asked to route to non-manhattan port') + + src_is_horizontal = numpy.isclose(port_src.rotation % pi, 0) + dst_is_horizontal = numpy.isclose(port_dst.rotation % pi, 0) + xs, ys = port_src.offset + xd, yd = port_dst.offset + + angle = (port_dst.rotation - port_src.rotation) % (2 * pi) + + dst_extra_args = {'out_ptype': out_ptype} + if plug_destination: + dst_extra_args['plug_into'] = portspec_dst + + src_args = {**kwargs} + dst_args = {**src_args, **dst_extra_args} + if src_is_horizontal and not dst_is_horizontal: + # single bend should suffice + self.path_to(portspec_src, angle > pi, x=xd, **src_args) + self.path_to(portspec_src, None, y=yd, **dst_args) + elif dst_is_horizontal and not src_is_horizontal: + # single bend should suffice + self.path_to(portspec_src, angle > pi, y=yd, **src_args) + self.path_to(portspec_src, None, x=xd, **dst_args) + elif numpy.isclose(angle, pi): + if src_is_horizontal and ys == yd: + # straight connector + self.path_to(portspec_src, None, x=xd, **dst_args) + elif not src_is_horizontal and xs == xd: + # straight connector + self.path_to(portspec_src, None, y=yd, **dst_args) + else: + # S-bend, delegate
to implementations + (travel, jog), _ = port_src.measure_travel(port_dst) + self.pathS(portspec_src, -travel, -jog, **dst_args) + elif numpy.isclose(angle, 0): + raise BuildError('Don\'t know how to route a U-bend yet (TODO)!') + else: + raise BuildError(f'Don\'t know how to route ports with relative angle {angle}') + + if thru is not None: + self.rename_ports({thru: portspec_src}) + + return self + + def mpath( + self, + portspec: str | Sequence[str], + ccw: SupportsBool | None, + *, + spacing: float | ArrayLike | None = None, + set_rotation: float | None = None, + **kwargs, + ) -> Self: + """ + `mpath` is a superset of `path` and `path_to` which can act on bundles or buses + of "wires" or "waveguides". + + The wires will travel so that the output ports will be placed at well-defined + locations along the axis of their input ports, but may have arbitrary (tool- + dependent) offsets in the perpendicular direction. + + If `ccw` is not `None`, the wire bundle will turn 90 degrees in either the + clockwise (`ccw=False`) or counter-clockwise (`ccw=True`) direction. Within the + bundle, the center-to-center wire spacings after the turn are set by `spacing`, + which is required when `ccw` is not `None`. The final position of the bundle as a + whole can be set in a number of ways: + + =A>---------------------------V turn direction: `ccw=False` + =B>-------------V | + =C>-----------------------V | + =D=>----------------V | + | + + x---x---x---x `spacing` (can be scalar or array) + + <--------------> `emin=` + <------> `bound_type='min_past_furthest', bound=` + <--------------------------------> `emax=` + x `pmin=` + x `pmax=` + + - `emin=`, equivalent to `bound_type='min_extension', bound=`: + The total extension value for the furthest-out port (B in the diagram). + - `emax=`, equivalent to `bound_type='max_extension', bound=`: + The total extension value for the closest-in port (C in the diagram).
+ - `pmin=`, equivalent to `xmin=`, `ymin=`, or `bound_type='min_position', bound=`: + The coordinate of the innermost bend (D's bend). + The x/y versions throw an error if they do not match the port axis (for debug) + - `pmax=`, `xmax=`, `ymax=`, or `bound_type='max_position', bound=`: + The coordinate of the outermost bend (A's bend). + The x/y versions throw an error if they do not match the port axis (for debug) + - `bound_type='min_past_furthest', bound=`: + The distance between the furthest out-port (B) and the innermost bend (D's bend). + + If `ccw=None`, final output positions (along the input axis) of all wires will be + identical (i.e. wires will all be cut off evenly). In this case, `spacing=None` is + required. `emin=` and `emax=` are then equivalent to each other, and `pmin=`, + `pmax=`, `xmin=`, etc. are likewise all equivalent to each other. + + If using `RenderPather`, `RenderPather.render` must be called after all paths have been fully planned. + + Args: + portspec: The names of the ports which are to be routed. + ccw: If `None`, the outputs should be along the same axis as the inputs. + Otherwise, cast to bool and turn 90 degrees counterclockwise if `True` + and clockwise otherwise. + spacing: Center-to-center distance between output ports along the input port's axis. + Must be provided if (and only if) `ccw` is not `None`. + set_rotation: If the provided ports have `rotation=None`, this can be used + to set a rotation for them. + + Returns: + self + + Raises: + BuildError if the implied length for any wire is too close to fit the bend + (if a bend is requested). + BuildError if `xmin`/`xmax` or `ymin`/`ymax` is specified but does not + match the axis of `portspec`. + BuildError if an incorrect bound type or spacing is specified.
+ """ + if self._dead: + logger.error('Skipping mpath() since device is dead') + return self + + bound_types = set() + if 'bound_type' in kwargs: + bound_types.add(kwargs.pop('bound_type')) + bound = kwargs.pop('bound') + for bt in ('emin', 'emax', 'pmin', 'pmax', 'xmin', 'xmax', 'ymin', 'ymax', 'min_past_furthest'): + if bt in kwargs: + bound_types.add(bt) + bound = kwargs.pop(bt) + + if not bound_types: + raise BuildError('No bound type specified for mpath') + if len(bound_types) > 1: + raise BuildError(f'Too many bound types specified for mpath: {bound_types}') + bound_type = tuple(bound_types)[0] + + if isinstance(portspec, str): + portspec = [portspec] + ports = self.pattern[tuple(portspec)] + + extensions = ell(ports, ccw, spacing=spacing, bound=bound, bound_type=bound_type, set_rotation=set_rotation) + + #if container: + # assert not getattr(self, 'render'), 'Containers not implemented for RenderPather' + # bld = self.interface(source=ports, library=self.library, tools=self.tools) + # for port_name, length in extensions.items(): + # bld.path(port_name, ccw, length, **kwargs) + # self.library[container] = bld.pattern + # self.plug(Abstract(container, bld.pattern.ports), {sp: 'in_' + sp for sp in ports}) # TODO safe to use 'in_'? + #else: + for port_name, length in extensions.items(): + self.path(port_name, ccw, length, **kwargs) + return self + + # TODO def bus_join()? + + def flatten(self) -> Self: + """ + Flatten the contained pattern, using the contained library to resolve references. + + Returns: + self + """ + self.pattern.flatten(self.library) + return self + + def at(self, portspec: str | Iterable[str]) -> 'PortPather': + return PortPather(portspec, self) + + +class PortPather: + """ + Port state manager + + This class provides a convenient way to perform multiple pathing operations on a + set of ports without needing to repeatedly pass their names. 
+ """ + ports: list[str] + pather: PatherMixin + + def __init__(self, ports: str | Iterable[str], pather: PatherMixin) -> None: + self.ports = [ports] if isinstance(ports, str) else list(ports) + self.pather = pather + + # + # Delegate to pather + # + def retool(self, tool: Tool) -> Self: + self.pather.retool(tool, keys=self.ports) + return self + + @contextmanager + def toolctx(self, tool: Tool) -> Iterator[Self]: + with self.pather.toolctx(tool, keys=self.ports): + yield self + + def path(self, *args, **kwargs) -> Self: + if len(self.ports) > 1: + logger.warning('Use path_each() when pathing multiple ports independently') + for port in self.ports: + self.pather.path(port, *args, **kwargs) + return self + + def path_each(self, *args, **kwargs) -> Self: + for port in self.ports: + self.pather.path(port, *args, **kwargs) + return self + + def pathS(self, *args, **kwargs) -> Self: + if len(self.ports) > 1: + logger.warning('Use pathS_each() when pathing multiple ports independently') + for port in self.ports: + self.pather.pathS(port, *args, **kwargs) + return self + + def pathS_each(self, *args, **kwargs) -> Self: + for port in self.ports: + self.pather.pathS(port, *args, **kwargs) + return self + + def path_to(self, *args, **kwargs) -> Self: + if len(self.ports) > 1: + logger.warning('Use path_each_to() when pathing multiple ports independently') + for port in self.ports: + self.pather.path_to(port, *args, **kwargs) + return self + + def path_each_to(self, *args, **kwargs) -> Self: + for port in self.ports: + self.pather.path_to(port, *args, **kwargs) + return self + + def mpath(self, *args, **kwargs) -> Self: + self.pather.mpath(self.ports, *args, **kwargs) + return self + + def path_into(self, *args, **kwargs) -> Self: + """ Path_into, using the current port as the source """ + if len(self.ports) > 1: + raise BuildError(f'Unable use implicit path_into() with {len(self.ports)} (>1) ports.') + self.pather.path_into(self.ports[0], *args, **kwargs) + return self + + 
def path_from(self, *args, **kwargs) -> Self: + """ `path_into()`, using the current port as the destination """ + if len(self.ports) > 1: + raise BuildError(f'Unable to use implicit path_from() with {len(self.ports)} (>1) ports.') + thru = kwargs.pop('thru', None) + self.pather.path_into(args[0], self.ports[0], *args[1:], **kwargs) + if thru is not None: + self.rename_from(thru) + return self + + def plug( + self, + other: Abstract | str, + other_port: str, + *args, + **kwargs, + ) -> Self: + if len(self.ports) > 1: + raise BuildError(f'Unable to use implicit plug() with {len(self.ports)} ports.' + ' Use the pather or pattern directly to plug multiple ports.') + self.pather.plug(other, {self.ports[0]: other_port}, *args, **kwargs) + return self + + def plugged(self, other_port: str) -> Self: + if len(self.ports) > 1: + raise BuildError(f'Unable to use implicit plugged() with {len(self.ports)} (>1) ports.') + self.pather.plugged({self.ports[0]: other_port}) + return self + + # + # Delegate to port + # + def set_ptype(self, ptype: str) -> Self: + for port in self.ports: + self.pather[port].set_ptype(ptype) + return self + + def translate(self, *args, **kwargs) -> Self: + for port in self.ports: + self.pather[port].translate(*args, **kwargs) + return self + + def mirror(self, *args, **kwargs) -> Self: + for port in self.ports: + self.pather[port].mirror(*args, **kwargs) + return self + + def rotate(self, rotation: float) -> Self: + for port in self.ports: + self.pather[port].rotate(rotation) + return self + + def set_rotation(self, rotation: float | None) -> Self: + for port in self.ports: + self.pather[port].set_rotation(rotation) + return self + + def rename_to(self, new_name: str) -> Self: + if len(self.ports) > 1: + raise BuildError('Use rename_ports() for >1 port') + self.pather.rename_ports({self.ports[0]: new_name}) + self.ports[0] = new_name + return self + + def rename_from(self, old_name: str) -> Self: + if len(self.ports) > 1: + raise BuildError('Use rename_ports() for >1 port') +
self.pather.rename_ports({old_name: self.ports[0]}) + return self + + def rename_ports(self, name_map: dict[str, str | None]) -> Self: + self.pather.rename_ports(name_map) + self.ports = [mm for mm in [name_map.get(pp, pp) for pp in self.ports] if mm is not None] + return self + + def add_ports(self, ports: Iterable[str]) -> Self: + ports = list(ports) + conflicts = set(ports) & set(self.ports) + if conflicts: + raise BuildError(f'ports {conflicts} already selected') + self.ports += ports + return self + + def add_port(self, port: str, index: int | None = None) -> Self: + if port in self.ports: + raise BuildError(f'{port=} already selected') + if index is not None: + self.ports.insert(index, port) + else: + self.ports.append(port) + return self + + def drop_port(self, port: str) -> Self: + if port not in self.ports: + raise BuildError(f'{port=} is not currently selected') + self.ports = [pp for pp in self.ports if pp != port] + return self + + def into_copy(self, new_name: str, src: str | None = None) -> Self: + """ Copy a port and follow the copy (the selection switches to `new_name`) """ + if not self.ports: + raise BuildError('Have no ports to copy') + if len(self.ports) == 1: + src = self.ports[0] + elif src is None: + raise BuildError('Must specify src when >1 port is available') + if src not in self.ports: + raise BuildError(f'{src=} not available') + self.pather.ports[new_name] = self.pather[src].copy() + self.ports = [(new_name if pp == src else pp) for pp in self.ports] + return self + + def save_copy(self, new_name: str, src: str | None = None) -> Self: + """ Copy a port but keep using the original """ + if not self.ports: + raise BuildError('Have no ports to copy') + if len(self.ports) == 1: + src = self.ports[0] + elif src is None: + raise BuildError('Must specify src when >1 port is available') + if src not in self.ports: + raise BuildError(f'{src=} not available') + self.pather.ports[new_name] = self.pather[src].copy() + return self + + @overload + def delete(self, name: None) ->
None: ... + + @overload + def delete(self, name: str) -> Self: ... + + def delete(self, name: str | None = None) -> Self | None: + if name is None: + for pp in self.ports: + del self.pather.ports[pp] + return None + del self.pather.ports[name] + self.ports = [pp for pp in self.ports if pp != name] + return self + diff --git a/masque/builder/renderpather.py b/masque/builder/renderpather.py new file mode 100644 index 0000000..303a59d --- /dev/null +++ b/masque/builder/renderpather.py @@ -0,0 +1,646 @@ +""" +Pather with batched (multi-step) rendering +""" +from typing import Self +from collections.abc import Sequence, Mapping, MutableMapping, Iterable +import copy +import logging +from collections import defaultdict +from functools import wraps +from pprint import pformat + +from numpy import pi +from numpy.typing import ArrayLike + +from ..pattern import Pattern +from ..library import ILibrary, TreeView +from ..error import BuildError +from ..ports import PortList, Port +from ..abstract import Abstract +from ..utils import SupportsBool +from .tools import Tool, RenderStep +from .pather_mixin import PatherMixin + + +logger = logging.getLogger(__name__) + + +class RenderPather(PatherMixin): + """ + `RenderPather` is an alternative to `Pather` which uses the `path`/`path_to`/`mpath` + functions to plan out wire paths without incrementally generating the layout. Instead, + it waits until `render` is called, at which point it draws all the planned segments + simultaneously. This allows it to e.g. draw each wire using a single `Path` or + `Polygon` shape instead of multiple rectangles. + + `RenderPather` calls out to `Tool.planL` and `Tool.render` to provide tool-specific + dimensions and build the final geometry for each wire. `Tool.planL` provides the + output port data (relative to the input) for each segment. The tool, input and output + ports are placed into a `RenderStep`, and a sequence of `RenderStep`s is stored for + each port. 
When `render` is called, it bundles `RenderStep`s into batches which use + the same `Tool`, and passes each batch to the relevant tool's `Tool.render` to build + the geometry. + + See `Pather` for routing examples. After routing is complete, `render` must be called + to generate the final geometry. + """ + __slots__ = ('pattern', 'library', 'paths', 'tools', '_dead', ) + + pattern: Pattern + """ Layout of this device """ + + library: ILibrary + """ Library from which patterns should be referenced """ + + _dead: bool + """ If True, plug()/place() are skipped (for debugging) """ + + paths: defaultdict[str, list[RenderStep]] + """ Per-port list of operations, to be used by `render` """ + + tools: dict[str | None, Tool] + """ + Tool objects are used to dynamically generate new single-use Devices + (e.g. wires or waveguides) to be plugged into this device. + """ + + @property + def ports(self) -> dict[str, Port]: + return self.pattern.ports + + @ports.setter + def ports(self, value: dict[str, Port]) -> None: + self.pattern.ports = value + + def __init__( + self, + library: ILibrary, + *, + pattern: Pattern | None = None, + ports: str | Mapping[str, Port] | None = None, + tools: Tool | MutableMapping[str | None, Tool] | None = None, + name: str | None = None, + ) -> None: + """ + Args: + library: The library from which referenced patterns will be taken, + and where new patterns (e.g. generated by the `tools`) will be placed. + pattern: The pattern which will be modified by subsequent operations. + If `None` (default), a new pattern is created. + ports: Allows specifying the initial set of ports, if `pattern` does + not already have any ports (or is not provided). May be a string, + in which case it is interpreted as a name in `library`. + Default `None` (no ports). + tools: A mapping of {port: tool} which specifies what `Tool` should be used + to generate waveguide or wire segments when `path`/`path_to`/`mpath` + are called.
Relies on `Tool.planL` and `Tool.render` implementations. + name: If specified, `library[name]` is set to `self.pattern`. + """ + self._dead = False + self.paths = defaultdict(list) + self.library = library + if pattern is not None: + self.pattern = pattern + else: + self.pattern = Pattern() + + if ports is not None: + if self.pattern.ports: + raise BuildError('Ports supplied for pattern with pre-existing ports!') + if isinstance(ports, str): + ports = library.abstract(ports).ports + + self.pattern.ports.update(copy.deepcopy(dict(ports))) + + if name is not None: + library[name] = self.pattern + + if tools is None: + self.tools = {} + elif isinstance(tools, Tool): + self.tools = {None: tools} + else: + self.tools = dict(tools) + + @classmethod + def interface( + cls: type['RenderPather'], + source: PortList | Mapping[str, Port] | str, + *, + library: ILibrary | None = None, + tools: Tool | MutableMapping[str | None, Tool] | None = None, + in_prefix: str = 'in_', + out_prefix: str = '', + port_map: dict[str, str] | Sequence[str] | None = None, + name: str | None = None, + ) -> 'RenderPather': + """ + Wrapper for `Pattern.interface()`, which returns a RenderPather instead. + + Args: + source: A collection of ports (e.g. Pattern, Builder, or dict) + from which to create the interface. May be a pattern name if + `library` is provided. + library: Library from which existing patterns should be referenced, + and to which the new one should be added (if named). If not provided, + `source.library` must exist and will be used. + tools: `Tool`s which will be used by the pather for generating new wires + or waveguides (via `path`/`path_to`/`mpath`). + in_prefix: Prepended to port names for newly-created ports with + reversed directions compared to the current device. + out_prefix: Prepended to port names for ports which are directly + copied from the current device. + port_map: Specification for ports to copy into the new device: + - If `None`, all ports are copied. 
+ - If a sequence, only the listed ports are copied. + - If a mapping, the listed ports (keys) are copied and + renamed (to the values). + + Returns: + The new `RenderPather`, with an empty pattern and 2x as many ports as + listed in port_map. + + Raises: + `PortError` if `port_map` contains port names not present in the + current device. + `PortError` if applying the prefixes results in duplicate port + names. + """ + if library is None: + if hasattr(source, 'library') and isinstance(source.library, ILibrary): + library = source.library + else: + raise BuildError('No library provided (and not present in `source.library`)') + + if tools is None and hasattr(source, 'tools') and isinstance(source.tools, dict): + tools = source.tools + + if isinstance(source, str): + source = library.abstract(source).ports + + pat = Pattern.interface(source, in_prefix=in_prefix, out_prefix=out_prefix, port_map=port_map) + new = RenderPather(library=library, pattern=pat, name=name, tools=tools) + return new + + def __repr__(self) -> str: + s = f'<RenderPather {self.pattern} >' + return s + + def plug( + self, + other: Abstract | str | Pattern | TreeView, + map_in: dict[str, str], + map_out: dict[str, str | None] | None = None, + *, + mirrored: bool = False, + thru: bool | str = True, + set_rotation: bool | None = None, + append: bool = False, + ok_connections: Iterable[tuple[str, str]] = (), + ) -> Self: + """ + Wrapper for `Pattern.plug` which adds a `RenderStep` with opcode 'P' + for any affected ports. This separates any future `RenderStep`s on the + same port into a new batch, since the plugged device interferes with drawing. + + Args: + other: An `Abstract`, string, or `Pattern` describing the device to be instantiated. + map_in: dict of `{'self_port': 'other_port'}` mappings, specifying + port connections between the two devices. + map_out: dict of `{'old_name': 'new_name'}` mappings, specifying + new names for ports in `other`.
+ mirrored: Enables mirroring `other` across the x axis prior to + connecting any ports. + thru: If map_in specifies only a single port, `thru` provides a mechanism + to avoid repeating the port name. E.g., for `map_in={'myport': 'A'}`, + - If True (default), and `other` has only two ports total, and map_out + doesn't specify a name for the other port, its name is set to the key + in `map_in`, i.e. 'myport'. + - If a string, `map_out[thru]` is set to the key in `map_in` (i.e. 'myport'). + An error is raised if that entry already exists. + + This makes it easy to extend a pattern with simple 2-port devices + (e.g. wires) without providing `map_out` each time `plug` is + called. See "Examples" above for more info. Default `True`. + set_rotation: If the necessary rotation cannot be determined from + the ports being connected (i.e. all pairs have at least one + port with `rotation=None`), `set_rotation` must be provided + to indicate how much `other` should be rotated. Otherwise, + `set_rotation` must remain `None`. + append: If `True`, `other` is appended instead of being referenced. + Note that this does not flatten `other`, so its refs will still + be refs (now inside `self`). + ok_connections: Set of "allowed" ptype combinations. Identical + ptypes are always allowed to connect, as is `'unk'` with + any other ptype. Non-allowed ptype connections will emit a + warning. Order is ignored, i.e. `(a, b)` is equivalent to + `(b, a)`. + + + Returns: + self + + Raises: + `PortError` if any ports specified in `map_in` or `map_out` do not + exist in `self.ports` or `other.ports`. + `PortError` if there are any duplicate names after `map_in` and `map_out` + are applied.
+ `PortError` if the specified port mapping is not achievable (the ports + do not line up) + """ + if self._dead: + logger.error('Skipping plug() since device is dead') + return self + + other_tgt: Pattern | Abstract + if isinstance(other, str): + other_tgt = self.library.abstract(other) + else: + other_tgt = other + if append and isinstance(other_tgt, Abstract): + other_tgt = self.library[other_tgt.name] + + # get rid of plugged ports + for kk in map_in: + if kk in self.paths: + self.paths[kk].append(RenderStep('P', None, self.ports[kk].copy(), self.ports[kk].copy(), None)) + + plugged = map_in.values() + for name, port in other_tgt.ports.items(): + if name in plugged: + continue + new_name = map_out.get(name, name) if map_out is not None else name + if new_name is not None and new_name in self.paths: + self.paths[new_name].append(RenderStep('P', None, port.copy(), port.copy(), None)) + + self.pattern.plug( + other = other_tgt, + map_in = map_in, + map_out = map_out, + mirrored = mirrored, + thru = thru, + set_rotation = set_rotation, + append = append, + ok_connections = ok_connections, + ) + + return self + + def place( + self, + other: Abstract | str, + *, + offset: ArrayLike = (0, 0), + rotation: float = 0, + pivot: ArrayLike = (0, 0), + mirrored: bool = False, + port_map: dict[str, str | None] | None = None, + skip_port_check: bool = False, + append: bool = False, + ) -> Self: + """ + Wrapper for `Pattern.place` which adds a `RenderStep` with opcode 'P' + for any affected ports. This separates any future `RenderStep`s on the + same port into a new batch, since the placed device interferes with drawing. + + Note that mirroring is applied before rotation; translation (`offset`) is applied last. + + Args: + other: An `Abstract` or `Pattern` describing the device to be instantiated. + offset: Offset at which to place the instance. Default (0, 0). + rotation: Rotation applied to the instance before placement. Default 0. + pivot: Rotation is applied around this pivot point (default (0, 0)).
+ Rotation is applied prior to translation (`offset`). + mirrored: Whether the instance should be mirrored across the x axis. + Mirroring is applied before translation and rotation. + port_map: dict of `{'old_name': 'new_name'}` mappings, specifying + new names for ports in the instantiated pattern. New names can be + `None`, which will delete those ports. + skip_port_check: Can be used to skip the internal call to `check_ports`, + in case it has already been performed elsewhere. + append: If `True`, `other` is appended instead of being referenced. + Note that this does not flatten `other`, so its refs will still + be refs (now inside `self`). + + Returns: + self + + Raises: + `PortError` if any ports specified in `port_map` do not + exist in `other.ports`. + `PortError` if there are any duplicate names after `port_map` + is applied. + """ + if self._dead: + logger.error('Skipping place() since device is dead') + return self + + other_tgt: Pattern | Abstract + if isinstance(other, str): + other_tgt = self.library.abstract(other) + else: + other_tgt = other + if append and isinstance(other_tgt, Abstract): + other_tgt = self.library[other_tgt.name] + + for name, port in other_tgt.ports.items(): + new_name = port_map.get(name, name) if port_map is not None else name + if new_name is not None and new_name in self.paths: + self.paths[new_name].append(RenderStep('P', None, port.copy(), port.copy(), None)) + + self.pattern.place( + other = other_tgt, + offset = offset, + rotation = rotation, + pivot = pivot, + mirrored = mirrored, + port_map = port_map, + skip_port_check = skip_port_check, + append = append, + ) + + return self + + def plugged( + self, + connections: dict[str, str], + ) -> Self: + for aa, bb in connections.items(): + porta = self.ports[aa] + portb = self.ports[bb] + self.paths[aa].append(RenderStep('P', None, porta.copy(), porta.copy(), None)) + self.paths[bb].append(RenderStep('P', None, portb.copy(), portb.copy(), None)) + PortList.plugged(self,
connections) + return self + + def path( + self, + portspec: str, + ccw: SupportsBool | None, + length: float, + *, + plug_into: str | None = None, + **kwargs, + ) -> Self: + """ + Plan a "wire"/"waveguide" extending from the port `portspec`, with the aim + of traveling exactly `length` distance. + + The wire will travel `length` distance along the port's axis, and an unspecified + (tool-dependent) distance in the perpendicular direction. The output port will + be rotated (or not) based on the `ccw` parameter. + + `RenderPather.render` must be called after all paths have been fully planned. + + Args: + portspec: The name of the port into which the wire will be plugged. + ccw: If `None`, the output should be along the same axis as the input. + Otherwise, cast to bool and turn counterclockwise if True + and clockwise otherwise. + length: The total distance from input to output, along the input's axis only. + (There may be a tool-dependent offset along the other axis.) + plug_into: If not None, attempts to plug the wire's output port into the provided + port on `self`. + + Returns: + self + + Raises: + BuildError if `length` is too small to fit the bend (if a bend is present). + LibraryError if no valid name could be picked for the pattern. + """ + if self._dead: + logger.error('Skipping path() since device is dead') + return self + + port = self.pattern[portspec] + in_ptype = port.ptype + port_rot = port.rotation + assert port_rot is not None # TODO allow manually setting rotation for RenderPather.path()?
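The two transform calls just below (`out_port.rotate_around((0, 0), pi + port_rot)` followed by `out_port.translate(port.offset)`) are what map the tool's locally-planned output port into global coordinates. A minimal standalone sketch of that arithmetic, using plain `(x, y)` tuples and a hypothetical `place_planned_port` helper instead of masque's `Port` class:

```python
import math

def place_planned_port(local_xy, local_rot, in_xy, in_rot):
    # Mirror of the two steps in path():
    #   out_port.rotate_around((0, 0), pi + port_rot)
    #   out_port.translate(port.offset)
    theta = math.pi + in_rot
    x, y = local_xy
    gx = x * math.cos(theta) - y * math.sin(theta)
    gy = x * math.sin(theta) + y * math.cos(theta)
    return (gx + in_xy[0], gy + in_xy[1]), (local_rot + theta) % (2 * math.pi)

# A tool-planned output port 5 units ahead of a local origin port,
# attached to an input port at (10, 0) with rotation 0, lands at
# approximately (5, 0) and faces rotation pi.
xy, rot = place_planned_port((5.0, 0.0), 0.0, (10.0, 0.0), 0.0)
```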
+ + tool = self.tools.get(portspec, self.tools[None]) + # ask the tool for bend size (fill missing dx or dy), check feasibility, and get out_ptype + out_port, data = tool.planL(ccw, length, in_ptype=in_ptype, **kwargs) + + # Update port + out_port.rotate_around((0, 0), pi + port_rot) + out_port.translate(port.offset) + + step = RenderStep('L', tool, port.copy(), out_port.copy(), data) + self.paths[portspec].append(step) + + self.pattern.ports[portspec] = out_port.copy() + + if plug_into is not None: + self.plugged({portspec: plug_into}) + + return self + + def pathS( + self, + portspec: str, + length: float, + jog: float, + *, + plug_into: str | None = None, + **kwargs, + ) -> Self: + """ + Plan an S-shaped "wire"/"waveguide" extending from the port `portspec`, with the aim + of traveling exactly `length` distance with an offset `jog` along the other axis (+ve jog is + left of direction of travel). + + The output port will have the same orientation as the source port (`portspec`). + + `RenderPather.render` must be called after all paths have been fully planned. + + This function attempts to use `tool.planS()`, but falls back to `tool.planL()` if the former + raises a NotImplementedError. + + Args: + portspec: The name of the port into which the wire will be plugged. + jog: Total manhattan distance perpendicular to the direction of travel. + Positive values are to the left of the direction of travel. + length: The total manhattan distance from input to output, along the input's axis only. + (There may be a tool-dependent offset along the other axis.) + plug_into: If not None, attempts to plug the wire's output port into the provided + port on `self`. + + Returns: + self + + Raises: + BuildError if `length` is too small to fit the s-bend (for nonzero jog). + LibraryError if no valid name could be picked for the pattern.
+ """ + if self._dead: + logger.error('Skipping pathS() since device is dead') + return self + + port = self.pattern[portspec] + in_ptype = port.ptype + port_rot = port.rotation + assert port_rot is not None # TODO allow manually setting rotation for RenderPather.path()? + + tool = self.tools.get(portspec, self.tools[None]) + + # check feasibility, get output port and data + try: + out_port, data = tool.planS(length, jog, in_ptype=in_ptype, **kwargs) + except NotImplementedError: + # Fall back to drawing two L-bends + ccw0 = jog > 0 + kwargs_no_out = (kwargs | {'out_ptype': None}) + t_port0, _ = tool.planL( ccw0, length / 2, in_ptype=in_ptype, **kwargs_no_out) + jog0 = Port((0, 0), 0).measure_travel(t_port0)[0][1] + t_port1, _ = tool.planL(not ccw0, abs(jog - jog0), in_ptype=t_port0.ptype, **kwargs) + jog1 = Port((0, 0), 0).measure_travel(t_port1)[0][1] + + kwargs_plug = kwargs | {'plug_into': plug_into} + self.path(portspec, ccw0, length - abs(jog1), **kwargs_no_out) + self.path(portspec, not ccw0, abs(jog - jog0), **kwargs_plug) + return self + + out_port.rotate_around((0, 0), pi + port_rot) + out_port.translate(port.offset) + step = RenderStep('S', tool, port.copy(), out_port.copy(), data) + self.paths[portspec].append(step) + self.pattern.ports[portspec] = out_port.copy() + + if plug_into is not None: + self.plugged({portspec: plug_into}) + return self + + + def render( + self, + append: bool = True, + ) -> Self: + """ + Generate the geometry which has been planned out with `path`/`path_to`/etc. + + Args: + append: If `True`, the rendered geometry will be directly appended to + `self.pattern`. Note that it will not be flattened, so if only one + layer of hierarchy is eliminated. 
+ + Returns: + self + """ + lib = self.library + tool_port_names = ('A', 'B') + pat = Pattern() + + def render_batch(portspec: str, batch: list[RenderStep], append: bool) -> None: + assert batch[0].tool is not None + name = lib << batch[0].tool.render(batch, port_names=tool_port_names) + pat.ports[portspec] = batch[0].start_port.copy() + if append: + pat.plug(lib[name], {portspec: tool_port_names[0]}, append=append) + del lib[name] # NOTE if the rendered pattern has refs, those are now in `pat` but not flattened + else: + pat.plug(lib.abstract(name), {portspec: tool_port_names[0]}, append=append) + + for portspec, steps in self.paths.items(): + batch: list[RenderStep] = [] + for step in steps: + appendable_op = step.opcode in ('L', 'S', 'U') + same_tool = batch and step.tool == batch[0].tool + + # If we can't continue a batch, render it + if batch and (not appendable_op or not same_tool): + render_batch(portspec, batch, append) + batch = [] + + # batch is emptied already if we couldn't continue it + if appendable_op: + batch.append(step) + + # Opcodes which break the batch go below this line + if not appendable_op and portspec in pat.ports: + del pat.ports[portspec] + + # If the last batch hasn't been rendered yet + if batch: + render_batch(portspec, batch, append) + + self.paths.clear() + pat.ports.clear() + self.pattern.append(pat) + + return self + + def translate(self, offset: ArrayLike) -> Self: + """ + Translate the pattern and all ports. + + Args: + offset: (x, y) distance to translate by + + Returns: + self + """ + self.pattern.translate_elements(offset) + return self + + def rotate_around(self, pivot: ArrayLike, angle: float) -> Self: + """ + Rotate the pattern and all ports. + + Args: + pivot: location to rotate around + angle: angle (radians, counterclockwise) to rotate by + + Returns: + self + """ + self.pattern.rotate_around(pivot, angle) + return self + + def mirror(self, axis: int) -> Self: + """ + Mirror the pattern and all ports across the specified axis.
+ + Args: + axis: Axis to mirror across (x=0, y=1) + + Returns: + self + """ + self.pattern.mirror(axis) + return self + + def set_dead(self) -> Self: + """ + Disallows further changes through `plug()` or `place()`. + This is meant for debugging: + ``` + dev.plug(a, ...) + dev.set_dead() # added for debug purposes + dev.plug(b, ...) # usually raises an error, but now skipped + dev.plug(c, ...) # also skipped + dev.pattern.visualize() # shows the device as of the set_dead() call + ``` + + Returns: + self + """ + self._dead = True + return self + + @wraps(Pattern.label) + def label(self, *args, **kwargs) -> Self: + self.pattern.label(*args, **kwargs) + return self + + @wraps(Pattern.ref) + def ref(self, *args, **kwargs) -> Self: + self.pattern.ref(*args, **kwargs) + return self + + @wraps(Pattern.polygon) + def polygon(self, *args, **kwargs) -> Self: + self.pattern.polygon(*args, **kwargs) + return self + + @wraps(Pattern.rect) + def rect(self, *args, **kwargs) -> Self: + self.pattern.rect(*args, **kwargs) + return self + diff --git a/masque/builder/tools.py b/masque/builder/tools.py index c17a7b9..6bd7547 100644 --- a/masque/builder/tools.py +++ b/masque/builder/tools.py @@ -1,12 +1,10 @@ """ Tools are objects which dynamically generate simple single-use devices (e.g. wires or waveguides) -Concrete tools may implement native planning/rendering for `L`, `S`, or `U` routes. -Any unimplemented planning method falls back to the corresponding `trace*()` method, -and `Pather` may further synthesize some routes from simpler primitives when needed. +# TODO document all tools """ -from typing import Literal, Any, Self, cast -from collections.abc import Sequence, Callable, Iterator +from typing import Literal, Any, Self +from collections.abc import Sequence, Callable from abc import ABCMeta # , abstractmethod # TODO any way to make Tool ok with implementing only one method? 
from dataclasses import dataclass @@ -25,8 +23,8 @@ from ..error import BuildError @dataclass(frozen=True, slots=True) class RenderStep: """ - Representation of a single saved operation, used by deferred `Pather` - instances and passed to `Tool.render()` when `Pather.render()` is called. + Representation of a single saved operation, used by `RenderPather` and passed + to `Tool.render()` when `RenderPather.render()` is called. """ opcode: Literal['L', 'S', 'U', 'P'] """ What operation is being performed. @@ -49,72 +47,16 @@ class RenderStep: if self.opcode != 'P' and self.tool is None: raise BuildError('Got tool=None but the opcode is not "P"') - def is_continuous_with(self, other: 'RenderStep') -> bool: - """ - Check if another RenderStep can be appended to this one. - """ - # Check continuity with tolerance - offsets_match = bool(numpy.allclose(other.start_port.offset, self.end_port.offset)) - rotations_match = (other.start_port.rotation is None and self.end_port.rotation is None) or ( - other.start_port.rotation is not None and self.end_port.rotation is not None and - bool(numpy.isclose(other.start_port.rotation, self.end_port.rotation)) - ) - return offsets_match and rotations_match - - def transformed(self, translation: NDArray[numpy.float64], rotation: float, pivot: NDArray[numpy.float64]) -> 'RenderStep': - """ - Return a new RenderStep with transformed start and end ports. - """ - new_start = self.start_port.copy() - new_end = self.end_port.copy() - - for pp in (new_start, new_end): - pp.rotate_around(pivot, rotation) - pp.translate(translation) - - return RenderStep( - opcode = self.opcode, - tool = self.tool, - start_port = new_start, - end_port = new_end, - data = self.data, - ) - - def mirrored(self, axis: int) -> 'RenderStep': - """ - Return a new RenderStep with mirrored start and end ports. 
- """ - new_start = self.start_port.copy() - new_end = self.end_port.copy() - - new_start.flip_across(axis=axis) - new_end.flip_across(axis=axis) - - return RenderStep( - opcode = self.opcode, - tool = self.tool, - start_port = new_start, - end_port = new_end, - data = self.data, - ) - - -def measure_tool_plan(tree: ILibrary, port_names: tuple[str, str]) -> tuple[Port, Any]: - """ - Extracts a Port and returns the tree (as data) for tool planning fallbacks. - """ - pat = tree.top_pattern() - in_p = pat[port_names[0]] - out_p = pat[port_names[1]] - (travel, jog), rot = in_p.measure_travel(out_p) - return Port((travel, jog), rotation=rot, ptype=out_p.ptype), tree - class Tool: """ Interface for path (e.g. wire or waveguide) generation. + + Note that subclasses may implement only a subset of the methods and leave others + unimplemented (e.g. in cases where they don't make sense or the required components + are impractical or unavailable). """ - def traceL( + def path( self, ccw: SupportsBool | None, length: float, @@ -128,7 +70,7 @@ class Tool: Create a wire or waveguide that travels exactly `length` distance along the axis of its input port. - Used by `Pather`. + Used by `Pather` and `RenderPather`. The output port must be exactly `length` away along the input port's axis, but may be placed an additional (unspecified) distance away along the perpendicular @@ -157,9 +99,9 @@ class Tool: Raises: BuildError if an impossible or unsupported geometry is requested. """ - raise NotImplementedError(f'traceL() not implemented for {type(self)}') + raise NotImplementedError(f'path() not implemented for {type(self)}') - def traceS( + def pathS( self, length: float, jog: float, @@ -174,7 +116,7 @@ class Tool: of its input port, and `jog` distance on the perpendicular axis. `jog` is positive when moving left of the direction of travel (from input to ouput port). - Used by `Pather`. + Used by `Pather` and `RenderPather`. The output port should be rotated to face the input port (i.e. 
plugging the device into a port will move that port but keep its orientation). @@ -199,7 +141,7 @@ Raises: BuildError if an impossible or unsupported geometry is requested. """ - raise NotImplementedError(f'traceS() not implemented for {type(self)}') + raise NotImplementedError(f'pathS() not implemented for {type(self)}') def planL( self, @@ -214,7 +156,7 @@ Plan a wire or waveguide that travels exactly `length` distance along the axis of its input port. - Used by `Pather` when `auto_render=False`. + Used by `RenderPather`. The output port must be exactly `length` away along the input port's axis, but may be placed an additional (unspecified) distance away along the perpendicular @@ -241,17 +183,7 @@ Raises: BuildError if an impossible or unsupported geometry is requested. """ - # Fallback implementation using traceL - port_names = kwargs.pop('port_names', ('A', 'B')) - tree = self.traceL( - ccw, - length, - in_ptype=in_ptype, - out_ptype=out_ptype, - port_names=port_names, - **kwargs, - ) - return measure_tool_plan(tree, port_names) + raise NotImplementedError(f'planL() not implemented for {type(self)}') def planS( self, @@ -266,7 +198,7 @@ Plan a wire or waveguide that travels exactly `length` distance along the axis of its input port and `jog` distance along the perpendicular axis (i.e. an S-bend). - Used by `Pather` when `auto_render=False`. + Used by `RenderPather`. The output port must have an orientation rotated by pi from the input port. @@ -276,8 +208,8 @@ Args: length: The total distance from input to output, along the input's axis only. jog: The total offset from the input to output, along the perpendicular axis. - A positive number implies a leftward shift (i.e. counterclockwise bend followed - by a clockwise bend) + A positive number implies a leftwards shift (i.e. counterclockwise bend followed
+ by a clockwise bend) in_ptype: The `ptype` of the port into which this wire's input will be `plug`ged. out_ptype: The `ptype` of the port into which this wire's output will be `plug`ged. kwargs: Custom tool-specific parameters. @@ -289,58 +221,7 @@ Raises: BuildError if an impossible or unsupported geometry is requested. """ - # Fallback implementation using traceS - port_names = kwargs.pop('port_names', ('A', 'B')) - tree = self.traceS( - length, - jog, - in_ptype=in_ptype, - out_ptype=out_ptype, - port_names=port_names, - **kwargs, - ) - return measure_tool_plan(tree, port_names) - - def traceU( - self, - jog: float, - *, - length: float = 0, - in_ptype: str | None = None, - out_ptype: str | None = None, - port_names: tuple[str, str] = ('A', 'B'), - **kwargs, - ) -> Library: - """ - Create a wire or waveguide that travels exactly `jog` distance along the axis - perpendicular to its input port (i.e. a U-bend). - - Used by `Pather`. Tools may leave this unimplemented if they - do not support a native U-bend primitive. - - The output port must have an orientation identical to the input port. - - The input and output ports should be compatible with `in_ptype` and - `out_ptype`, respectively. They should also be named `port_names[0]` and - `port_names[1]`, respectively. - - Args: - jog: The total offset from the input to output, along the perpendicular axis. - A positive number implies a leftwards shift (i.e. counterclockwise bend - followed by a clockwise bend) - in_ptype: The `ptype` of the port into which this wire's input will be `plug`ged. - out_ptype: The `ptype` of the port into which this wire's output will be `plug`ged. - port_names: The output pattern will have its input port named `port_names[0]` and - its output named `port_names[1]`. - kwargs: Custom tool-specific parameters.
- - Returns: - A pattern tree containing the requested U-shaped wire or waveguide - - Raises: - BuildError if an impossible or unsupported geometry is requested. - """ - raise NotImplementedError(f'traceU() not implemented for {type(self)}') + raise NotImplementedError(f'planS() not implemented for {type(self)}') def planU( self, @@ -351,12 +232,12 @@ class Tool: **kwargs, ) -> tuple[Port, Any]: """ + # NOTE: TODO: U-bend is WIP; this interface may change in the future. + Plan a wire or waveguide that travels exactly `jog` distance along the axis perpendicular to its input port (i.e. a U-bend). - Used by `Pather` when `auto_render=False`. This is an optional native-planning hook: tools may - implement it when they can represent a U-turn directly, otherwise they may rely - on `traceU()` or let `Pather` synthesize the route from simpler primitives. + Used by `RenderPather`. The output port must have an orientation identical to the input port. @@ -365,12 +246,11 @@ class Tool: Args: jog: The total offset from the input to output, along the perpendicular axis. - A positive number implies a leftwards shift (i.e. counterclockwise_bend + A positive number implies a leftwards shift (i.e. counterclockwise bend followed by a clockwise bend) in_ptype: The `ptype` of the port into which this wire's input will be `plug`ged. out_ptype: The `ptype` of the port into which this wire's output will be `plug`ged. - kwargs: Custom tool-specific parameters. `length` may be supplied here to - request a U-turn whose final port is displaced along both axes. + kwargs: Custom tool-specific parameters. Returns: The calculated output `Port` for the wire, assuming an input port at (0, 0) with rotation 0. @@ -379,26 +259,14 @@ class Tool: Raises: BuildError if an impossible or unsupported geometry is requested. 
""" - # Fallback implementation using traceU - kwargs = dict(kwargs) - length = kwargs.pop('length', 0) - port_names = kwargs.pop('port_names', ('A', 'B')) - tree = self.traceU( - jog, - length=length, - in_ptype=in_ptype, - out_ptype=out_ptype, - port_names=port_names, - **kwargs, - ) - return measure_tool_plan(tree, port_names) + raise NotImplementedError(f'planU() not implemented for {type(self)}') def render( self, batch: Sequence[RenderStep], *, - port_names: tuple[str, str] = ('A', 'B'), - **kwargs, + port_names: tuple[str, str] = ('A', 'B'), # noqa: ARG002 (unused) + **kwargs, # noqa: ARG002 (unused) ) -> ILibrary: """ Render the provided `batch` of `RenderStep`s into geometry, returning a tree @@ -412,50 +280,7 @@ class Tool: kwargs: Custom tool-specific parameters. """ assert not batch or batch[0].tool == self - # Fallback: render each step individually - lib, pat = Library.mktree(SINGLE_USE_PREFIX + 'batch') - pat.add_port_pair(names=port_names, ptype=batch[0].start_port.ptype if batch else 'unk') - - for step in batch: - if step.opcode == 'L': - if isinstance(step.data, ILibrary): - seg_tree = step.data - else: - # extract parameters from kwargs or data - seg_tree = self.traceL( - ccw=step.data.get('ccw') if isinstance(step.data, dict) else None, - length=float(step.data.get('length', 0)) if isinstance(step.data, dict) else 0.0, - port_names=port_names, - **kwargs, - ) - elif step.opcode == 'S': - if isinstance(step.data, ILibrary): - seg_tree = step.data - else: - seg_tree = self.traceS( - length=float(step.data.get('length', 0)) if isinstance(step.data, dict) else 0.0, - jog=float(step.data.get('jog', 0)) if isinstance(step.data, dict) else 0.0, - port_names=port_names, - **kwargs, - ) - elif step.opcode == 'U': - if isinstance(step.data, ILibrary): - seg_tree = step.data - else: - seg_tree = self.traceU( - jog=float(step.data.get('jog', 0)) if isinstance(step.data, dict) else 0.0, - length=float(step.data.get('length', 0)) if isinstance(step.data, 
dict) else 0.0, - port_names=port_names, - **kwargs, - ) - else: - continue - - seg_name = lib << seg_tree - pat.plug(lib[seg_name], {port_names[1]: port_names[0]}, append=True) - del lib[seg_name] - - return lib + raise NotImplementedError(f'render() not implemented for {type(self)}') abstract_tuple_t = tuple[Abstract, str, str] @@ -565,7 +390,7 @@ class SimpleTool(Tool, metaclass=ABCMeta): pat.plug(bend, {port_names[1]: inport}, mirrored=mirrored) return tree - def traceL( + def path( self, ccw: SupportsBool | None, length: float, @@ -582,7 +407,7 @@ class SimpleTool(Tool, metaclass=ABCMeta): out_ptype = out_ptype, ) - tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'traceL') + tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path') pat.add_port_pair(names=port_names, ptype='unk' if in_ptype is None else in_ptype) self._renderL(data=data, tree=tree, port_names=port_names, straight_kwargs=kwargs) return tree @@ -595,7 +420,7 @@ class SimpleTool(Tool, metaclass=ABCMeta): **kwargs, ) -> ILibrary: - tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'traceL') + tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path') pat.add_port_pair(names=(port_names[0], port_names[1])) for step in batch: @@ -672,19 +497,6 @@ class AutoTool(Tool, metaclass=ABCMeta): def reversed(self) -> Self: return type(self)(self.abstract, self.our_port_name, self.their_port_name) - @dataclass(frozen=True, slots=True) - class LPlan: - """ Template for an L-path configuration """ - straight: 'AutoTool.Straight' - bend: 'AutoTool.Bend | None' - in_trans: 'AutoTool.Transition | None' - b_trans: 'AutoTool.Transition | None' - out_trans: 'AutoTool.Transition | None' - overhead_x: float - overhead_y: float - bend_angle: float - out_ptype: str - @dataclass(frozen=True, slots=True) class LData: """ Data for planL """ @@ -697,65 +509,6 @@ class AutoTool(Tool, metaclass=ABCMeta): b_transition: 'AutoTool.Transition | None' out_transition: 'AutoTool.Transition | None' - def _iter_l_plans( - self, - ccw: 
SupportsBool | None, - in_ptype: str | None, - out_ptype: str | None, - ) -> Iterator[LPlan]: - """ - Iterate over all possible combinations of straights and bends that - could form an L-path. - """ - bends = cast('list[AutoTool.Bend | None]', self.bends) - if ccw is None and not bends: - bends = [None] - - for straight in self.straights: - for bend in bends: - bend_dxy, bend_angle = self._bend2dxy(bend, ccw) - - in_ptype_pair = ('unk' if in_ptype is None else in_ptype, straight.ptype) - in_transition = self.transitions.get(in_ptype_pair, None) - itrans_dxy = self._itransition2dxy(in_transition) - - out_ptype_pair = ( - 'unk' if out_ptype is None else out_ptype, - straight.ptype if ccw is None else cast('AutoTool.Bend', bend).out_port.ptype - ) - out_transition = self.transitions.get(out_ptype_pair, None) - otrans_dxy = self._otransition2dxy(out_transition, bend_angle) - - b_transition = None - if ccw is not None: - assert bend is not None - if bend.in_port.ptype != straight.ptype: - b_transition = self.transitions.get((bend.in_port.ptype, straight.ptype), None) - btrans_dxy = self._itransition2dxy(b_transition) - - overhead_x = bend_dxy[0] + itrans_dxy[0] + btrans_dxy[0] + otrans_dxy[0] - overhead_y = bend_dxy[1] + itrans_dxy[1] + btrans_dxy[1] + otrans_dxy[1] - - if out_transition is not None: - out_ptype_actual = out_transition.their_port.ptype - elif ccw is not None: - assert bend is not None - out_ptype_actual = bend.out_port.ptype - else: - out_ptype_actual = straight.ptype - - yield self.LPlan( - straight = straight, - bend = bend, - in_trans = in_transition, - b_trans = b_transition, - out_trans = out_transition, - overhead_x = overhead_x, - overhead_y = overhead_y, - bend_angle = bend_angle, - out_ptype = out_ptype_actual, - ) - @dataclass(frozen=True, slots=True) class SData: """ Data for planS """ @@ -768,71 +521,6 @@ class AutoTool(Tool, metaclass=ABCMeta): b_transition: 'AutoTool.Transition | None' out_transition: 'AutoTool.Transition | None' - 
@dataclass(frozen=True, slots=True) - class UData: - """ Data for planU or planS (double-L) """ - ldata0: 'AutoTool.LData' - ldata1: 'AutoTool.LData' - straight2: 'AutoTool.Straight' - l2_length: float - mid_transition: 'AutoTool.Transition | None' - - def _solve_double_l( - self, - length: float, - jog: float, - ccw1: SupportsBool, - ccw2: SupportsBool, - in_ptype: str | None, - out_ptype: str | None, - **kwargs, - ) -> tuple[Port, UData]: - """ - Solve for a path consisting of two L-bends connected by a straight segment. - Used for both U-turns (ccw1 == ccw2) and S-bends (ccw1 != ccw2). - """ - is_u = bool(ccw1) == bool(ccw2) - out_rot = 0 if is_u else pi - - for plan1 in self._iter_l_plans(ccw1, in_ptype, None): - rot_mid = rotation_matrix_2d(pi + plan1.bend_angle) - mid_axis = rot_mid @ numpy.array((1.0, 0.0)) - if not numpy.isclose(mid_axis[0], 0) or numpy.isclose(mid_axis[1], 0): - continue - - for straight_mid in self.straights: - mid_ptype_pair = (plan1.out_ptype, straight_mid.ptype) - mid_trans = self.transitions.get(mid_ptype_pair, None) - mid_trans_dxy = self._itransition2dxy(mid_trans) - - for plan2 in self._iter_l_plans(ccw2, straight_mid.ptype, out_ptype): - fixed_dxy = numpy.array((plan1.overhead_x, plan1.overhead_y)) - fixed_dxy += rot_mid @ ( - mid_trans_dxy - + numpy.array((plan2.overhead_x, plan2.overhead_y)) - ) - - l1_straight = length - fixed_dxy[0] - l2_straight = (jog - fixed_dxy[1]) / mid_axis[1] - - if plan1.straight.length_range[0] <= l1_straight < plan1.straight.length_range[1] \ - and straight_mid.length_range[0] <= l2_straight < straight_mid.length_range[1]: - l3_straight = 0 - if plan2.straight.length_range[0] <= l3_straight < plan2.straight.length_range[1]: - ldata0 = self.LData( - l1_straight, plan1.straight, kwargs, ccw1, plan1.bend, - plan1.in_trans, plan1.b_trans, plan1.out_trans, - ) - ldata1 = self.LData( - l3_straight, plan2.straight, kwargs, ccw2, plan2.bend, - plan2.in_trans, plan2.b_trans, plan2.out_trans, - ) - - data = 
self.UData(ldata0, ldata1, straight_mid, l2_straight, mid_trans) - out_port = Port((length, jog), rotation=out_rot, ptype=plan2.out_ptype) - return out_port, data - raise BuildError(f"Failed to find a valid double-L configuration for {length=}, {jog=}") - straights: list[Straight] """ List of straight-generators to choose from, in order of priority """ @@ -855,10 +543,9 @@ class AutoTool(Tool, metaclass=ABCMeta): return self @staticmethod - def _bend2dxy(bend: Bend | None, ccw: SupportsBool | None) -> tuple[NDArray[numpy.float64], float]: + def _bend2dxy(bend: Bend, ccw: SupportsBool | None) -> tuple[NDArray[numpy.float64], float]: if ccw is None: return numpy.zeros(2), pi - assert bend is not None bend_dxy, bend_angle = bend.in_port.measure_travel(bend.out_port) assert bend_angle is not None if bool(ccw): @@ -902,23 +589,54 @@ class AutoTool(Tool, metaclass=ABCMeta): **kwargs, ) -> tuple[Port, LData]: - for plan in self._iter_l_plans(ccw, in_ptype, out_ptype): - straight_length = length - plan.overhead_x - if plan.straight.length_range[0] <= straight_length < plan.straight.length_range[1]: - data = self.LData( - straight_length = straight_length, - straight = plan.straight, - straight_kwargs = kwargs, - ccw = ccw, - bend = plan.bend, - in_transition = plan.in_trans, - b_transition = plan.b_trans, - out_transition = plan.out_trans, - ) - out_port = Port((length, plan.overhead_y), rotation=plan.bend_angle, ptype=plan.out_ptype) - return out_port, data + success = False + for straight in self.straights: + for bend in self.bends: + bend_dxy, bend_angle = self._bend2dxy(bend, ccw) - raise BuildError(f'Failed to find a valid L-path configuration for {length=:,g}, {ccw=}, {in_ptype=}, {out_ptype=}') + in_ptype_pair = ('unk' if in_ptype is None else in_ptype, straight.ptype) + in_transition = self.transitions.get(in_ptype_pair, None) + itrans_dxy = self._itransition2dxy(in_transition) + + out_ptype_pair = ( + 'unk' if out_ptype is None else out_ptype, + straight.ptype if 
ccw is None else bend.out_port.ptype + ) + out_transition = self.transitions.get(out_ptype_pair, None) + otrans_dxy = self._otransition2dxy(out_transition, bend_angle) + + b_transition = None + if ccw is not None and bend.in_port.ptype != straight.ptype: + b_transition = self.transitions.get((bend.in_port.ptype, straight.ptype), None) + btrans_dxy = self._itransition2dxy(b_transition) + + straight_length = length - bend_dxy[0] - itrans_dxy[0] - btrans_dxy[0] - otrans_dxy[0] + bend_run = bend_dxy[1] + itrans_dxy[1] + btrans_dxy[1] + otrans_dxy[1] + success = straight.length_range[0] <= straight_length < straight.length_range[1] + if success: + break + if success: + break + else: + # Failed to break + raise BuildError( + f'Asked to draw L-path with total length {length:,g}, shorter than required bends and transitions:\n' + f'bend: {bend_dxy[0]:,g} in_trans: {itrans_dxy[0]:,g}\n' + f'out_trans: {otrans_dxy[0]:,g} bend_trans: {btrans_dxy[0]:,g}' + ) + + if out_transition is not None: + out_ptype_actual = out_transition.their_port.ptype + elif ccw is not None: + out_ptype_actual = bend.out_port.ptype + elif not numpy.isclose(straight_length, 0): + out_ptype_actual = straight.ptype + else: + out_ptype_actual = self.default_out_ptype + + data = self.LData(straight_length, straight, kwargs, ccw, bend, in_transition, b_transition, out_transition) + out_port = Port((length, bend_run), rotation=bend_angle, ptype=out_ptype_actual) + return out_port, data def _renderL( self, @@ -955,7 +673,7 @@ class AutoTool(Tool, metaclass=ABCMeta): pat.plug(data.out_transition.abstract, {port_names[1]: data.out_transition.our_port_name}) return tree - def traceL( + def path( self, ccw: SupportsBool | None, length: float, @@ -972,7 +690,7 @@ class AutoTool(Tool, metaclass=ABCMeta): out_ptype = out_ptype, ) - tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'traceL') + tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path') pat.add_port_pair(names=port_names, ptype='unk' if in_ptype is None else 
in_ptype) self._renderL(data=data, tree=tree, port_names=port_names, straight_kwargs=kwargs) return tree @@ -1028,7 +746,7 @@ class AutoTool(Tool, metaclass=ABCMeta): jog_remaining = jog - itrans_dxy[1] - otrans_dxy[1] if sbend.jog_range[0] <= jog_remaining < sbend.jog_range[1]: sbend_dxy = self._sbend2dxy(sbend, jog_remaining) - success = numpy.isclose(length, sbend_dxy[0] + itrans_dxy[0] + otrans_dxy[0]) + success = numpy.isclose(length, sbend_dxy[0] + itrans_dxy[1] + otrans_dxy[1]) if success: b_transition = None straight_length = 0 @@ -1037,8 +755,26 @@ class AutoTool(Tool, metaclass=ABCMeta): break if not success: - ccw0 = jog > 0 - return self._solve_double_l(length, jog, ccw0, not ccw0, in_ptype, out_ptype, **kwargs) + try: + ccw0 = jog > 0 + p_test0, ldata_test0 = self.planL(length / 2, ccw0, in_ptype=in_ptype) + p_test1, ldata_test1 = self.planL(jog - p_test0.y, not ccw0, in_ptype=p_test0.ptype, out_ptype=out_ptype) + + dx = p_test1.x - length / 2 + p0, ldata0 = self.planL(length - dx, ccw0, in_ptype=in_ptype) + p1, ldata1 = self.planL(jog - p0.y, not ccw0, in_ptype=p0.ptype, out_ptype=out_ptype) + success = True + except BuildError as err: + l2_err: BuildError | None = err + else: + l2_err = None + raise NotImplementedError('TODO need to handle ldata below') + + if not success: + # Failed to break + raise BuildError( + f'Failed to find a valid s-bend configuration for {length=:,g}, {jog=:,g}, {in_ptype=}, {out_ptype=}' + ) from l2_err if out_transition is not None: out_ptype_actual = out_transition.their_port.ptype @@ -1093,7 +829,7 @@ class AutoTool(Tool, metaclass=ABCMeta): pat.plug(data.out_transition.abstract, {port_names[1]: data.out_transition.our_port_name}) return tree - def traceS( + def pathS( self, length: float, jog: float, @@ -1109,74 +845,9 @@ class AutoTool(Tool, metaclass=ABCMeta): in_ptype = in_ptype, out_ptype = out_ptype, ) - tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'traceS') + tree, pat = Library.mktree(SINGLE_USE_PREFIX + 
'pathS') pat.add_port_pair(names=port_names, ptype='unk' if in_ptype is None else in_ptype) - if isinstance(data, self.UData): - self._renderU(data=data, tree=tree, port_names=port_names, gen_kwargs=kwargs) - else: - self._renderS(data=data, tree=tree, port_names=port_names, gen_kwargs=kwargs) - return tree - - def planU( - self, - jog: float, - *, - length: float = 0, - in_ptype: str | None = None, - out_ptype: str | None = None, - **kwargs, - ) -> tuple[Port, UData]: - ccw = jog > 0 - return self._solve_double_l(length, jog, ccw, ccw, in_ptype, out_ptype, **kwargs) - - def _renderU( - self, - data: UData, - tree: ILibrary, - port_names: tuple[str, str], - gen_kwargs: dict[str, Any], - ) -> ILibrary: - pat = tree.top_pattern() - # 1. First L-bend - self._renderL(data.ldata0, tree, port_names, gen_kwargs) - # 2. Connecting straight - if data.mid_transition: - pat.plug(data.mid_transition.abstract, {port_names[1]: data.mid_transition.their_port_name}) - if not numpy.isclose(data.l2_length, 0): - s2_pat_or_tree = data.straight2.fn(data.l2_length, **(gen_kwargs | data.ldata0.straight_kwargs)) - pmap = {port_names[1]: data.straight2.in_port_name} - if isinstance(s2_pat_or_tree, Pattern): - pat.plug(s2_pat_or_tree, pmap, append=True) - else: - s2_tree = s2_pat_or_tree - top = s2_tree.top() - s2_tree.flatten(top, dangling_ok=True) - pat.plug(s2_tree[top], pmap, append=True) - # 3. 
Second L-bend - self._renderL(data.ldata1, tree, port_names, gen_kwargs) - return tree - - def traceU( - self, - jog: float, - *, - length: float = 0, - in_ptype: str | None = None, - out_ptype: str | None = None, - port_names: tuple[str, str] = ('A', 'B'), - **kwargs, - ) -> Library: - _out_port, data = self.planU( - jog, - length = length, - in_ptype = in_ptype, - out_ptype = out_ptype, - **kwargs, - ) - - tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'traceU') - pat.add_port_pair(names=port_names, ptype='unk' if in_ptype is None else in_ptype) - self._renderU(data=data, tree=tree, port_names=port_names, gen_kwargs=kwargs) + self._renderS(data=data, tree=tree, port_names=port_names, gen_kwargs=kwargs) return tree def render( @@ -1187,7 +858,7 @@ class AutoTool(Tool, metaclass=ABCMeta): **kwargs, ) -> ILibrary: - tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'traceL') + tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path') pat.add_port_pair(names=(port_names[0], port_names[1])) for step in batch: @@ -1195,12 +866,7 @@ class AutoTool(Tool, metaclass=ABCMeta): if step.opcode == 'L': self._renderL(data=step.data, tree=tree, port_names=port_names, straight_kwargs=kwargs) elif step.opcode == 'S': - if isinstance(step.data, self.UData): - self._renderU(data=step.data, tree=tree, port_names=port_names, gen_kwargs=kwargs) - else: - self._renderS(data=step.data, tree=tree, port_names=port_names, gen_kwargs=kwargs) - elif step.opcode == 'U': - self._renderU(data=step.data, tree=tree, port_names=port_names, gen_kwargs=kwargs) + self._renderS(data=step.data, tree=tree, port_names=port_names, gen_kwargs=kwargs) return tree @@ -1231,40 +897,7 @@ class PathTool(Tool, metaclass=ABCMeta): # self.width = width # self.ptype: str - def _check_out_ptype(self, out_ptype: str | None) -> None: - if out_ptype and out_ptype != self.ptype: - raise BuildError(f'Requested {out_ptype=} does not match path ptype {self.ptype}') - - def _bend_radius(self) -> float: - return self.width / 2 - - 
def _plan_l_vertices(self, length: float, bend_run: float) -> NDArray[numpy.float64]: - vertices = [(0.0, 0.0), (length, 0.0)] - if not numpy.isclose(bend_run, 0): - vertices.append((length, bend_run)) - return numpy.array(vertices, dtype=float) - - def _plan_s_vertices(self, length: float, jog: float) -> NDArray[numpy.float64]: - if numpy.isclose(jog, 0): - return numpy.array([(0.0, 0.0), (length, 0.0)], dtype=float) - - if length < self.width: - raise BuildError( - f'Asked to draw S-path with total length {length:,g}, shorter than required bend: {self.width:,g}' - ) - - # Match AutoTool's straight-then-s-bend placement so the jog happens - # width/2 before the end while still allowing smaller lateral offsets. - jog_x = length - self._bend_radius() - vertices = [ - (0.0, 0.0), - (jog_x, 0.0), - (jog_x, jog), - (length, jog), - ] - return numpy.array(vertices, dtype=float) - - def traceL( + def path( self, ccw: SupportsBool | None, length: float, @@ -1274,15 +907,15 @@ class PathTool(Tool, metaclass=ABCMeta): port_names: tuple[str, str] = ('A', 'B'), **kwargs, # noqa: ARG002 (unused) ) -> Library: - out_port, data = self.planL( + out_port, dxy = self.planL( ccw, length, in_ptype=in_ptype, out_ptype=out_ptype, ) - tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'traceL') - pat.path(layer=self.layer, width=self.width, vertices=self._plan_l_vertices(length, float(out_port.y))) + tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path') + pat.path(layer=self.layer, width=self.width, vertices=[(0, 0), (length, 0)]) if ccw is None: out_rot = pi @@ -1293,7 +926,7 @@ class PathTool(Tool, metaclass=ABCMeta): pat.ports = { port_names[0]: Port((0, 0), rotation=0, ptype=self.ptype), - port_names[1]: Port(out_port.offset, rotation=out_rot, ptype=self.ptype), + port_names[1]: Port(dxy, rotation=out_rot, ptype=self.ptype), } return tree @@ -1309,10 +942,11 @@ class PathTool(Tool, metaclass=ABCMeta): ) -> tuple[Port, NDArray[numpy.float64]]: # TODO check all the math for L-shaped 
bends - self._check_out_ptype(out_ptype) + if out_ptype and out_ptype != self.ptype: + raise BuildError(f'Requested {out_ptype=} does not match path ptype {self.ptype}') if ccw is not None: - bend_dxy = numpy.array([1, -1]) * self._bend_radius() + bend_dxy = numpy.array([1, -1]) * self.width / 2 bend_angle = pi / 2 if bool(ccw): @@ -1333,46 +967,6 @@ class PathTool(Tool, metaclass=ABCMeta): out_port = Port(data, rotation=bend_angle, ptype=self.ptype) return out_port, data - def traceS( - self, - length: float, - jog: float, - *, - in_ptype: str | None = None, - out_ptype: str | None = None, - port_names: tuple[str, str] = ('A', 'B'), - **kwargs, # noqa: ARG002 (unused) - ) -> Library: - out_port, _data = self.planS( - length, - jog, - in_ptype=in_ptype, - out_ptype=out_ptype, - ) - - tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'traceS') - pat.path(layer=self.layer, width=self.width, vertices=self._plan_s_vertices(length, jog)) - pat.ports = { - port_names[0]: Port((0, 0), rotation=0, ptype=self.ptype), - port_names[1]: out_port, - } - return tree - - def planS( - self, - length: float, - jog: float, - *, - in_ptype: str | None = None, # noqa: ARG002 (unused) - out_ptype: str | None = None, - **kwargs, # noqa: ARG002 (unused) - ) -> tuple[Port, NDArray[numpy.float64]]: - self._check_out_ptype(out_ptype) - self._plan_s_vertices(length, jog) - data = numpy.array((length, jog)) - out_port = Port((length, jog), rotation=pi, ptype=self.ptype) - return out_port, data - def render( self, batch: Sequence[RenderStep], @@ -1381,43 +975,29 @@ class PathTool(Tool, metaclass=ABCMeta): **kwargs, # noqa: ARG002 (unused) ) -> ILibrary: - # Transform the batch so the first port is local (at 0,0) but retains its global rotation. - # This allows the path to be rendered with its original orientation, simplified by - # translation to the origin. Pather.render will handle the final placement - # (including rotation alignment) via `pat.plug`. 
- first_port = batch[0].start_port - translation = -first_port.offset - rotation = 0 - pivot = first_port.offset - - # Localize the batch for rendering - local_batch = [step.transformed(translation, rotation, pivot) for step in batch] - - path_vertices = [local_batch[0].start_port.offset] - for step in local_batch: + path_vertices = [batch[0].start_port.offset] + for step in batch: assert step.tool == self port_rot = step.start_port.rotation - # Masque convention: Port rotation points INTO the device. - # So the direction of travel for the path is AWAY from the port, i.e., port_rot + pi. assert port_rot is not None - transform = rotation_matrix_2d(port_rot + pi) - delta = step.end_port.offset - step.start_port.offset - local_end = rotation_matrix_2d(-(port_rot + pi)) @ delta + if step.opcode == 'L': - local_vertices = self._plan_l_vertices(float(local_end[0]), float(local_end[1])) - elif step.opcode == 'S': - local_vertices = self._plan_s_vertices(float(local_end[0]), float(local_end[1])) + length, bend_run = step.data + dxy = rotation_matrix_2d(port_rot + pi) @ (length, 0) + #path_vertices.append(step.start_port.offset) + path_vertices.append(step.start_port.offset + dxy) else: raise BuildError(f'Unrecognized opcode "{step.opcode}"') - for vertex in local_vertices[1:]: - path_vertices.append(step.start_port.offset + transform @ vertex) + if (path_vertices[-1] != batch[-1].end_port.offset).any(): + # If the path ends in a bend, we need to add the final vertex + path_vertices.append(batch[-1].end_port.offset) - tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'traceL') + tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path') pat.path(layer=self.layer, width=self.width, vertices=path_vertices) pat.ports = { - port_names[0]: local_batch[0].start_port.copy().rotate(pi), - port_names[1]: local_batch[-1].end_port.copy().rotate(pi), + port_names[0]: batch[0].start_port.copy().rotate(pi), + port_names[1]: batch[-1].end_port.copy().rotate(pi), } return tree diff --git 
a/masque/builder/utils.py b/masque/builder/utils.py
index ca36fff..3109f46 100644
--- a/masque/builder/utils.py
+++ b/masque/builder/utils.py
@@ -46,7 +46,7 @@ def ell(
        ccw: Turn direction. `True` means counterclockwise, `False` means
            clockwise, and `None` means no bend. If `None`, spacing must remain
            `None` or `0` (default); otherwise, spacing must be set to a non-`None` value.
-        bound_type: Method used for determining the travel distance; see diagram above.
+        bound_method: Method used for determining the travel distance; see diagram above.
            Valid values are:
            - 'min_extension' or 'emin': The total extension value for the
                furthest-out port (B in the diagram).
@@ -64,7 +64,7 @@ def ell(
            the x- and y- axes. If specifying a position, it is projected onto
            the extension direction.
-        bound: Value associated with `bound_type`, see above.
+        bound_value: Value associated with `bound_method`, see above.
        spacing: Distance between adjacent channels. Can be scalar, resulting in
            evenly spaced channels, or a vector with length one less than `ports`,
            allowing non-uniform spacing.
@@ -84,7 +84,7 @@ def ell(
        raise BuildError('Empty port list passed to `ell()`')

    if ccw is None:
        if spacing is not None and not numpy.allclose(spacing, 0):
            raise BuildError('Spacing must be 0 or None when ccw=None')
        spacing = 0
    elif spacing is None:
@@ -106,7 +106,7 @@ def ell(
            raise BuildError('Asked to find aggregation for ports that face in different directions:\n' + pformat(port_rotations))
    else:
        if set_rotation is None:
            raise BuildError('set_rotation must be specified if no ports have rotations!')
        rotations = numpy.full_like(has_rotation, set_rotation, dtype=float)
@@ -132,17 +132,8 @@ def ell(
    if spacing is None:
        ch_offsets = numpy.zeros_like(y_order)
    else:
-        spacing_arr = numpy.asarray(spacing, dtype=float).reshape(-1)
        steps = numpy.zeros_like(y_order)
-        if spacing_arr.size == 1:
-            steps[1:] = spacing_arr[0]
-        elif spacing_arr.size == len(ports) - 1:
-            steps[1:] = spacing_arr
-        else:
-            raise BuildError(
-                f'spacing must be scalar or have length {len(ports) - 1} for {len(ports)} ports; '
-                f'got length {spacing_arr.size}'
-            )
+        steps[1:] = spacing
        ch_offsets = numpy.cumsum(steps)[y_ind]

    x_start = rot_offsets[:, 0]
diff --git a/masque/file/dxf.py b/masque/file/dxf.py
index 237b1d8..0f6dd32 100644
--- a/masque/file/dxf.py
+++ b/masque/file/dxf.py
@@ -16,7 +16,7 @@
import gzip
import numpy
import ezdxf
from ezdxf.enums import TextEntityAlignment
-from ezdxf.entities import LWPolyline, Polyline, Text, Insert, Solid, Trace
+from ezdxf.entities import LWPolyline, Polyline, Text, Insert
from .utils import is_gzipped, tmpfile
from .. import Pattern, Ref, PatternError, Label
@@ -55,7 +55,8 @@ def write(
            tuple: (1, 2) -> '1.2'
            str: '1.2' -> '1.2' (no change)
-    Shape repetitions are expanded into individual DXF entities.
+    DXF does not support shape repetition (only block repetition). Please call
+    library.wrap_repeated_shapes() before writing to file.
Other functions you may want to call: - `masque.file.oasis.check_valid_names(library.keys())` to check for invalid names @@ -192,37 +193,8 @@ def read( top_name, top_pat = _read_block(msp) mlib = Library({top_name: top_pat}) - - blocks_by_name = { - bb.name: bb - for bb in lib.blocks - if not bb.is_any_layout - } - - referenced: set[str] = set() - pending = [msp] - seen_blocks: set[str] = set() - while pending: - block = pending.pop() - block_name = getattr(block, 'name', None) - if block_name is not None and block_name in seen_blocks: - continue - if block_name is not None: - seen_blocks.add(block_name) - for element in block: - if not isinstance(element, Insert): - continue - target = element.dxfattribs().get('name') - if target is None or target in referenced: - continue - referenced.add(target) - if target in blocks_by_name: - pending.append(blocks_by_name[target]) - for bb in lib.blocks: - if bb.is_any_layout: - continue - if bb.name.startswith('_') and bb.name not in referenced: + if bb.name == '*Model_Space': continue name, pat = _read_block(bb) mlib[name] = pat @@ -241,60 +213,32 @@ def _read_block(block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace) -> if isinstance(element, LWPolyline | Polyline): if isinstance(element, LWPolyline): points = numpy.asarray(element.get_points()) - is_closed = element.closed - else: + elif isinstance(element, Polyline): points = numpy.asarray([pp.xyz for pp in element.points()]) - is_closed = element.is_closed attr = element.dxfattribs() layer = attr.get('layer', DEFAULT_LAYER) - width = 0 - if isinstance(element, LWPolyline): - # ezdxf 1.4+ get_points() returns (x, y, start_width, end_width, bulge) - if points.shape[1] >= 5: - if (points[:, 4] != 0).any(): - raise PatternError('LWPolyline has bulge (not yet representable in masque!)') - if (points[:, 2] != points[:, 3]).any() or (points[:, 2] != points[0, 2]).any(): - raise PatternError('LWPolyline has non-constant width (not yet representable in masque!)') - width 
= points[0, 2] - elif points.shape[1] == 3: - # width used to be in column 2 - width = points[0, 2] + if points.shape[1] == 2: + raise PatternError('Invalid or unimplemented polygon?') - if width == 0: - width = attr.get('const_width', 0) + if points.shape[1] > 2: + if (points[0, 2] != points[:, 2]).any(): + raise PatternError('PolyLine has non-constant width (not yet representable in masque!)') + if points.shape[1] == 4 and (points[:, 3] != 0).any(): + raise PatternError('LWPolyLine has bulge (not yet representable in masque!)') - verts = points[:, :2] - if is_closed and (len(verts) < 2 or not numpy.allclose(verts[0], verts[-1])): - verts = numpy.vstack((verts, verts[0])) + width = points[0, 2] + if width == 0: + width = attr.get('const_width', 0) - shape: Path | Polygon - if width == 0 and is_closed: - # Use Polygon if it has at least 3 unique vertices - shape_verts = verts[:-1] if len(verts) > 1 else verts - if len(shape_verts) >= 3: - shape = Polygon(vertices=shape_verts) + shape: Path | Polygon + if width == 0 and len(points) > 2 and numpy.array_equal(points[0], points[-1]): + shape = Polygon(vertices=points[:-1, :2]) else: - shape = Path(width=width, vertices=verts) - else: - shape = Path(width=width, vertices=verts) + shape = Path(width=width, vertices=points[:, :2]) pat.shapes[layer].append(shape) - elif isinstance(element, Solid | Trace): - attr = element.dxfattribs() - layer = attr.get('layer', DEFAULT_LAYER) - points = numpy.array([element.get_dxf_attrib(f'vtx{i}') for i in range(4) - if element.has_dxf_attrib(f'vtx{i}')]) - if len(points) >= 3: - # If vtx2 == vtx3, it's a triangle. ezdxf handles this. - if len(points) == 4 and numpy.allclose(points[2], points[3]): - verts = points[:3, :2] - # DXF Solid/Trace uses 0-1-3-2 vertex order for quadrilaterals! 
- elif len(points) == 4: - verts = points[[0, 1, 3, 2], :2] - else: - verts = points[:, :2] - pat.shapes[layer].append(Polygon(vertices=verts)) + elif isinstance(element, Text): args = dict( offset=numpy.asarray(element.get_placement()[1])[:2], @@ -329,57 +273,12 @@ def _read_block(block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace) -> ) if 'column_count' in attr: - col_spacing = attr['column_spacing'] - row_spacing = attr['row_spacing'] - col_count = attr['column_count'] - row_count = attr['row_count'] - local_x = numpy.array((col_spacing, 0.0)) - local_y = numpy.array((0.0, row_spacing)) - inv_rot = rotation_matrix_2d(-rotation) - - candidates = ( - (inv_rot @ local_x, inv_rot @ local_y, col_count, row_count), - (inv_rot @ local_y, inv_rot @ local_x, row_count, col_count), + args['repetition'] = Grid( + a_vector=(attr['column_spacing'], 0), + b_vector=(0, attr['row_spacing']), + a_count=attr['column_count'], + b_count=attr['row_count'], ) - repetition = None - for a_vector, b_vector, a_count, b_count in candidates: - rotated_a = rotation_matrix_2d(rotation) @ a_vector - rotated_b = rotation_matrix_2d(rotation) @ b_vector - if (numpy.isclose(rotated_a[1], 0, atol=1e-8) - and numpy.isclose(rotated_b[0], 0, atol=1e-8) - and numpy.isclose(rotated_a[0], col_spacing, atol=1e-8) - and numpy.isclose(rotated_b[1], row_spacing, atol=1e-8) - and a_count == col_count - and b_count == row_count): - repetition = Grid( - a_vector=a_vector, - b_vector=b_vector, - a_count=a_count, - b_count=b_count, - ) - break - if (numpy.isclose(rotated_a[0], 0, atol=1e-8) - and numpy.isclose(rotated_b[1], 0, atol=1e-8) - and numpy.isclose(rotated_b[0], col_spacing, atol=1e-8) - and numpy.isclose(rotated_a[1], row_spacing, atol=1e-8) - and b_count == col_count - and a_count == row_count): - repetition = Grid( - a_vector=a_vector, - b_vector=b_vector, - a_count=a_count, - b_count=b_count, - ) - break - - if repetition is None: - repetition = Grid( - a_vector=inv_rot @ local_x, - 
b_vector=inv_rot @ local_y, - a_count=col_count, - b_count=row_count, - ) - args['repetition'] = repetition pat.ref(**args) else: logger.warning(f'Ignoring DXF element {element.dxftype()} (not implemented).') @@ -404,23 +303,15 @@ def _mrefs_to_drefs( elif isinstance(rep, Grid): a = rep.a_vector b = rep.b_vector if rep.b_vector is not None else numpy.zeros(2) - # In masque, the grid basis vectors are NOT rotated by the reference's rotation. - # In DXF, the grid basis vectors are [column_spacing, 0] and [0, row_spacing], - # which ARE then rotated by the block reference's rotation. - # Therefore, we can only use a DXF array if ref.rotation is 0 (or a multiple of 90) - # AND the grid is already manhattan. - - # Rotate basis vectors by the reference rotation to see where they end up in the DXF frame - rotated_a = rotation_matrix_2d(ref.rotation) @ a - rotated_b = rotation_matrix_2d(ref.rotation) @ b - - if numpy.isclose(rotated_a[1], 0, atol=1e-8) and numpy.isclose(rotated_b[0], 0, atol=1e-8): + rotated_a = rotation_matrix_2d(-ref.rotation) @ a + rotated_b = rotation_matrix_2d(-ref.rotation) @ b + if rotated_a[1] == 0 and rotated_b[0] == 0: attribs['column_count'] = rep.a_count attribs['row_count'] = rep.b_count attribs['column_spacing'] = rotated_a[0] attribs['row_spacing'] = rotated_b[1] block.add_blockref(encoded_name, ref.offset, dxfattribs=attribs) - elif numpy.isclose(rotated_a[0], 0, atol=1e-8) and numpy.isclose(rotated_b[1], 0, atol=1e-8): + elif rotated_a[0] == 0 and rotated_b[1] == 0: attribs['column_count'] = rep.b_count attribs['row_count'] = rep.a_count attribs['column_spacing'] = rotated_b[0] @@ -453,23 +344,16 @@ def _shapes_to_elements( for layer, sseq in shapes.items(): attribs = dict(layer=_mlayer2dxf(layer)) for shape in sseq: - displacements = [numpy.zeros(2)] if shape.repetition is not None: - displacements = shape.repetition.displacements + raise PatternError( + 'Shape repetitions are not supported by DXF.' 
+ ' Please call library.wrap_repeated_shapes() before writing to file.' + ) - for dd in displacements: - if isinstance(shape, Path): - # preserve path. - # Note: DXF paths don't support endcaps well, so this is still a bit limited. - xy = shape.vertices + dd - attribs_path = {**attribs} - if shape.width > 0: - attribs_path['const_width'] = shape.width - block.add_lwpolyline(xy, dxfattribs=attribs_path) - else: - for polygon in shape.to_polygons(): - xy_open = polygon.vertices + dd - block.add_lwpolyline(xy_open, close=True, dxfattribs=attribs) + for polygon in shape.to_polygons(): + xy_open = polygon.vertices + xy_closed = numpy.vstack((xy_open, xy_open[0, :])) + block.add_lwpolyline(xy_closed, dxfattribs=attribs) def _labels_to_texts( @@ -479,17 +363,11 @@ def _labels_to_texts( for layer, lseq in labels.items(): attribs = dict(layer=_mlayer2dxf(layer)) for label in lseq: - if label.repetition is None: - block.add_text( - label.string, - dxfattribs=attribs - ).set_placement(label.offset, align=TextEntityAlignment.BOTTOM_LEFT) - else: - for dd in label.repetition.displacements: - block.add_text( - label.string, - dxfattribs=attribs - ).set_placement(label.offset + dd, align=TextEntityAlignment.BOTTOM_LEFT) + xy = label.offset + block.add_text( + label.string, + dxfattribs=attribs + ).set_placement(xy, align=TextEntityAlignment.BOTTOM_LEFT) def _mlayer2dxf(layer: layer_t) -> str: diff --git a/masque/file/gdsii.py b/masque/file/gdsii.py index 1d8c3d1..6972cfa 100644 --- a/masque/file/gdsii.py +++ b/masque/file/gdsii.py @@ -37,7 +37,7 @@ from klamath import records from .utils import is_gzipped, tmpfile from .. 
import Pattern, Ref, PatternError, LibraryError, Label, Shape
-from ..shapes import Polygon, Path, RectCollection
+from ..shapes import Polygon, Path
from ..repetition import Grid
from ..utils import layer_t, annotations_t
from ..library import LazyLibrary, Library, ILibrary, ILibraryView
@@ -82,7 +82,7 @@ def write(
      datatype is chosen to be `shape.layer[1]` if available, otherwise `0`

    GDS does not support shape repetition (only cell repetition). Please call
    `library.wrap_repeated_shapes()` before writing to file.

    Other functions you may want to call:
@@ -323,40 +323,26 @@ def _gpath_to_mpath(gpath: klamath.library.Path, raw_mode: bool) -> tuple[layer_
    else:
        raise PatternError(f'Unrecognized path type: {gpath.path_type}')

-    vertices = gpath.xy.astype(float)
-    annotations = _properties_to_annotations(gpath.properties)
-    cap_extensions = None
+    mpath = Path(
+        vertices=gpath.xy.astype(float),
+        width=gpath.width,
+        cap=cap,
+        offset=numpy.zeros(2),
+        annotations=_properties_to_annotations(gpath.properties),
+        raw=raw_mode,
+        )
    if cap == Path.Cap.SquareCustom:
-        cap_extensions = numpy.asarray(gpath.extension, dtype=float)
-
-    if raw_mode:
-        mpath = Path._from_raw(
-            vertices=vertices,
-            width=gpath.width,
-            cap=cap,
-            cap_extensions=cap_extensions,
-            annotations=annotations,
-        )
-    else:
-        mpath = Path(
-            vertices=vertices,
-            width=gpath.width,
-            cap=cap,
-            cap_extensions=cap_extensions,
-            offset=numpy.zeros(2),
-            annotations=annotations,
-        )
+        mpath.cap_extensions = gpath.extension
    return gpath.layer, mpath

def _boundary_to_polygon(boundary: klamath.library.Boundary, raw_mode: bool) -> tuple[layer_t, Polygon]:
-    vertices = boundary.xy[:-1].astype(float)
-    annotations = _properties_to_annotations(boundary.properties)
-    if raw_mode:
-        poly = Polygon._from_raw(vertices=vertices, annotations=annotations)
-    else:
-        poly = Polygon(vertices=vertices, offset=numpy.zeros(2), annotations=annotations)
- return boundary.layer, poly + return boundary.layer, Polygon( + vertices=boundary.xy[:-1].astype(float), + offset=numpy.zeros(2), + annotations=_properties_to_annotations(boundary.properties), + raw=raw_mode, + ) def _mrefs_to_grefs(refs: dict[str | None, list[Ref]]) -> list[klamath.library.Reference]: @@ -467,7 +453,7 @@ def _shapes_to_elements( extension: tuple[int, int] if shape.cap == Path.Cap.SquareCustom and shape.cap_extensions is not None: - extension = tuple(rint_cast(shape.cap_extensions)) + extension = tuple(shape.cap_extensions) # type: ignore else: extension = (0, 0) @@ -480,20 +466,6 @@ def _shapes_to_elements( properties=properties, ) elements.append(path) - elif isinstance(shape, RectCollection): - for rect in shape.rects: - xy_closed = numpy.empty((5, 2), dtype=numpy.int32) - xy_closed[0] = rint_cast((rect[0], rect[1])) - xy_closed[1] = rint_cast((rect[0], rect[3])) - xy_closed[2] = rint_cast((rect[2], rect[3])) - xy_closed[3] = rint_cast((rect[2], rect[1])) - xy_closed[4] = xy_closed[0] - boundary = klamath.elements.Boundary( - layer=(layer, data_type), - xy=xy_closed, - properties=properties, - ) - elements.append(boundary) elif isinstance(shape, Polygon): polygon = shape xy_closed = numpy.empty((polygon.vertices.shape[0] + 1, 2), dtype=numpy.int32) @@ -645,12 +617,7 @@ def load_libraryfile( stream = mmap.mmap(base_stream.fileno(), 0, access=mmap.ACCESS_READ) # type: ignore else: stream = path.open(mode='rb') # noqa: SIM115 - - try: - return load_library(stream, full_load=full_load, postprocess=postprocess) - finally: - if full_load: - stream.close() + return load_library(stream, full_load=full_load, postprocess=postprocess) def check_valid_names( @@ -665,7 +632,6 @@ def check_valid_names( max_length: Max allowed length """ - names = tuple(names) allowed_chars = set(string.ascii_letters + string.digits + '_?$') bad_chars = [ @@ -682,7 +648,7 @@ def check_valid_names( logger.error('Names contain invalid characters:\n' + pformat(bad_chars)) if 
bad_lengths:
-        logger.error(f'Names too long (>{max_length}):\n' + pformat(bad_lengths))
+        logger.error(f'Names too long (>{max_length}):\n' + pformat(bad_lengths))
 
     if bad_chars or bad_lengths:
         raise LibraryError('Library contains invalid names, see log above')
diff --git a/masque/file/gdsii_arrow.py b/masque/file/gdsii_arrow.py
index b97005b..e56a48e 100644
--- a/masque/file/gdsii_arrow.py
+++ b/masque/file/gdsii_arrow.py
@@ -25,16 +25,12 @@ Notes:
 """
 from typing import IO, cast, Any
 from collections.abc import Iterable, Mapping, Callable
-from importlib.machinery import EXTENSION_SUFFIXES
-import importlib.util
 import io
 import mmap
 import logging
-import os
 import pathlib
 import gzip
 import string
-import sys
 from pprint import pformat
 
 import numpy
@@ -45,7 +41,7 @@ from pyarrow.cffi import ffi
 
 from .utils import is_gzipped, tmpfile
 from .. import Pattern, Ref, PatternError, LibraryError, Label, Shape
-from ..shapes import Polygon, Path, PolyCollection, RectCollection
+from ..shapes import Polygon, Path, PolyCollection
 from ..repetition import Grid
 from ..utils import layer_t, annotations_t
 from ..library import LazyLibrary, Library, ILibrary, ILibraryView
@@ -53,24 +49,8 @@ from ..library import LazyLibrary, Library, ILibrary, ILibraryView
 logger = logging.getLogger(__name__)
 
-ffi.cdef(
-    """
-    void read_path(char* path, struct ArrowArray* array, struct ArrowSchema* schema);
-    void scan_bytes(uint8_t* data, size_t size, struct ArrowArray* array, struct ArrowSchema* schema);
-    void read_cells_bytes(
-        uint8_t* data,
-        size_t size,
-        uint64_t* ranges,
-        size_t range_count,
-        struct ArrowArray* array,
-        struct ArrowSchema* schema
-        );
-    """
-    )
-
-clib: Any | None = None
-
-ZERO_OFFSET = numpy.zeros(2)
+clib = ffi.dlopen('/home/jan/projects/klamath-rs/target/release/libklamath_rs_ext.so')
+ffi.cdef('void read_path(char* path, struct ArrowArray* array, struct ArrowSchema* schema);')
 
 
 path_cap_map = {
@@ -85,153 +65,22 @@ def rint_cast(val: ArrayLike) ->
NDArray[numpy.int32]: return numpy.rint(val).astype(numpy.int32) -def _packed_layer_u32_to_pairs(values: NDArray[numpy.unsignedinteger[Any]]) -> NDArray[numpy.int16]: - layer = (values >> numpy.uint32(16)).astype(numpy.uint16).view(numpy.int16) - dtype = (values & numpy.uint32(0xffff)).astype(numpy.uint16).view(numpy.int16) - return numpy.stack((layer, dtype), axis=-1) - - -def _packed_counts_u32_to_pairs(values: NDArray[numpy.unsignedinteger[Any]]) -> NDArray[numpy.int64]: - a_count = (values >> numpy.uint32(16)).astype(numpy.uint16).astype(numpy.int64) - b_count = (values & numpy.uint32(0xffff)).astype(numpy.uint16).astype(numpy.int64) - return numpy.stack((a_count, b_count), axis=-1) - - -def _packed_xy_u64_to_pairs(values: NDArray[numpy.unsignedinteger[Any]]) -> NDArray[numpy.int32]: - xx = (values >> numpy.uint64(32)).astype(numpy.uint32).view(numpy.int32) - yy = (values & numpy.uint64(0xffff_ffff)).astype(numpy.uint32).view(numpy.int32) - return numpy.stack((xx, yy), axis=-1) - - -def _local_library_filename() -> str: - if sys.platform.startswith('linux'): - return 'libklamath_rs_ext.so' - if sys.platform == 'darwin': - return 'libklamath_rs_ext.dylib' - if sys.platform == 'win32': - return 'klamath_rs_ext.dll' - raise OSError(f'Unsupported platform for klamath_rs_ext: {sys.platform!r}') - - -def _installed_library_candidates() -> list[pathlib.Path]: - candidates: list[pathlib.Path] = [] - - try: - spec = importlib.util.find_spec('klamath_rs_ext.klamath_rs_ext') - except ModuleNotFoundError: - spec = None - if spec is not None and spec.origin is not None: - candidates.append(pathlib.Path(spec.origin)) - - try: - pkg_spec = importlib.util.find_spec('klamath_rs_ext') - except ModuleNotFoundError: - pkg_spec = None - if pkg_spec is not None and pkg_spec.submodule_search_locations is not None: - for location in pkg_spec.submodule_search_locations: - pkg_dir = pathlib.Path(location) - for suffix in EXTENSION_SUFFIXES: - 
candidates.extend(sorted(pkg_dir.glob(f'klamath_rs_ext*{suffix}'))) - - return candidates - - -def _repo_library_candidates() -> list[pathlib.Path]: - repo_root = pathlib.Path(__file__).resolve().parents[2] - library_name = _local_library_filename() - return [ - repo_root / 'klamath-rs' / 'target' / 'release' / library_name, - repo_root / 'klamath-rs' / 'target' / 'debug' / library_name, - ] - - -def find_klamath_rs_library() -> pathlib.Path | None: - env_path = os.environ.get('KLAMATH_RS_EXT_LIB') - if env_path: - candidate = pathlib.Path(env_path).expanduser() - if candidate.exists(): - return candidate.resolve() - - seen: set[pathlib.Path] = set() - for candidate in _installed_library_candidates() + _repo_library_candidates(): - resolved = candidate.expanduser() - if resolved in seen: - continue - seen.add(resolved) - if resolved.exists(): - return resolved.resolve() - return None - - -def is_available() -> bool: - return find_klamath_rs_library() is not None - - -def _get_clib() -> Any: - global clib - if clib is None: - lib_path = find_klamath_rs_library() - if lib_path is None: - raise ImportError( - 'Could not locate klamath_rs_ext shared library. ' - 'Build klamath-rs with `cargo build --release --manifest-path klamath-rs/Cargo.toml` ' - 'or set KLAMATH_RS_EXT_LIB to the built library path.' 
- ) - clib = ffi.dlopen(str(lib_path)) - return clib - - -def _read_annotations( - prop_offs: NDArray[numpy.integer[Any]], - prop_key: NDArray[numpy.integer[Any]], - prop_val: list[str], - ee: int, - ) -> annotations_t: - prop_ii, prop_ff = prop_offs[ee], prop_offs[ee + 1] - if prop_ii >= prop_ff: - return None - return {str(prop_key[off]): [prop_val[off]] for off in range(prop_ii, prop_ff)} - - def _read_to_arrow( filename: str | pathlib.Path, *args, **kwargs, ) -> pyarrow.Array: - path = pathlib.Path(filename).expanduser().resolve() + path = pathlib.Path(filename) + path.resolve() ptr_array = ffi.new('struct ArrowArray[]', 1) ptr_schema = ffi.new('struct ArrowSchema[]', 1) - _get_clib().read_path(str(path).encode(), ptr_array, ptr_schema) - return _import_arrow_array(ptr_array, ptr_schema) + clib.read_path(str(path).encode(), ptr_array, ptr_schema) - -def _import_arrow_array(ptr_array: Any, ptr_schema: Any) -> pyarrow.Array: iptr_schema = int(ffi.cast('uintptr_t', ptr_schema)) iptr_array = int(ffi.cast('uintptr_t', ptr_array)) - return pyarrow.Array._import_from_c(iptr_array, iptr_schema) + arrow_arr = pyarrow.Array._import_from_c(iptr_array, iptr_schema) - -def _scan_buffer_to_arrow(buffer: bytes | mmap.mmap | memoryview) -> pyarrow.Array: - ptr_array = ffi.new('struct ArrowArray[]', 1) - ptr_schema = ffi.new('struct ArrowSchema[]', 1) - buf_view = memoryview(buffer) - cbuf = ffi.from_buffer('uint8_t[]', buf_view) - _get_clib().scan_bytes(cbuf, len(buf_view), ptr_array, ptr_schema) - return _import_arrow_array(ptr_array, ptr_schema) - - -def _read_selected_cells_to_arrow( - buffer: bytes | mmap.mmap | memoryview, - ranges: NDArray[numpy.uint64], - ) -> pyarrow.Array: - ptr_array = ffi.new('struct ArrowArray[]', 1) - ptr_schema = ffi.new('struct ArrowSchema[]', 1) - buf_view = memoryview(buffer) - cbuf = ffi.from_buffer('uint8_t[]', buf_view) - flat_ranges = numpy.require(ranges, dtype=numpy.uint64, requirements=('C_CONTIGUOUS', 'ALIGNED')) - cranges = 
ffi.from_buffer('uint64_t[]', flat_ranges) - _get_clib().read_cells_bytes(cbuf, len(buf_view), cranges, int(flat_ranges.shape[0]), ptr_array, ptr_schema) - return _import_arrow_array(ptr_array, ptr_schema) + return arrow_arr def readfile( @@ -248,9 +97,6 @@ def readfile( filename: Filename to save to. *args: passed to `read()` **kwargs: passed to `read()` - - For callers that can consume Arrow directly, prefer `readfile_arrow()` - to skip Python `Pattern` construction entirely. """ arrow_arr = _read_to_arrow(filename) assert len(arrow_arr) == 1 @@ -260,28 +106,6 @@ def readfile( return results -def readfile_arrow( - filename: str | pathlib.Path, - ) -> tuple[pyarrow.StructScalar, dict[str, Any]]: - """ - Read a GDSII file into the native Arrow representation without converting - it into `masque.Library` / `Pattern` objects. - - This is the lowest-overhead public read path exposed by this module. - - Args: - filename: Filename to read. - - Returns: - - Arrow struct scalar for the library payload - - dict of GDSII library info - """ - arrow_arr = _read_to_arrow(filename) - assert len(arrow_arr) == 1 - libarr = arrow_arr[0] - return libarr, _read_header(libarr) - - def read_arrow( libarr: pyarrow.Array, raw_mode: bool = True, @@ -308,8 +132,8 @@ def read_arrow( """ library_info = _read_header(libarr) - layer_names_np = _packed_layer_u32_to_pairs(libarr['layers'].values.to_numpy()) - layer_tups = [(int(pair[0]), int(pair[1])) for pair in layer_names_np] + layer_names_np = libarr['layers'].values.to_numpy().view('i2').reshape((-1, 2)) + layer_tups = [tuple(pair) for pair in layer_names_np] cell_ids = libarr['cells'].values.field('id').to_numpy() cell_names = libarr['cell_names'].as_py() @@ -327,84 +151,28 @@ def read_arrow( ) return elem - def get_boundary_batches(libarr: pyarrow.Array) -> dict[str, Any]: - batches = libarr['cells'].values.field('boundary_batches') - return dict( - offsets = batches.offsets.to_numpy(), - layer_inds = 
batches.values.field('layer').to_numpy(), - vert_arr = batches.values.field('vertices').values.to_numpy().reshape((-1, 2)), - vert_off = batches.values.field('vertices').offsets.to_numpy() // 2, - poly_off = batches.values.field('vertex_offsets').offsets.to_numpy(), - poly_offsets = batches.values.field('vertex_offsets').values.to_numpy(), - ) - - def get_rect_batches(libarr: pyarrow.Array) -> dict[str, Any]: - batches = libarr['cells'].values.field('rect_batches') - return dict( - offsets = batches.offsets.to_numpy(), - layer_inds = batches.values.field('layer').to_numpy(), - rect_arr = batches.values.field('rects').values.to_numpy().reshape((-1, 4)), - rect_off = batches.values.field('rects').offsets.to_numpy() // 4, - ) - - def get_boundary_props(libarr: pyarrow.Array) -> dict[str, Any]: - boundaries = libarr['cells'].values.field('boundary_props') - return dict( - offsets = boundaries.offsets.to_numpy(), - layer_inds = boundaries.values.field('layer').to_numpy(), - vert_arr = boundaries.values.field('vertices').values.to_numpy().reshape((-1, 2)), - vert_off = boundaries.values.field('vertices').offsets.to_numpy() // 2, - prop_off = boundaries.values.field('properties').offsets.to_numpy(), - prop_key = boundaries.values.field('properties').values.field('key').to_numpy(), - prop_val = boundaries.values.field('properties').values.field('value').to_pylist(), - ) - - def get_refs(libarr: pyarrow.Array, geom_type: str, has_repetition: bool) -> dict[str, Any]: - refs = libarr['cells'].values.field(geom_type) - values = refs.values - elem = dict( - offsets = refs.offsets.to_numpy(), - targets = values.field('target').to_numpy(), - xy = _packed_xy_u64_to_pairs(values.field('xy').to_numpy()), - invert_y = values.field('invert_y').to_numpy(zero_copy_only=False), - angle_rad = values.field('angle_rad').to_numpy(), - scale = values.field('scale').to_numpy(), - ) - if has_repetition: - elem.update(dict( - xy0 = _packed_xy_u64_to_pairs(values.field('xy0').to_numpy()), - xy1 = 
_packed_xy_u64_to_pairs(values.field('xy1').to_numpy()),
-                counts = _packed_counts_u32_to_pairs(values.field('counts').to_numpy()),
-                ))
-        return elem
-
-    def get_ref_props(libarr: pyarrow.Array, geom_type: str, has_repetition: bool) -> dict[str, Any]:
-        refs = libarr['cells'].values.field(geom_type)
-        values = refs.values
-        elem = dict(
-            offsets = refs.offsets.to_numpy(),
-            targets = values.field('target').to_numpy(),
-            xy = _packed_xy_u64_to_pairs(values.field('xy').to_numpy()),
-            invert_y = values.field('invert_y').to_numpy(zero_copy_only=False),
-            angle_rad = values.field('angle_rad').to_numpy(),
-            scale = values.field('scale').to_numpy(),
-            prop_off = values.field('properties').offsets.to_numpy(),
-            prop_key = values.field('properties').values.field('key').to_numpy(),
-            prop_val = values.field('properties').values.field('value').to_pylist(),
-            )
-        if has_repetition:
-            elem.update(dict(
-                xy0 = _packed_xy_u64_to_pairs(values.field('xy0').to_numpy()),
-                xy1 = _packed_xy_u64_to_pairs(values.field('xy1').to_numpy()),
-                counts = _packed_counts_u32_to_pairs(values.field('counts').to_numpy()),
-                ))
-        return elem
+    rf = libarr['cells'].values.field('refs')
+    refs = dict(
+        offsets = rf.offsets.to_numpy(),
+        targets = rf.values.field('target').to_numpy(),
+        xy = rf.values.field('xy').to_numpy().view('i4').reshape((-1, 2)),
+        invert_y = rf.values.field('invert_y').fill_null(False).to_numpy(zero_copy_only=False),
+        angle_rad = numpy.deg2rad(rf.values.field('angle_deg').fill_null(0).to_numpy()),
+        scale = rf.values.field('mag').fill_null(1).to_numpy(),
+        rep_valid = rf.values.field('repetition').is_valid().to_numpy(zero_copy_only=False),
+        rep_xy0 = rf.values.field('repetition').field('xy0').fill_null(0).to_numpy().view('i4').reshape((-1, 2)),
+        rep_xy1 = rf.values.field('repetition').field('xy1').fill_null(0).to_numpy().view('i4').reshape((-1, 2)),
+        rep_counts = rf.values.field('repetition').field('counts').fill_null(0).to_numpy().view('i2').reshape((-1, 2)),
+        prop_off = rf.values.field('properties').offsets.to_numpy(),
+        prop_key = rf.values.field('properties').values.field('key').to_numpy(),
+        prop_val = rf.values.field('properties').values.field('value').to_pylist(),
+        )
 
     txt = libarr['cells'].values.field('texts')
     texts = dict(
         offsets = txt.offsets.to_numpy(),
         layer_inds = txt.values.field('layer').to_numpy(),
-        xy = _packed_xy_u64_to_pairs(txt.values.field('xy').to_numpy()),
+        xy = txt.values.field('xy').to_numpy().view('i4').reshape((-1, 2)),
         string = txt.values.field('string').to_pylist(),
         prop_off = txt.values.field('properties').offsets.to_numpy(),
         prop_key = txt.values.field('properties').values.field('key').to_numpy(),
@@ -412,15 +180,12 @@
         )
 
     elements = dict(
-        srefs = get_refs(libarr, 'srefs', has_repetition=False),
-        arefs = get_refs(libarr, 'arefs', has_repetition=True),
-        sref_props = get_ref_props(libarr, 'sref_props', has_repetition=False),
-        aref_props = get_ref_props(libarr, 'aref_props', has_repetition=True),
-        rect_batches = get_rect_batches(libarr),
-        boundary_batches = get_boundary_batches(libarr),
-        boundary_props = get_boundary_props(libarr),
+        boundaries = get_geom(libarr, 'boundaries'),
         paths = get_geom(libarr, 'paths'),
+        boxes = get_geom(libarr, 'boxes'),
+        nodes = get_geom(libarr, 'nodes'),
         texts = texts,
+        refs = refs,
         )
 
     paths = libarr['cells'].values.field('paths')
@@ -441,16 +206,11 @@
     mlib = Library()
     for cc in range(len(libarr['cells'])):
-        name = cell_names[int(cell_ids[cc])]
+        name = cell_names[cell_ids[cc]]
         pat = Pattern()
-        _rect_batches_to_rectcollections(pat, global_args, elements['rect_batches'], cc)
-        _boundary_batches_to_polygons(pat, global_args, elements['boundary_batches'], cc)
-        _boundary_props_to_polygons(pat, global_args, elements['boundary_props'], cc)
+        _boundaries_to_polygons(pat, global_args, elements['boundaries'], cc)
        _gpaths_to_mpaths(pat, global_args, elements['paths'], cc)
-        _srefs_to_mrefs(pat, global_args, elements['srefs'], cc)
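Both ref-reading variants in this hunk decode packed coordinates with zero-copy numpy tricks: the removed helpers shift-and-mask a packed `u64` into two signed 32-bit halves, while the restored code reinterprets the raw buffer with `.view('i4').reshape((-1, 2))`. The shift-and-view idiom can be sketched standalone (`pack_xy` is a hypothetical packer added only for the demo; only `unpack_xy` mirrors the removed `_packed_xy_u64_to_pairs`):

```python
import numpy

def pack_xy(xx, yy):
    """Hypothetical packer: two int32 coordinates into one uint64 (demo only)."""
    xu = numpy.asarray(xx, dtype=numpy.int32).view(numpy.uint32).astype(numpy.uint64)
    yu = numpy.asarray(yy, dtype=numpy.int32).view(numpy.uint32).astype(numpy.uint64)
    return (xu << numpy.uint64(32)) | yu

def unpack_xy(values):
    """Split packed u64 values into signed (x, y) pairs via shift, mask, and view."""
    xx = (values >> numpy.uint64(32)).astype(numpy.uint32).view(numpy.int32)
    yy = (values & numpy.uint64(0xffff_ffff)).astype(numpy.uint32).view(numpy.int32)
    return numpy.stack((xx, yy), axis=-1)
```

The `.view(numpy.int32)` reinterprets the bit pattern rather than converting the value, which is what preserves negative coordinates through the unsigned intermediate.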
- _arefs_to_mrefs(pat, global_args, elements['arefs'], cc) - _sref_props_to_mrefs(pat, global_args, elements['sref_props'], cc) - _aref_props_to_mrefs(pat, global_args, elements['aref_props'], cc) + _grefs_to_mrefs(pat, global_args, elements['refs'], cc) _texts_to_labels(pat, global_args, elements['texts'], cc) mlib[name] = pat @@ -462,216 +222,59 @@ def _read_header(libarr: pyarrow.Array) -> dict[str, Any]: Read the file header and create the library_info dict. """ library_info = dict( - name = libarr['lib_name'].as_py(), - meters_per_unit = libarr['meters_per_db_unit'].as_py(), - logical_units_per_unit = libarr['user_units_per_db_unit'].as_py(), + name = libarr['lib_name'], + meters_per_unit = libarr['meters_per_db_unit'], + logical_units_per_unit = libarr['user_units_per_db_unit'], ) return library_info -def _srefs_to_mrefs( +def _grefs_to_mrefs( pat: Pattern, global_args: dict[str, Any], elem: dict[str, Any], cc: int, ) -> None: cell_names = global_args['cell_names'] - elem_off = elem['offsets'] - elem_count = elem_off[cc + 1] - elem_off[cc] - if elem_count == 0: - return - - start = elem_off[cc] - stop = elem_off[cc + 1] - elem_targets = elem['targets'][start:stop] - elem_xy = elem['xy'][start:stop] - elem_invert_y = elem['invert_y'][start:stop] - elem_angle_rad = elem['angle_rad'][start:stop] - elem_scale = elem['scale'][start:stop] - raw_mode = global_args['raw_mode'] - - _append_plain_refs_sorted( - pat=pat, - cell_names=cell_names, - elem_targets=elem_targets, - elem_xy=elem_xy, - elem_invert_y=elem_invert_y, - elem_angle_rad=elem_angle_rad, - elem_scale=elem_scale, - raw_mode=raw_mode, - ) - - -def _append_plain_refs_sorted( - *, - pat: Pattern, - cell_names: list[str], - elem_targets: NDArray[numpy.integer[Any]], - elem_xy: NDArray[numpy.integer[Any]], - elem_invert_y: NDArray[numpy.bool_ | numpy.bool], - elem_angle_rad: NDArray[numpy.floating[Any]], - elem_scale: NDArray[numpy.floating[Any]], - raw_mode: bool, - ) -> None: - elem_count = 
len(elem_targets) - if elem_count == 0: - return - - make_ref = Ref._from_raw if raw_mode else Ref - - target_start = 0 - while target_start < elem_count: - target_id = int(elem_targets[target_start]) - target_stop = target_start + 1 - while target_stop < elem_count and elem_targets[target_stop] == target_id: - target_stop += 1 - - append_refs = pat.refs[cell_names[target_id]].extend - append_refs( - make_ref( - offset=elem_xy[ee], - mirrored=elem_invert_y[ee], - rotation=elem_angle_rad[ee], - scale=elem_scale[ee], - repetition=None, - annotations=None, - ) - for ee in range(target_start, target_stop) - ) - - target_start = target_stop - - -def _arefs_to_mrefs( - pat: Pattern, - global_args: dict[str, Any], - elem: dict[str, Any], - cc: int, - ) -> None: - cell_names = global_args['cell_names'] - elem_off = elem['offsets'] - elem_count = elem_off[cc + 1] - elem_off[cc] - if elem_count == 0: - return - - start = elem_off[cc] - stop = elem_off[cc + 1] - elem_targets = elem['targets'][start:stop] - elem_xy = elem['xy'][start:stop] - elem_invert_y = elem['invert_y'][start:stop] - elem_angle_rad = elem['angle_rad'][start:stop] - elem_scale = elem['scale'][start:stop] - elem_xy0 = elem['xy0'][start:stop] - elem_xy1 = elem['xy1'][start:stop] - elem_counts = elem['counts'][start:stop] - raw_mode = global_args['raw_mode'] - - make_ref = Ref._from_raw if raw_mode else Ref - make_grid = Grid._from_raw if raw_mode else Grid - - if len(elem_targets) == 0: - return - - target = None - append_ref: Callable[[Ref], Any] | None = None - for ee in range(len(elem_targets)): - target_id = int(elem_targets[ee]) - if target != target_id: - target = target_id - append_ref = pat.refs[cell_names[target_id]].append - assert append_ref is not None - a_count, b_count = elem_counts[ee] - append_ref(make_ref( - offset=elem_xy[ee], - mirrored=elem_invert_y[ee], - rotation=elem_angle_rad[ee], - scale=elem_scale[ee], - repetition=make_grid(a_vector=elem_xy0[ee], b_vector=elem_xy1[ee], 
a_count=a_count, b_count=b_count), - annotations=None, - )) - - -def _sref_props_to_mrefs( - pat: Pattern, - global_args: dict[str, Any], - elem: dict[str, Any], - cc: int, - ) -> None: - cell_names = global_args['cell_names'] - elem_off = elem['offsets'] + elem_off = elem['offsets'] # which elements belong to each cell + xy = elem['xy'] prop_key = elem['prop_key'] prop_val = elem['prop_val'] + targets = elem['targets'] elem_count = elem_off[cc + 1] - elem_off[cc] - if elem_count == 0: - return + elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1) # +1 to capture ending location for last elem + prop_offs = elem['prop_off'][elem_slc] # which props belong to each element + elem_invert_y = elem['invert_y'][elem_slc][:elem_count] + elem_angle_rad = elem['angle_rad'][elem_slc][:elem_count] + elem_scale = elem['scale'][elem_slc][:elem_count] + elem_rep_xy0 = elem['rep_xy0'][elem_slc][:elem_count] + elem_rep_xy1 = elem['rep_xy1'][elem_slc][:elem_count] + elem_rep_counts = elem['rep_counts'][elem_slc][:elem_count] + rep_valid = elem['rep_valid'][elem_slc][:elem_count] - elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1) - prop_offs = elem['prop_off'][elem_slc] - elem_targets = elem['targets'][elem_off[cc]:elem_off[cc + 1]] - elem_xy = elem['xy'][elem_off[cc]:elem_off[cc + 1]] - elem_invert_y = elem['invert_y'][elem_off[cc]:elem_off[cc + 1]] - elem_angle_rad = elem['angle_rad'][elem_off[cc]:elem_off[cc + 1]] - elem_scale = elem['scale'][elem_off[cc]:elem_off[cc + 1]] - raw_mode = global_args['raw_mode'] - - make_ref = Ref._from_raw if raw_mode else Ref for ee in range(elem_count): - annotations = _read_annotations(prop_offs, prop_key, prop_val, ee) - ref = make_ref( - offset=elem_xy[ee], - mirrored=elem_invert_y[ee], - rotation=elem_angle_rad[ee], - scale=elem_scale[ee], - repetition=None, - annotations=annotations, - ) - pat.refs[cell_names[int(elem_targets[ee])]].append(ref) + target = cell_names[targets[ee]] + offset = xy[ee] + mirr = 
elem_invert_y[ee] + rot = elem_angle_rad[ee] + mag = elem_scale[ee] + rep: None | Grid = None + if rep_valid[ee]: + a_vector = elem_rep_xy0[ee] + b_vector = elem_rep_xy1[ee] + a_count, b_count = elem_rep_counts[ee] + rep = Grid(a_vector=a_vector, b_vector=b_vector, a_count=a_count, b_count=b_count) -def _aref_props_to_mrefs( - pat: Pattern, - global_args: dict[str, Any], - elem: dict[str, Any], - cc: int, - ) -> None: - cell_names = global_args['cell_names'] - elem_off = elem['offsets'] - prop_key = elem['prop_key'] - prop_val = elem['prop_val'] + annotations: None | dict[str, list[int | float | str]] = None + prop_ii, prop_ff = prop_offs[ee], prop_offs[ee + 1] + if prop_ii < prop_ff: + annotations = {str(prop_key[off]): [prop_val[off]] for off in range(prop_ii, prop_ff)} - elem_count = elem_off[cc + 1] - elem_off[cc] - if elem_count == 0: - return - - elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1) - prop_offs = elem['prop_off'][elem_slc] - elem_targets = elem['targets'][elem_off[cc]:elem_off[cc + 1]] - elem_xy = elem['xy'][elem_off[cc]:elem_off[cc + 1]] - elem_invert_y = elem['invert_y'][elem_off[cc]:elem_off[cc + 1]] - elem_angle_rad = elem['angle_rad'][elem_off[cc]:elem_off[cc + 1]] - elem_scale = elem['scale'][elem_off[cc]:elem_off[cc + 1]] - elem_xy0 = elem['xy0'][elem_off[cc]:elem_off[cc + 1]] - elem_xy1 = elem['xy1'][elem_off[cc]:elem_off[cc + 1]] - elem_counts = elem['counts'][elem_off[cc]:elem_off[cc + 1]] - raw_mode = global_args['raw_mode'] - - make_ref = Ref._from_raw if raw_mode else Ref - make_grid = Grid._from_raw if raw_mode else Grid - - for ee in range(elem_count): - a_count, b_count = elem_counts[ee] - annotations = _read_annotations(prop_offs, prop_key, prop_val, ee) - ref = make_ref( - offset=elem_xy[ee], - mirrored=elem_invert_y[ee], - rotation=elem_angle_rad[ee], - scale=elem_scale[ee], - repetition=make_grid(a_vector=elem_xy0[ee], b_vector=elem_xy1[ee], a_count=a_count, b_count=b_count), - annotations=annotations, - ) - 
pat.refs[cell_names[int(elem_targets[ee])]].append(ref) + ref = Ref(offset=offset, mirrored=mirr, rotation=rot, scale=mag, repetition=rep, annotations=annotations) + pat.refs[target].append(ref) def _texts_to_labels( @@ -690,21 +293,20 @@ def _texts_to_labels( elem_count = elem_off[cc + 1] - elem_off[cc] elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1) # +1 to capture ending location for last elem prop_offs = elem['prop_off'][elem_slc] # which props belong to each element - elem_xy = xy[elem_slc][:elem_count] elem_layer_inds = layer_inds[elem_slc][:elem_count] elem_strings = elem['string'][elem_slc][:elem_count] - raw_mode = global_args['raw_mode'] for ee in range(elem_count): - layer = layer_tups[int(elem_layer_inds[ee])] - offset = elem_xy[ee] + layer = layer_tups[elem_layer_inds[ee]] + offset = xy[ee] string = elem_strings[ee] - annotations = _read_annotations(prop_offs, prop_key, prop_val, ee) - if raw_mode: - mlabel = Label._from_raw(string=string, offset=offset, annotations=annotations) - else: - mlabel = Label(string=string, offset=offset, annotations=annotations) + annotations: None | dict[str, list[int | float | str]] = None + prop_ii, prop_ff = prop_offs[ee], prop_offs[ee + 1] + if prop_ii < prop_ff: + annotations = {str(prop_key[off]): [prop_val[off]] for off in range(prop_ii, prop_ff)} + + mlabel = Label(string=string, offset=offset, annotations=annotations) pat.labels[layer].append(mlabel) @@ -730,9 +332,10 @@ def _gpaths_to_mpaths( elem_path_types = elem['path_type'][elem_slc][:elem_count] elem_extensions = elem['extensions'][elem_slc][:elem_count] + zeros = numpy.zeros((elem_count, 2)) raw_mode = global_args['raw_mode'] for ee in range(elem_count): - layer = layer_tups[int(elem_layer_inds[ee])] + layer = layer_tups[elem_layer_inds[ee]] vertices = xy_val[xy_offs[ee]:xy_offs[ee + 1]] width = elem_widths[ee] cap_int = elem_path_types[ee] @@ -742,134 +345,74 @@ def _gpaths_to_mpaths( else: cap_extensions = None - annotations = 
_read_annotations(prop_offs, prop_key, prop_val, ee) - if raw_mode: - path = Path._from_raw( - vertices=vertices, - width=width, - cap=cap, - cap_extensions=cap_extensions, - annotations=annotations, - ) - else: - path = Path( - vertices=vertices, - width=width, - cap=cap, - cap_extensions=cap_extensions, - offset=ZERO_OFFSET, - annotations=annotations, - ) + annotations: None | dict[str, list[int | float | str]] = None + prop_ii, prop_ff = prop_offs[ee], prop_offs[ee + 1] + if prop_ii < prop_ff: + annotations = {str(prop_key[off]): [prop_val[off]] for off in range(prop_ii, prop_ff)} + + path = Path(vertices=vertices, offset=zeros[ee], annotations=annotations, raw=raw_mode, + width=width, cap=cap,cap_extensions=cap_extensions) pat.shapes[layer].append(path) -def _boundary_batches_to_polygons( +def _boundaries_to_polygons( pat: Pattern, global_args: dict[str, Any], elem: dict[str, Any], cc: int, ) -> None: elem_off = elem['offsets'] # which elements belong to each cell - vert_arr = elem['vert_arr'] - vert_off = elem['vert_off'] - layer_inds = elem['layer_inds'] - layer_tups = global_args['layer_tups'] - poly_off = elem['poly_off'] - poly_offsets = elem['poly_offsets'] - - batch_count = elem_off[cc + 1] - elem_off[cc] - if batch_count == 0: - return - - elem_slc = slice(elem_off[cc], elem_off[cc] + batch_count + 1) # +1 to capture ending location for last elem - elem_vert_off = vert_off[elem_slc] - elem_poly_off = poly_off[elem_slc] - elem_layer_inds = layer_inds[elem_slc][:batch_count] - - raw_mode = global_args['raw_mode'] - for bb in range(batch_count): - layer = layer_tups[int(elem_layer_inds[bb])] - vertices = vert_arr[elem_vert_off[bb]:elem_vert_off[bb + 1]] - vertex_offsets = poly_offsets[elem_poly_off[bb]:elem_poly_off[bb + 1]] - - if vertex_offsets.size == 1: - if raw_mode: - poly = Polygon._from_raw(vertices=vertices, annotations=None) - else: - poly = Polygon(vertices=vertices, offset=ZERO_OFFSET, annotations=None) - pat.shapes[layer].append(poly) - else: 
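The boundary-reading code in this region batches same-layer polygons into a single collection object instead of appending one `Polygon` per element. The core of that batching is a group-by-layer step: stable-sort the layer indices once, then slice contiguous runs out of the sort order. Sketched standalone (helper name is illustrative):

```python
import numpy

def group_by_key(keys):
    """Return {key: array of original indices} via stable argsort + run lengths."""
    keys = numpy.asarray(keys)
    order = numpy.argsort(keys, kind='stable')
    # unique() is applied to the *sorted* copy, so `first`/`count` delimit
    # contiguous runs inside `order` rather than first occurrences in `keys`.
    uniq, first, count = numpy.unique(keys[order], return_index=True, return_counts=True)
    return {int(kk): order[ff:ff + nn] for kk, ff, nn in zip(uniq, first, count)}
```

Stable sorting keeps elements within a layer in file order, which matters when the grouped shapes are later written back out.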
- if raw_mode: - polys = PolyCollection._from_raw(vertex_lists=vertices, vertex_offsets=vertex_offsets, annotations=None) - else: - polys = PolyCollection(vertex_lists=vertices, vertex_offsets=vertex_offsets, offset=ZERO_OFFSET, annotations=None) - pat.shapes[layer].append(polys) - - -def _rect_batches_to_rectcollections( - pat: Pattern, - global_args: dict[str, Any], - elem: dict[str, Any], - cc: int, - ) -> None: - elem_off = elem['offsets'] - rect_arr = elem['rect_arr'] - rect_off = elem['rect_off'] - layer_inds = elem['layer_inds'] - layer_tups = global_args['layer_tups'] - - batch_count = elem_off[cc + 1] - elem_off[cc] - if batch_count == 0: - return - - elem_slc = slice(elem_off[cc], elem_off[cc] + batch_count + 1) - elem_rect_off = rect_off[elem_slc] - elem_layer_inds = layer_inds[elem_slc][:batch_count] - - raw_mode = global_args['raw_mode'] - for bb in range(batch_count): - layer = layer_tups[int(elem_layer_inds[bb])] - rects = rect_arr[elem_rect_off[bb]:elem_rect_off[bb + 1]] - if raw_mode: - rect_collection = RectCollection._from_raw(rects=rects, annotations=None) - else: - rect_collection = RectCollection(rects=rects, offset=ZERO_OFFSET, annotations=None) - pat.shapes[layer].append(rect_collection) - - -def _boundary_props_to_polygons( - pat: Pattern, - global_args: dict[str, Any], - elem: dict[str, Any], - cc: int, - ) -> None: - elem_off = elem['offsets'] - vert_arr = elem['vert_arr'] - vert_off = elem['vert_off'] + xy_val = elem['xy_arr'] layer_inds = elem['layer_inds'] layer_tups = global_args['layer_tups'] prop_key = elem['prop_key'] prop_val = elem['prop_val'] elem_count = elem_off[cc + 1] - elem_off[cc] - if elem_count == 0: - return - - elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1) - elem_vert_off = vert_off[elem_slc] - prop_offs = elem['prop_off'][elem_slc] + elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1) # +1 to capture ending location for last elem + xy_offs = elem['xy_off'][elem_slc] # which xy coords belong 
to each element + xy_counts = xy_offs[1:] - xy_offs[:-1] + prop_offs = elem['prop_off'][elem_slc] # which props belong to each element + prop_counts = prop_offs[1:] - prop_offs[:-1] elem_layer_inds = layer_inds[elem_slc][:elem_count] + order = numpy.argsort(elem_layer_inds, stable=True) + unilayer_inds, unilayer_first, unilayer_count = numpy.unique(elem_layer_inds, return_index=True, return_counts=True) + + zeros = numpy.zeros((elem_count, 2)) raw_mode = global_args['raw_mode'] - for ee in range(elem_count): - layer = layer_tups[int(elem_layer_inds[ee])] - vertices = vert_arr[elem_vert_off[ee]:elem_vert_off[ee + 1]] - annotations = _read_annotations(prop_offs, prop_key, prop_val, ee) - if raw_mode: - poly = Polygon._from_raw(vertices=vertices, annotations=annotations) - else: - poly = Polygon(vertices=vertices, offset=ZERO_OFFSET, annotations=annotations) - pat.shapes[layer].append(poly) + for layer_ind, ff, nn in zip(unilayer_inds, unilayer_first, unilayer_count, strict=True): + ee_inds = order[ff:ff + nn] + layer = layer_tups[layer_ind] + propless_mask = prop_counts[ee_inds] == 0 + + poly_count_on_layer = propless_mask.sum() + if poly_count_on_layer == 1: + propless_mask[:] = 0 # Never make a 1-element collection + elif poly_count_on_layer > 1: + propless_vert_counts = xy_counts[ee_inds[propless_mask]] - 1 # -1 to drop closing point + vertex_lists = numpy.empty((propless_vert_counts.sum(), 2), dtype=numpy.float64) + vertex_offsets = numpy.cumsum(numpy.concatenate([[0], propless_vert_counts])) + + for ii, ee in enumerate(ee_inds[propless_mask]): + vo = vertex_offsets[ii] + vertex_lists[vo:vo + propless_vert_counts[ii]] = xy_val[xy_offs[ee]:xy_offs[ee + 1] - 1] + + polys = PolyCollection(vertex_lists=vertex_lists, vertex_offsets=vertex_offsets, offset=zeros[ee]) + pat.shapes[layer].append(polys) + + # Handle single polygons + for ee in ee_inds[~propless_mask]: + layer = layer_tups[elem_layer_inds[ee]] + vertices = xy_val[xy_offs[ee]:xy_offs[ee + 1] - 1] # -1 to 
drop closing point + + annotations: None | dict[str, list[int | float | str]] = None + prop_ii, prop_ff = prop_offs[ee], prop_offs[ee + 1] + if prop_ii < prop_ff: + annotations = {str(prop_key[off]): prop_val[off] for off in range(prop_ii, prop_ff)} + + poly = Polygon(vertices=vertices, offset=zeros[ee], annotations=annotations, raw=raw_mode) + pat.shapes[layer].append(poly) #def _properties_to_annotations(properties: pyarrow.Array) -> annotations_t: diff --git a/masque/file/gdsii_lazy_arrow.py b/masque/file/gdsii_lazy_arrow.py deleted file mode 100644 index 9a03960..0000000 --- a/masque/file/gdsii_lazy_arrow.py +++ /dev/null @@ -1,960 +0,0 @@ -""" -Lazy GDSII readers and writers backed by native Arrow scan/materialize paths. - -This module is intentionally separate from `gdsii_arrow` so the eager read path -keeps its current behavior and performance profile. -""" -from __future__ import annotations - -from dataclasses import dataclass -from typing import IO, Any, cast -from collections import defaultdict -from collections.abc import Callable, Iterator, Mapping, Sequence -import copy -import gzip -import logging -import mmap -import pathlib - -import numpy -from numpy.typing import NDArray -import pyarrow -import klamath - -from . 
import gdsii, gdsii_arrow
-from .utils import is_gzipped, tmpfile
-from ..error import LibraryError
-from ..library import ILibrary, ILibraryView, Library, LibraryView, dangling_mode_t
-from ..pattern import Pattern, map_targets
-from ..utils import apply_transforms
-
-
-logger = logging.getLogger(__name__)
-
-
-@dataclass(frozen=True)
-class _StructRange:
-    start: int
-    end: int
-
-
-@dataclass
-class _SourceBuffer:
-    path: pathlib.Path
-    data: bytes | mmap.mmap
-    handle: IO[bytes] | None = None
-
-    def raw_slice(self, start: int, end: int) -> bytes:
-        return self.data[start:end]
-
-
-@dataclass
-class _ScanRefs:
-    offsets: NDArray[numpy.integer[Any]]
-    targets: NDArray[numpy.integer[Any]]
-    xy: NDArray[numpy.int32]
-    xy0: NDArray[numpy.int32]
-    xy1: NDArray[numpy.int32]
-    counts: NDArray[numpy.int64]
-    invert_y: NDArray[numpy.bool_ | numpy.bool]
-    angle_rad: NDArray[numpy.floating[Any]]
-    scale: NDArray[numpy.floating[Any]]
-
-
-@dataclass(frozen=True)
-class _CellScan:
-    cell_id: int
-    struct_range: _StructRange
-    ref_start: int
-    ref_stop: int
-    children: set[str]
-
-
-@dataclass
-class _ScanPayload:
-    libarr: pyarrow.StructScalar
-    library_info: dict[str, Any]
-    cell_names: list[str]
-    cell_order: list[str]
-    cells: dict[str, _CellScan]
-    refs: _ScanRefs
-
-
-@dataclass
-class _SourceLayer:
-    library: ILibraryView
-    source_to_visible: dict[str, str]
-    visible_to_source: dict[str, str]
-    child_graph: dict[str, set[str]]
-    order: list[str]
-
-
-@dataclass(frozen=True)
-class _SourceEntry:
-    layer_index: int
-    source_name: str
-
-
-def is_available() -> bool:
-    return gdsii_arrow.is_available()
-
-
-def _read_header(libarr: pyarrow.StructScalar) -> dict[str, Any]:
-    return gdsii_arrow._read_header(libarr)
-
-
-def _open_source_buffer(path: pathlib.Path) -> _SourceBuffer:
-    if is_gzipped(path):
-        with gzip.open(path, mode='rb') as stream:
-            data = stream.read()
-        return _SourceBuffer(path=path, data=data)
-
-    handle = path.open(mode='rb', buffering=0)
-    mapped =
mmap.mmap(handle.fileno(), 0, access=mmap.ACCESS_READ) - return _SourceBuffer(path=path, data=mapped, handle=handle) - - -def _extract_scan_payload(libarr: pyarrow.StructScalar) -> _ScanPayload: - library_info = _read_header(libarr) - cell_names = libarr['cell_names'].as_py() - - cells = libarr['cells'] - cell_values = cells.values - cell_ids = cell_values.field('id').to_numpy() - struct_starts = cell_values.field('struct_start_offset').to_numpy() - struct_ends = cell_values.field('struct_end_offset').to_numpy() - - refs = cell_values.field('refs') - ref_values = refs.values - ref_offsets = refs.offsets.to_numpy() - targets = ref_values.field('target').to_numpy() - xy = gdsii_arrow._packed_xy_u64_to_pairs(ref_values.field('xy').to_numpy()) - xy0 = gdsii_arrow._packed_xy_u64_to_pairs(ref_values.field('xy0').to_numpy()) - xy1 = gdsii_arrow._packed_xy_u64_to_pairs(ref_values.field('xy1').to_numpy()) - counts = gdsii_arrow._packed_counts_u32_to_pairs(ref_values.field('counts').to_numpy()) - invert_y = ref_values.field('invert_y').to_numpy(zero_copy_only=False) - angle_rad = ref_values.field('angle_rad').to_numpy() - scale = ref_values.field('scale').to_numpy() - - ref_payload = _ScanRefs( - offsets=ref_offsets, - targets=targets, - xy=xy, - xy0=xy0, - xy1=xy1, - counts=counts, - invert_y=invert_y, - angle_rad=angle_rad, - scale=scale, - ) - - cell_order = [cell_names[int(cell_id)] for cell_id in cell_ids] - cell_scan: dict[str, _CellScan] = {} - for cc, name in enumerate(cell_order): - ref_start = int(ref_offsets[cc]) - ref_stop = int(ref_offsets[cc + 1]) - children = { - cell_names[int(target)] - for target in targets[ref_start:ref_stop] - } - cell_scan[name] = _CellScan( - cell_id=int(cell_ids[cc]), - struct_range=_StructRange(int(struct_starts[cc]), int(struct_ends[cc])), - ref_start=ref_start, - ref_stop=ref_stop, - children=children, - ) - - return _ScanPayload( - libarr=libarr, - library_info=library_info, - cell_names=cell_names, - cell_order=cell_order, - 
cells=cell_scan, - refs=ref_payload, - ) - - -def _pattern_children(pat: Pattern) -> set[str]: - return {child for child, refs in pat.refs.items() if child is not None and refs} - - -def _remap_pattern_targets(pat: Pattern, remap: Callable[[str | None], str | None]) -> Pattern: - if not pat.refs: - return pat - pat.refs = map_targets(pat.refs, remap) - return pat - - -def _coerce_library_view(source: Mapping[str, Pattern] | ILibraryView) -> ILibraryView: - if isinstance(source, ILibraryView): - return source - return LibraryView(source) - - -def _source_order(source: ILibraryView) -> list[str]: - if isinstance(source, ArrowLibrary): - return list(source.source_order()) - return list(source.keys()) - - -def _make_ref_rows( - xy: NDArray[numpy.integer[Any]], - angle_rad: NDArray[numpy.floating[Any]], - invert_y: NDArray[numpy.bool_ | numpy.bool], - scale: NDArray[numpy.floating[Any]], - ) -> NDArray[numpy.float64]: - rows = numpy.empty((len(xy), 5), dtype=float) - rows[:, :2] = xy - rows[:, 2] = angle_rad - rows[:, 3] = invert_y.astype(float) - rows[:, 4] = scale - return rows - - -def _expand_aref_row( - xy: NDArray[numpy.integer[Any]], - xy0: NDArray[numpy.integer[Any]], - xy1: NDArray[numpy.integer[Any]], - counts: NDArray[numpy.integer[Any]], - angle_rad: float, - invert_y: bool, - scale: float, - ) -> NDArray[numpy.float64]: - a_count = int(counts[0]) - b_count = int(counts[1]) - aa, bb = numpy.meshgrid(numpy.arange(a_count), numpy.arange(b_count), indexing='ij') - displacements = aa.reshape(-1, 1) * xy0[None, :] + bb.reshape(-1, 1) * xy1[None, :] - rows = numpy.empty((displacements.shape[0], 5), dtype=float) - rows[:, :2] = xy + displacements - rows[:, 2] = angle_rad - rows[:, 3] = float(invert_y) - rows[:, 4] = scale - return rows - - -class ArrowLibrary(ILibraryView): - """ - Read-only library backed by the native lazy Arrow scan schema. - - Materializing a cell via `__getitem__` caches a real `Pattern` for that cell. 
- Cached cells are treated as edited for future writes from this module. - """ - - path: pathlib.Path - library_info: dict[str, Any] - - def __init__( - self, - *, - path: pathlib.Path, - payload: _ScanPayload, - source: _SourceBuffer, - ) -> None: - self.path = path - self.library_info = payload.library_info - self._payload = payload - self._source = source - self._cache: dict[str, Pattern] = {} - - @classmethod - def from_file(cls, filename: str | pathlib.Path) -> ArrowLibrary: - path = pathlib.Path(filename).expanduser().resolve() - source = _open_source_buffer(path) - scan_arr = gdsii_arrow._scan_buffer_to_arrow(source.data) - assert len(scan_arr) == 1 - payload = _extract_scan_payload(scan_arr[0]) - return cls(path=path, payload=payload, source=source) - - def __getitem__(self, key: str) -> Pattern: - return self._materialize_pattern(key, persist=True) - - def __iter__(self) -> Iterator[str]: - return iter(self._payload.cell_order) - - def __len__(self) -> int: - return len(self._payload.cell_order) - - def __contains__(self, key: object) -> bool: - return key in self._payload.cells - - def source_order(self) -> tuple[str, ...]: - return tuple(self._payload.cell_order) - - def raw_struct_bytes(self, name: str) -> bytes: - struct_range = self._payload.cells[name].struct_range - return self._source.raw_slice(struct_range.start, struct_range.end) - - def materialize_many( - self, - names: Sequence[str], - *, - persist: bool = True, - ) -> LibraryView: - mats = self._materialize_patterns(names, persist=persist) - return LibraryView(mats) - - def _materialize_patterns( - self, - names: Sequence[str], - *, - persist: bool, - ) -> dict[str, Pattern]: - ordered_names = list(dict.fromkeys(names)) - missing = [name for name in ordered_names if name not in self._payload.cells] - if missing: - raise KeyError(missing[0]) - - materialized: dict[str, Pattern] = {} - uncached = [name for name in ordered_names if name not in self._cache] - if uncached: - ranges = 
numpy.asarray( - [ - [ - self._payload.cells[name].struct_range.start, - self._payload.cells[name].struct_range.end, - ] - for name in uncached - ], - dtype=numpy.uint64, - ) - arrow_arr = gdsii_arrow._read_selected_cells_to_arrow(self._source.data, ranges) - assert len(arrow_arr) == 1 - selected_lib, _info = gdsii_arrow.read_arrow(arrow_arr[0]) - for name in uncached: - pat = selected_lib[name] - materialized[name] = pat - if persist: - self._cache[name] = pat - - for name in ordered_names: - if name in self._cache: - materialized[name] = self._cache[name] - return materialized - - def _materialize_pattern(self, name: str, *, persist: bool) -> Pattern: - return self._materialize_patterns((name,), persist=persist)[name] - - def _raw_children(self, name: str) -> set[str]: - return set(self._payload.cells[name].children) - - def _collect_raw_transforms(self, cell: _CellScan, target_id: int) -> list[NDArray[numpy.float64]]: - refs = self._payload.refs - start = cell.ref_start - stop = cell.ref_stop - if stop <= start: - return [] - - targets = refs.targets[start:stop] - mask = targets == target_id - if not mask.any(): - return [] - - rows: list[NDArray[numpy.float64]] = [] - counts = refs.counts[start:stop] - unit_mask = mask & (counts[:, 0] == 1) & (counts[:, 1] == 1) - if unit_mask.any(): - rows.append(_make_ref_rows( - refs.xy[start:stop][unit_mask], - refs.angle_rad[start:stop][unit_mask], - refs.invert_y[start:stop][unit_mask], - refs.scale[start:stop][unit_mask], - )) - - aref_indices = numpy.nonzero(mask & ~unit_mask)[0] - for idx in aref_indices: - abs_idx = start + int(idx) - rows.append(_expand_aref_row( - xy=refs.xy[abs_idx], - xy0=refs.xy0[abs_idx], - xy1=refs.xy1[abs_idx], - counts=refs.counts[abs_idx], - angle_rad=float(refs.angle_rad[abs_idx]), - invert_y=bool(refs.invert_y[abs_idx]), - scale=float(refs.scale[abs_idx]), - )) - return rows - - def child_graph( - self, - dangling: dangling_mode_t = 'error', - ) -> dict[str, set[str]]: - graph: dict[str, 
set[str]] = {} - for name in self._payload.cell_order: - if name in self._cache: - graph[name] = _pattern_children(self._cache[name]) - else: - graph[name] = self._raw_children(name) - - existing = set(graph) - dangling_refs = set().union(*(children - existing for children in graph.values())) - if dangling == 'error': - if dangling_refs: - raise self._dangling_refs_error(cast('set[str]', dangling_refs), 'building child graph') - return graph - if dangling == 'ignore': - return {name: {child for child in children if child in existing} for name, children in graph.items()} - - for child in dangling_refs: - graph.setdefault(cast('str', child), set()) - return graph - - def parent_graph( - self, - dangling: dangling_mode_t = 'error', - ) -> dict[str, set[str]]: - child_graph = self.child_graph(dangling='include' if dangling == 'include' else 'ignore') - existing = set(self.keys()) - igraph: dict[str, set[str]] = {name: set() for name in child_graph} - for parent, children in child_graph.items(): - for child in children: - if child in existing or dangling == 'include': - igraph.setdefault(child, set()).add(parent) - if dangling == 'error': - raw = self.child_graph(dangling='include') - dangling_refs = set().union(*(children - existing for children in raw.values())) - if dangling_refs: - raise self._dangling_refs_error(cast('set[str]', dangling_refs), 'building parent graph') - return igraph - - def subtree( - self, - tops: str | Sequence[str], - ) -> ILibraryView: - if isinstance(tops, str): - tops = (tops,) - keep = cast('set[str]', self.referenced_patterns(tops) - {None}) - keep |= set(tops) - return self.materialize_many(tuple(keep), persist=True) - - def tops(self) -> list[str]: - graph = self.child_graph(dangling='ignore') - names = set(graph) - not_toplevel: set[str] = set() - for children in graph.values(): - not_toplevel |= children - return list(names - not_toplevel) - - def find_refs_local( - self, - name: str, - parent_graph: dict[str, set[str]] | None = None, 
- dangling: dangling_mode_t = 'error', - ) -> dict[str, list[NDArray[numpy.float64]]]: - instances: dict[str, list[NDArray[numpy.float64]]] = defaultdict(list) - if parent_graph is None: - graph_mode = 'ignore' if dangling == 'ignore' else 'include' - parent_graph = self.parent_graph(dangling=graph_mode) - - if name not in self: - if name not in parent_graph: - return instances - if dangling == 'error': - raise self._dangling_refs_error({name}, f'finding local refs for {name!r}') - if dangling == 'ignore': - return instances - - target_id = self._payload.cells.get(name) - for parent in parent_graph.get(name, set()): - if parent in self._cache: - for ref in self._cache[parent].refs.get(name, []): - instances[parent].append(ref.as_transforms()) - continue - - if target_id is None or parent not in self._payload.cells: - continue - rows = self._collect_raw_transforms(self._payload.cells[parent], target_id.cell_id) - if rows: - instances[parent].extend(rows) - return instances - - def find_refs_global( - self, - name: str, - order: list[str] | None = None, - parent_graph: dict[str, set[str]] | None = None, - dangling: dangling_mode_t = 'error', - ) -> dict[tuple[str, ...], NDArray[numpy.float64]]: - graph_mode = 'ignore' if dangling == 'ignore' else 'include' - if order is None: - order = self.child_order(dangling=graph_mode) - if parent_graph is None: - parent_graph = self.parent_graph(dangling=graph_mode) - - if name not in self: - if name not in parent_graph: - return {} - if dangling == 'error': - raise self._dangling_refs_error({name}, f'finding global refs for {name!r}') - if dangling == 'ignore': - return {} - - self_keys = set(self.keys()) - transforms: dict[str, list[tuple[tuple[str, ...], NDArray[numpy.float64]]]] - transforms = defaultdict(list) - for parent, vals in self.find_refs_local(name, parent_graph=parent_graph, dangling=dangling).items(): - transforms[parent] = [((name,), numpy.concatenate(vals))] - - for next_name in order: - if next_name not in 
transforms: - continue - if not parent_graph.get(next_name, set()) & self_keys: - continue - - outers = self.find_refs_local(next_name, parent_graph=parent_graph, dangling=dangling) - inners = transforms.pop(next_name) - for parent, outer in outers.items(): - outer_tf = numpy.concatenate(outer) - for path, inner in inners: - combined = apply_transforms(outer_tf, inner) - transforms[parent].append(((next_name,) + path, combined)) - - result = {} - for parent, targets in transforms.items(): - for path, instances in targets: - full_path = (parent,) + path - result[full_path] = instances - return result - - -class OverlayLibrary(ILibrary): - """ - Mutable overlay over one or more source libraries. - - Source-backed cells remain lazy until accessed through `__getitem__`, at - which point that visible cell is promoted into an overlay-owned materialized - `Pattern`. - """ - - def __init__(self) -> None: - self._layers: list[_SourceLayer] = [] - self._entries: dict[str, Pattern | _SourceEntry] = {} - self._order: list[str] = [] - self._target_remap: dict[str, str] = {} - - def __iter__(self) -> Iterator[str]: - return (name for name in self._order if name in self._entries) - - def __len__(self) -> int: - return len(self._entries) - - def __contains__(self, key: object) -> bool: - return key in self._entries - - def __getitem__(self, key: str) -> Pattern: - return self._materialize_pattern(key, persist=True) - - def __setitem__( - self, - key: str, - value: Pattern | Callable[[], Pattern], - ) -> None: - if key in self._entries: - raise LibraryError(f'"{key}" already exists in the library. 
Overwriting is not allowed!') - pattern = value() if callable(value) else value - self._entries[key] = pattern - if key not in self._order: - self._order.append(key) - - def __delitem__(self, key: str) -> None: - if key not in self._entries: - raise KeyError(key) - del self._entries[key] - - def _merge(self, key_self: str, other: Mapping[str, Pattern], key_other: str) -> None: - self[key_self] = copy.deepcopy(other[key_other]) - - def add_source( - self, - source: Mapping[str, Pattern] | ILibraryView, - *, - rename_theirs: Callable[[ILibraryView, str], str] | None = None, - ) -> dict[str, str]: - view = _coerce_library_view(source) - source_order = _source_order(view) - child_graph = view.child_graph(dangling='include') - - source_to_visible: dict[str, str] = {} - visible_to_source: dict[str, str] = {} - rename_map: dict[str, str] = {} - - for name in source_order: - visible = name - if visible in self._entries or visible in visible_to_source: - if rename_theirs is None: - raise LibraryError(f'Conflicting name while adding source: {name!r}') - visible = rename_theirs(self, name) - if visible in self._entries or visible in visible_to_source: - raise LibraryError(f'Unresolved duplicate key encountered while adding source: {name!r} -> {visible!r}') - rename_map[name] = visible - source_to_visible[name] = visible - visible_to_source[visible] = name - - layer = _SourceLayer( - library=view, - source_to_visible=source_to_visible, - visible_to_source=visible_to_source, - child_graph=child_graph, - order=[source_to_visible[name] for name in source_order], - ) - layer_index = len(self._layers) - self._layers.append(layer) - - for source_name, visible_name in source_to_visible.items(): - self._entries[visible_name] = _SourceEntry(layer_index=layer_index, source_name=source_name) - if visible_name not in self._order: - self._order.append(visible_name) - - return rename_map - - def rename( - self, - old_name: str, - new_name: str, - move_references: bool = False, - ) -> 
OverlayLibrary: - if old_name not in self._entries: - raise LibraryError(f'"{old_name}" does not exist in the library.') - if old_name == new_name: - return self - if new_name in self._entries: - raise LibraryError(f'"{new_name}" already exists in the library.') - - entry = self._entries.pop(old_name) - self._entries[new_name] = entry - if isinstance(entry, _SourceEntry): - layer = self._layers[entry.layer_index] - layer.source_to_visible[entry.source_name] = new_name - del layer.visible_to_source[old_name] - layer.visible_to_source[new_name] = entry.source_name - - idx = self._order.index(old_name) - self._order[idx] = new_name - - if move_references: - self.move_references(old_name, new_name) - return self - - def _resolve_target(self, target: str) -> str: - seen: set[str] = set() - current = target - while current in self._target_remap: - if current in seen: - raise LibraryError(f'Cycle encountered while resolving target remap for {target!r}') - seen.add(current) - current = self._target_remap[current] - return current - - def _set_target_remap(self, old_target: str, new_target: str) -> None: - resolved_new = self._resolve_target(new_target) - if resolved_new == old_target: - raise LibraryError(f'Ref target remap would create a cycle: {old_target!r} -> {new_target!r}') - self._target_remap[old_target] = resolved_new - for key in list(self._target_remap): - self._target_remap[key] = self._resolve_target(self._target_remap[key]) - - def move_references(self, old_target: str, new_target: str) -> OverlayLibrary: - if old_target == new_target: - return self - self._set_target_remap(old_target, new_target) - for entry in list(self._entries.values()): - if isinstance(entry, Pattern) and old_target in entry.refs: - entry.refs[new_target].extend(entry.refs[old_target]) - del entry.refs[old_target] - return self - - def _effective_target(self, layer: _SourceLayer, target: str) -> str: - visible = layer.source_to_visible.get(target, target) - return 
self._resolve_target(visible) - - def _materialize_pattern(self, name: str, *, persist: bool) -> Pattern: - if name not in self._entries: - raise KeyError(name) - entry = self._entries[name] - if isinstance(entry, Pattern): - return entry - - layer = self._layers[entry.layer_index] - source_pat = layer.library[entry.source_name].deepcopy() - remap = lambda target: None if target is None else self._effective_target(layer, target) - pat = _remap_pattern_targets(source_pat, remap) - if persist: - self._entries[name] = pat - return pat - - def child_graph( - self, - dangling: dangling_mode_t = 'error', - ) -> dict[str, set[str]]: - graph: dict[str, set[str]] = {} - for name in self._order: - if name not in self._entries: - continue - entry = self._entries[name] - if isinstance(entry, Pattern): - graph[name] = _pattern_children(entry) - continue - layer = self._layers[entry.layer_index] - children = {self._effective_target(layer, child) for child in layer.child_graph.get(entry.source_name, set())} - graph[name] = children - - existing = set(graph) - dangling_refs = set().union(*(children - existing for children in graph.values())) - if dangling == 'error': - if dangling_refs: - raise self._dangling_refs_error(cast('set[str]', dangling_refs), 'building child graph') - return graph - if dangling == 'ignore': - return {name: {child for child in children if child in existing} for name, children in graph.items()} - - for child in dangling_refs: - graph.setdefault(cast('str', child), set()) - return graph - - def parent_graph( - self, - dangling: dangling_mode_t = 'error', - ) -> dict[str, set[str]]: - child_graph = self.child_graph(dangling='include' if dangling == 'include' else 'ignore') - existing = set(self.keys()) - igraph: dict[str, set[str]] = {name: set() for name in child_graph} - for parent, children in child_graph.items(): - for child in children: - if child in existing or dangling == 'include': - igraph.setdefault(child, set()).add(parent) - if dangling == 
'error': - raw = self.child_graph(dangling='include') - dangling_refs = set().union(*(children - existing for children in raw.values())) - if dangling_refs: - raise self._dangling_refs_error(cast('set[str]', dangling_refs), 'building parent graph') - return igraph - - def subtree( - self, - tops: str | Sequence[str], - ) -> ILibraryView: - if isinstance(tops, str): - tops = (tops,) - keep = cast('set[str]', self.referenced_patterns(tops) - {None}) - keep |= set(tops) - return LibraryView({name: self[name] for name in keep}) - - def find_refs_local( - self, - name: str, - parent_graph: dict[str, set[str]] | None = None, - dangling: dangling_mode_t = 'error', - ) -> dict[str, list[NDArray[numpy.float64]]]: - instances: dict[str, list[NDArray[numpy.float64]]] = defaultdict(list) - if parent_graph is None: - graph_mode = 'ignore' if dangling == 'ignore' else 'include' - parent_graph = self.parent_graph(dangling=graph_mode) - - if name not in self: - if name not in parent_graph: - return instances - if dangling == 'error': - raise self._dangling_refs_error({name}, f'finding local refs for {name!r}') - if dangling == 'ignore': - return instances - - for parent in parent_graph.get(name, set()): - pat = self._materialize_pattern(parent, persist=False) - for ref in pat.refs.get(name, []): - instances[parent].append(ref.as_transforms()) - return instances - - def find_refs_global( - self, - name: str, - order: list[str] | None = None, - parent_graph: dict[str, set[str]] | None = None, - dangling: dangling_mode_t = 'error', - ) -> dict[tuple[str, ...], NDArray[numpy.float64]]: - graph_mode = 'ignore' if dangling == 'ignore' else 'include' - if order is None: - order = self.child_order(dangling=graph_mode) - if parent_graph is None: - parent_graph = self.parent_graph(dangling=graph_mode) - - if name not in self: - if name not in parent_graph: - return {} - if dangling == 'error': - raise self._dangling_refs_error({name}, f'finding global refs for {name!r}') - if dangling == 
'ignore': - return {} - - self_keys = set(self.keys()) - transforms: dict[str, list[tuple[tuple[str, ...], NDArray[numpy.float64]]]] - transforms = defaultdict(list) - for parent, vals in self.find_refs_local(name, parent_graph=parent_graph, dangling=dangling).items(): - transforms[parent] = [((name,), numpy.concatenate(vals))] - - for next_name in order: - if next_name not in transforms: - continue - if not parent_graph.get(next_name, set()) & self_keys: - continue - - outers = self.find_refs_local(next_name, parent_graph=parent_graph, dangling=dangling) - inners = transforms.pop(next_name) - for parent, outer in outers.items(): - outer_tf = numpy.concatenate(outer) - for path, inner in inners: - combined = apply_transforms(outer_tf, inner) - transforms[parent].append(((next_name,) + path, combined)) - - result = {} - for parent, targets in transforms.items(): - for path, instances in targets: - result[(parent,) + path] = instances - return result - - def source_order(self) -> tuple[str, ...]: - return tuple(name for name in self._order if name in self._entries) - - -def readfile( - filename: str | pathlib.Path, - ) -> tuple[ArrowLibrary, dict[str, Any]]: - lib = ArrowLibrary.from_file(filename) - return lib, lib.library_info - - -def load_libraryfile( - filename: str | pathlib.Path, - ) -> tuple[ArrowLibrary, dict[str, Any]]: - return readfile(filename) - - -def _get_write_info( - library: Mapping[str, Pattern] | ILibraryView, - *, - meters_per_unit: float | None, - logical_units_per_unit: float | None, - library_name: str | None, - ) -> tuple[float, float, str]: - if meters_per_unit is not None and logical_units_per_unit is not None and library_name is not None: - return meters_per_unit, logical_units_per_unit, library_name - - infos: list[dict[str, Any]] = [] - if isinstance(library, ArrowLibrary): - infos.append(library.library_info) - elif isinstance(library, OverlayLibrary): - for layer in library._layers: - if isinstance(layer.library, ArrowLibrary): - 
infos.append(layer.library.library_info) - - if infos: - unit_pairs = {(info['meters_per_unit'], info['logical_units_per_unit']) for info in infos} - if len(unit_pairs) > 1: - raise LibraryError('Merged lazy GDS sources must have identical units before writing') - info = infos[0] - meters = info['meters_per_unit'] if meters_per_unit is None else meters_per_unit - logical = info['logical_units_per_unit'] if logical_units_per_unit is None else logical_units_per_unit - name = info['name'] if library_name is None else library_name - return meters, logical, name - - if meters_per_unit is None or logical_units_per_unit is None or library_name is None: - raise LibraryError('meters_per_unit, logical_units_per_unit, and library_name are required for non-GDS-backed lazy writes') - return meters_per_unit, logical_units_per_unit, library_name - - -def _can_copy_arrow_cell(library: ArrowLibrary, name: str) -> bool: - return name not in library._cache - - -def _can_copy_overlay_cell(library: OverlayLibrary, name: str, entry: _SourceEntry) -> bool: - layer = library._layers[entry.layer_index] - if not isinstance(layer.library, ArrowLibrary): - return False - if name != entry.source_name: - return False - children = layer.child_graph.get(entry.source_name, set()) - return all(library._effective_target(layer, child) == child for child in children) - - -def _write_pattern_struct(stream: IO[bytes], name: str, pat: Pattern) -> None: - elements: list[klamath.elements.Element] = [] - elements += gdsii._shapes_to_elements(pat.shapes) - elements += gdsii._labels_to_texts(pat.labels) - elements += gdsii._mrefs_to_grefs(pat.refs) - klamath.library.write_struct(stream, name=name.encode('ASCII'), elements=elements) - - -def write( - library: Mapping[str, Pattern] | ILibraryView, - stream: IO[bytes], - *, - meters_per_unit: float | None = None, - logical_units_per_unit: float | None = None, - library_name: str | None = None, - ) -> None: - meters_per_unit, logical_units_per_unit, library_name 
= _get_write_info( - library, - meters_per_unit=meters_per_unit, - logical_units_per_unit=logical_units_per_unit, - library_name=library_name, - ) - - header = klamath.library.FileHeader( - name=library_name.encode('ASCII'), - user_units_per_db_unit=logical_units_per_unit, - meters_per_db_unit=meters_per_unit, - ) - header.write(stream) - - if isinstance(library, ArrowLibrary): - for name in library.source_order(): - if _can_copy_arrow_cell(library, name): - stream.write(library.raw_struct_bytes(name)) - else: - _write_pattern_struct(stream, name, library._materialize_pattern(name, persist=False)) - klamath.records.ENDLIB.write(stream, None) - return - - if isinstance(library, OverlayLibrary): - for name in library.source_order(): - entry = library._entries[name] - if isinstance(entry, _SourceEntry) and _can_copy_overlay_cell(library, name, entry): - layer = library._layers[entry.layer_index] - assert isinstance(layer.library, ArrowLibrary) - stream.write(layer.library.raw_struct_bytes(entry.source_name)) - else: - _write_pattern_struct(stream, name, library._materialize_pattern(name, persist=False)) - klamath.records.ENDLIB.write(stream, None) - return - - gdsii.write(cast('Mapping[str, Pattern]', library), stream, meters_per_unit, logical_units_per_unit, library_name) - - -def writefile( - library: Mapping[str, Pattern] | ILibraryView, - filename: str | pathlib.Path, - *, - meters_per_unit: float | None = None, - logical_units_per_unit: float | None = None, - library_name: str | None = None, - ) -> None: - path = pathlib.Path(filename) - - with tmpfile(path) as base_stream: - streams: tuple[Any, ...] 
= (base_stream,) - if path.suffix == '.gz': - stream = cast('IO[bytes]', gzip.GzipFile(filename='', mtime=0, fileobj=base_stream, mode='wb', compresslevel=6)) - streams = (stream,) + streams - else: - stream = base_stream - - try: - write( - library, - stream, - meters_per_unit=meters_per_unit, - logical_units_per_unit=logical_units_per_unit, - library_name=library_name, - ) - finally: - for ss in streams: - ss.close() diff --git a/masque/file/gdsii_perf.py b/masque/file/gdsii_perf.py deleted file mode 100644 index 38d1a7d..0000000 --- a/masque/file/gdsii_perf.py +++ /dev/null @@ -1,633 +0,0 @@ -""" -Synthetic GDS fixture generation for reader/writer performance testing. - -The presets here are intentionally hierarchical and deterministic. They aim to -approximate a pair of real-world layout families discussed during GDS reader and -writer work: - -* `many_cells`: tens of thousands of cells, moderate reference count, very heavy - box usage after flattening, and moderate polygon density. -* `many_instances`: a much smaller cell library with very high reference count, - similar box density, and far fewer polygons. - -Fixtures are written by streaming structures through `klamath` directly so large -benchmark files can be produced without first materializing an equally large -`masque.Library` in Python. 
-""" -from __future__ import annotations - -from dataclasses import asdict, dataclass -from pathlib import Path -from typing import Any -import argparse -import json -import math - -import numpy -import klamath -from klamath import elements - - -EMPTY_PROPERTIES: dict[int, bytes] = {} -METERS_PER_DB_UNIT = 1e-9 -USER_UNITS_PER_DB_UNIT = 1e-3 -TOTAL_LAYERS = 200 - - -@dataclass(frozen=True) -class FixturePreset: - name: str - total_layers: int - box_layers: int - heavy_box_layers: int - polygon_layers: int - box_cells: int - poly_cells: int - box_wrappers: int - poly_wrappers: int - box_clusters: int - poly_clusters: int - box_cluster_refs: int - poly_cluster_refs: int - top_direct_box_refs: int - top_direct_poly_refs: int - heavy_boxes_per_cell: int - regular_boxes_per_cell: int - polygons_per_cell: int - path_stride: int - text_stride: int - box_cluster_array: tuple[int, int] - top_box_array: tuple[int, int] - poly_cluster_array: tuple[int, int] - top_poly_array: tuple[int, int] - rare_annotation_stride: int - - -PRESETS: dict[str, FixturePreset] = { - 'many_cells': FixturePreset( - name='many_cells', - total_layers=TOTAL_LAYERS, - box_layers=20, - heavy_box_layers=3, - polygon_layers=20, - box_cells=17_000, - poly_cells=6_000, - box_wrappers=18_000, - poly_wrappers=6_000, - box_clusters=2_000, - poly_clusters=999, - box_cluster_refs=24, - poly_cluster_refs=16, - top_direct_box_refs=21_000, - top_direct_poly_refs=7_000, - heavy_boxes_per_cell=6, - regular_boxes_per_cell=2, - polygons_per_cell=50, - path_stride=2, - text_stride=3, - box_cluster_array=(24, 16), - top_box_array=(8, 8), - poly_cluster_array=(4, 2), - top_poly_array=(3, 2), - rare_annotation_stride=1_250, - ), - 'many_instances': FixturePreset( - name='many_instances', - total_layers=TOTAL_LAYERS, - box_layers=25, - heavy_box_layers=3, - polygon_layers=10, - box_cells=2_500, - poly_cells=500, - box_wrappers=1_000, - poly_wrappers=500, - box_clusters=1_000, - poly_clusters=499, - box_cluster_refs=1_200, 
- poly_cluster_refs=400, - top_direct_box_refs=102_001, - top_direct_poly_refs=0, - heavy_boxes_per_cell=40, - regular_boxes_per_cell=16, - polygons_per_cell=60, - path_stride=1, - text_stride=2, - box_cluster_array=(1, 1), - top_box_array=(1, 1), - poly_cluster_array=(1, 1), - top_poly_array=(1, 1), - rare_annotation_stride=250, - ), - } - - -@dataclass(frozen=True) -class FixtureManifest: - preset: str - scale: float - gds_path: str - library_name: str - cells: int - refs: int - layers: int - box_layers: int - heavy_box_layers: list[list[int]] - polygon_layers: list[list[int]] - hierarchical_boxes_per_heavy_layer: int - hierarchical_boxes_per_regular_layer: int - hierarchical_polygons_total: int - hierarchical_paths_total: int - hierarchical_texts_total: int - flattened_box_placements: int - flattened_poly_placements: int - estimated_flat_boxes_per_heavy_layer: int - estimated_flat_polygons_per_active_polygon_layer: int - - -def _scaled_count(value: int, scale: float, minimum: int = 0) -> int: - if value == 0: - return 0 - scaled = int(math.ceil(value * scale)) - return max(minimum, scaled) - - -def _scaled_preset(preset: FixturePreset, scale: float) -> FixturePreset: - if scale <= 0: - raise ValueError(f'scale must be positive, got {scale!r}') - - return FixturePreset( - name=preset.name, - total_layers=preset.total_layers, - box_layers=min(preset.box_layers, preset.total_layers), - heavy_box_layers=min(preset.heavy_box_layers, preset.box_layers), - polygon_layers=min(preset.polygon_layers, preset.total_layers), - box_cells=_scaled_count(preset.box_cells, scale, minimum=1), - poly_cells=_scaled_count(preset.poly_cells, scale, minimum=1), - box_wrappers=_scaled_count(preset.box_wrappers, scale), - poly_wrappers=_scaled_count(preset.poly_wrappers, scale), - box_clusters=_scaled_count(preset.box_clusters, scale, minimum=1), - poly_clusters=_scaled_count(preset.poly_clusters, scale, minimum=1), - box_cluster_refs=_scaled_count(preset.box_cluster_refs, scale, 
minimum=1), - poly_cluster_refs=_scaled_count(preset.poly_cluster_refs, scale, minimum=1), - top_direct_box_refs=_scaled_count(preset.top_direct_box_refs, scale), - top_direct_poly_refs=_scaled_count(preset.top_direct_poly_refs, scale), - heavy_boxes_per_cell=max(1, preset.heavy_boxes_per_cell), - regular_boxes_per_cell=max(1, preset.regular_boxes_per_cell), - polygons_per_cell=max(1, preset.polygons_per_cell), - path_stride=max(1, preset.path_stride), - text_stride=max(1, preset.text_stride), - box_cluster_array=preset.box_cluster_array, - top_box_array=preset.top_box_array, - poly_cluster_array=preset.poly_cluster_array, - top_poly_array=preset.top_poly_array, - rare_annotation_stride=max(1, _scaled_count(preset.rare_annotation_stride, scale, minimum=1)), - ) - - -def _rect_xy(xmin: int, ymin: int, xmax: int, ymax: int) -> numpy.ndarray[Any, numpy.dtype[numpy.int32]]: - return numpy.array( - [[xmin, ymin], [xmin, ymax], [xmax, ymax], [xmax, ymin], [xmin, ymin]], - dtype=numpy.int32, - ) - - -def _poly_xy(points: list[tuple[int, int]]) -> numpy.ndarray[Any, numpy.dtype[numpy.int32]]: - closed = points + [points[0]] - return numpy.array(closed, dtype=numpy.int32) - - -def _sref( - target: str, - xy: tuple[int, int], - properties: dict[int, bytes] | None = None, - ) -> elements.Reference: - return klamath.library.Reference( - struct_name=target.encode('ASCII'), - invert_y=False, - mag=1.0, - angle_deg=0.0, - xy=numpy.array([xy], dtype=numpy.int32), - colrow=None, - properties=EMPTY_PROPERTIES if properties is None else properties, - ) - - -def _aref( - target: str, - origin: tuple[int, int], - counts: tuple[int, int], - step: tuple[int, int], - properties: dict[int, bytes] | None = None, - ) -> elements.Reference: - cols, rows = counts - dx, dy = step - xy = numpy.array( - [ - origin, - (origin[0] + cols * dx, origin[1]), - (origin[0], origin[1] + rows * dy), - ], - dtype=numpy.int32, - ) - return klamath.library.Reference( - struct_name=target.encode('ASCII'), - 
invert_y=False, - mag=1.0, - angle_deg=0.0, - xy=xy, - colrow=(cols, rows), - properties=EMPTY_PROPERTIES if properties is None else properties, - ) - - -def _annotation(index: int) -> dict[int, bytes]: - return {1: f'perf-{index}'.encode('ASCII')} - - -def _make_box_cell(name: str, index: int, cfg: FixturePreset) -> list[elements.Element]: - cell_elements: list[elements.Element] = [] - xbase = (index % 17) * 600 - ybase = (index // 17) * 180 - - for layer in range(cfg.heavy_box_layers): - for box_idx in range(cfg.heavy_boxes_per_cell): - x0 = xbase + box_idx * 22 - y0 = ybase + layer * 40 - width = 10 + ((index + box_idx + layer) % 7) * 6 - height = 10 + ((index * 3 + box_idx + layer) % 5) * 8 - properties = _annotation(index) if index % cfg.rare_annotation_stride == 0 and box_idx == 0 and layer == 0 else EMPTY_PROPERTIES - cell_elements.append(elements.Boundary( - layer=(layer, 0), - xy=_rect_xy(x0, y0, x0 + width, y0 + height), - properties=properties, - )) - - for layer in range(cfg.heavy_box_layers, cfg.box_layers): - for box_idx in range(cfg.regular_boxes_per_cell): - x0 = xbase + box_idx * 38 - y0 = ybase + (layer - cfg.heavy_box_layers) * 28 + 400 - width = 18 + ((index + layer + box_idx) % 9) * 4 - height = 12 + ((index + 2 * layer + box_idx) % 6) * 5 - cell_elements.append(elements.Boundary( - layer=(layer, 0), - xy=_rect_xy(x0, y0, x0 + width, y0 + height), - properties=EMPTY_PROPERTIES, - )) - - return cell_elements - - -def _make_poly_cell(name: str, index: int, cfg: FixturePreset) -> list[elements.Element]: - cell_elements: list[elements.Element] = [] - xbase = (index % 19) * 900 - ybase = (index // 19) * 260 - - for poly_idx in range(cfg.polygons_per_cell): - layer = poly_idx % cfg.polygon_layers - dx = xbase + (poly_idx % 5) * 120 - dy = ybase + (poly_idx // 5) * 80 - size = 18 + ((index + poly_idx + layer) % 11) * 7 - points = [ - (dx, dy), - (dx + size, dy + size // 5), - (dx + size + size // 3, dy + size), - (dx + size // 2, dy + size + size // 
2), - (dx - size // 4, dy + size // 2), - ] - properties = _annotation(index) if poly_idx == 0 and index % cfg.rare_annotation_stride == 0 else EMPTY_PROPERTIES - cell_elements.append(elements.Boundary( - layer=(layer, 0), - xy=_poly_xy(points), - properties=properties, - )) - - if index % cfg.path_stride == 0: - layer = index % cfg.polygon_layers - cell_elements.append(elements.Path( - layer=(layer, 1), - path_type=2, - width=12 + (index % 5) * 4, - extension=(0, 0), - xy=numpy.array( - [ - [xbase, ybase + 900], - [xbase + 240, ybase + 930], - [xbase + 420, ybase + 960], - ], - dtype=numpy.int32, - ), - properties=EMPTY_PROPERTIES, - )) - - if index % cfg.text_stride == 0: - layer = index % cfg.polygon_layers - properties = _annotation(index) if index % cfg.rare_annotation_stride == 0 else EMPTY_PROPERTIES - cell_elements.append(elements.Text( - layer=(layer, 2), - presentation=0, - path_type=0, - width=0, - invert_y=False, - mag=1.0, - angle_deg=0.0, - xy=numpy.array([[xbase + 64, ybase + 1536]], dtype=numpy.int32), - string=f'T{index:05d}'.encode('ASCII'), - properties=properties, - )) - - return cell_elements - - -def _write_struct(stream: Any, name: str, cell_elements: list[elements.Element]) -> None: - klamath.library.write_struct(stream, name=name.encode('ASCII'), elements=cell_elements) - - -def _box_name(index: int) -> str: - return f'box_{index:05d}' - - -def _poly_name(index: int) -> str: - return f'poly_{index:05d}' - - -def _box_wrapper_name(index: int) -> str: - return f'box_wrap_{index:05d}' - - -def _poly_wrapper_name(index: int) -> str: - return f'poly_wrap_{index:05d}' - - -def _box_cluster_name(index: int) -> str: - return f'box_cluster_{index:05d}' - - -def _poly_cluster_name(index: int) -> str: - return f'poly_cluster_{index:05d}' - - -def _write_box_cells(stream: Any, cfg: FixturePreset) -> None: - for idx in range(cfg.box_cells): - _write_struct(stream, _box_name(idx), _make_box_cell(_box_name(idx), idx, cfg)) - - -def 
_write_poly_cells(stream: Any, cfg: FixturePreset) -> None: - for idx in range(cfg.poly_cells): - _write_struct(stream, _poly_name(idx), _make_poly_cell(_poly_name(idx), idx, cfg)) - - -def _write_wrappers(stream: Any, cfg: FixturePreset) -> None: - for idx in range(cfg.box_wrappers): - target = _box_name(idx % cfg.box_cells) - origin = ((idx % 97) * 2_000, (idx // 97) * 2_000) - _write_struct(stream, _box_wrapper_name(idx), [_sref(target, origin)]) - - for idx in range(cfg.poly_wrappers): - target = _poly_name(idx % cfg.poly_cells) - origin = ((idx % 61) * 3_200, (idx // 61) * 3_200) - _write_struct(stream, _poly_wrapper_name(idx), [_sref(target, origin)]) - - -def _write_box_clusters(stream: Any, cfg: FixturePreset) -> None: - array_refs = min(cfg.box_cluster_refs, max(1, (3 * cfg.box_cluster_refs) // 4)) - for idx in range(cfg.box_clusters): - cell_elements: list[elements.Element] = [] - for ref_idx in range(cfg.box_cluster_refs): - target = _box_name((idx * cfg.box_cluster_refs + ref_idx) % cfg.box_cells) - origin = ( - (ref_idx % 6) * 48_000, - (ref_idx // 6) * 48_000, - ) - if ref_idx < array_refs: - cell_elements.append(_aref(target, origin, cfg.box_cluster_array, (720, 900))) - else: - cell_elements.append(_sref(target, origin)) - _write_struct(stream, _box_cluster_name(idx), cell_elements) - - -def _write_poly_clusters(stream: Any, cfg: FixturePreset) -> None: - array_refs = min(cfg.poly_cluster_refs, cfg.poly_cluster_refs // 2) - for idx in range(cfg.poly_clusters): - cell_elements: list[elements.Element] = [] - for ref_idx in range(cfg.poly_cluster_refs): - target = _poly_name((idx * cfg.poly_cluster_refs + ref_idx) % cfg.poly_cells) - origin = ( - (ref_idx % 10) * 96_000, - (ref_idx // 10) * 96_000, - ) - if ref_idx < array_refs: - cell_elements.append(_aref(target, origin, cfg.poly_cluster_array, (12_000, 8_500))) - else: - cell_elements.append(_sref(target, origin)) - _write_struct(stream, _poly_cluster_name(idx), cell_elements) - - -def 
_top_box_refs(cfg: FixturePreset) -> list[elements.Reference]: - refs: list[elements.Reference] = [] - - for idx in range(cfg.box_wrappers): - refs.append(_sref( - _box_wrapper_name(idx), - ((idx % 240) * 240_000, (idx // 240) * 240_000), - )) - - for idx in range(cfg.box_clusters): - refs.append(_sref( - _box_cluster_name(idx), - ((idx % 100) * 800_000, (idx // 100) * 800_000 + 14_000_000), - )) - - for idx in range(cfg.top_direct_box_refs): - target = _box_name(idx % cfg.box_cells) - origin = ( - (idx % 150) * 160_000, - (idx // 150) * 160_000 + 26_000_000, - ) - if cfg.top_box_array == (1, 1): - refs.append(_sref(target, origin)) - else: - refs.append(_aref(target, origin, cfg.top_box_array, (1_100, 1_350))) - - return refs - - -def _top_poly_refs(cfg: FixturePreset) -> list[elements.Reference]: - refs: list[elements.Reference] = [] - - for idx in range(cfg.poly_wrappers): - refs.append(_sref( - _poly_wrapper_name(idx), - ((idx % 180) * 360_000, (idx // 180) * 360_000 + 44_000_000), - )) - - for idx in range(cfg.poly_clusters): - refs.append(_sref( - _poly_cluster_name(idx), - ((idx % 70) * 1_100_000, (idx // 70) * 1_100_000 + 58_000_000), - )) - - for idx in range(cfg.top_direct_poly_refs): - target = _poly_name(idx % cfg.poly_cells) - origin = ( - (idx % 110) * 420_000, - (idx // 110) * 420_000 + 72_000_000, - ) - if cfg.top_poly_array == (1, 1): - refs.append(_sref(target, origin)) - else: - refs.append(_aref(target, origin, cfg.top_poly_array, (16_000, 14_000))) - - return refs - - -def _write_top(stream: Any, cfg: FixturePreset) -> None: - cell_elements: list[elements.Element] = [] - cell_elements.extend(_top_box_refs(cfg)) - cell_elements.extend(_top_poly_refs(cfg)) - _write_struct(stream, 'TOP', cell_elements) - - -def _poly_paths_total(cfg: FixturePreset) -> int: - return (cfg.poly_cells - 1) // cfg.path_stride + 1 - - -def _poly_texts_total(cfg: FixturePreset) -> int: - return (cfg.poly_cells - 1) // cfg.text_stride + 1 - - -def 
_ref_instances_per_box_cluster(cfg: FixturePreset) -> int: - array_refs = min(cfg.box_cluster_refs, max(1, (3 * cfg.box_cluster_refs) // 4)) - array_mult = cfg.box_cluster_array[0] * cfg.box_cluster_array[1] - return array_refs * array_mult + (cfg.box_cluster_refs - array_refs) - - -def _ref_instances_per_poly_cluster(cfg: FixturePreset) -> int: - array_refs = min(cfg.poly_cluster_refs, cfg.poly_cluster_refs // 2) - array_mult = cfg.poly_cluster_array[0] * cfg.poly_cluster_array[1] - return array_refs * array_mult + (cfg.poly_cluster_refs - array_refs) - - -def fixture_manifest(path: str | Path, preset: str, scale: float = 1.0) -> FixtureManifest: - base = PRESETS[preset] - cfg = _scaled_preset(base, scale) - - flattened_box_placements = ( - cfg.box_wrappers - + cfg.box_clusters * _ref_instances_per_box_cluster(cfg) - + cfg.top_direct_box_refs * cfg.top_box_array[0] * cfg.top_box_array[1] - ) - flattened_poly_placements = ( - cfg.poly_wrappers - + cfg.poly_clusters * _ref_instances_per_poly_cluster(cfg) - + cfg.top_direct_poly_refs * cfg.top_poly_array[0] * cfg.top_poly_array[1] - ) - polygon_layers = max(1, cfg.polygon_layers) - polys_per_layer = (cfg.poly_cells * cfg.polygons_per_cell) // polygon_layers - - return FixtureManifest( - preset=cfg.name, - scale=scale, - gds_path=str(Path(path)), - library_name=f'masque-perf-{cfg.name}', - cells=cfg.box_cells + cfg.poly_cells + cfg.box_wrappers + cfg.poly_wrappers + cfg.box_clusters + cfg.poly_clusters + 1, - refs=( - cfg.box_wrappers - + cfg.poly_wrappers - + cfg.box_clusters * cfg.box_cluster_refs - + cfg.poly_clusters * cfg.poly_cluster_refs - + cfg.box_wrappers + cfg.poly_wrappers + cfg.box_clusters + cfg.poly_clusters - + cfg.top_direct_box_refs + cfg.top_direct_poly_refs - ), - layers=cfg.total_layers, - box_layers=cfg.box_layers, - heavy_box_layers=[[layer, 0] for layer in range(cfg.heavy_box_layers)], - polygon_layers=[[layer, 0] for layer in range(cfg.polygon_layers)], - 
hierarchical_boxes_per_heavy_layer=cfg.box_cells * cfg.heavy_boxes_per_cell, - hierarchical_boxes_per_regular_layer=cfg.box_cells * cfg.regular_boxes_per_cell, - hierarchical_polygons_total=cfg.poly_cells * cfg.polygons_per_cell, - hierarchical_paths_total=_poly_paths_total(cfg), - hierarchical_texts_total=_poly_texts_total(cfg), - flattened_box_placements=flattened_box_placements, - flattened_poly_placements=flattened_poly_placements, - estimated_flat_boxes_per_heavy_layer=flattened_box_placements * cfg.heavy_boxes_per_cell, - estimated_flat_polygons_per_active_polygon_layer=flattened_poly_placements * polys_per_layer // cfg.poly_cells if cfg.poly_cells else 0, - ) - - -def write_fixture( - path: str | Path, - *, - preset: str, - scale: float = 1.0, - write_manifest: bool = True, - ) -> FixtureManifest: - if preset not in PRESETS: - known = ', '.join(sorted(PRESETS)) - raise KeyError(f'unknown preset {preset!r}; expected one of: {known}') - - manifest = fixture_manifest(path, preset, scale) - cfg = _scaled_preset(PRESETS[preset], scale) - output = Path(path) - output.parent.mkdir(parents=True, exist_ok=True) - - with output.open('wb') as stream: - header = klamath.library.FileHeader( - name=manifest.library_name.encode('ASCII'), - user_units_per_db_unit=USER_UNITS_PER_DB_UNIT, - meters_per_db_unit=METERS_PER_DB_UNIT, - ) - header.write(stream) - _write_box_cells(stream, cfg) - _write_poly_cells(stream, cfg) - _write_wrappers(stream, cfg) - _write_box_clusters(stream, cfg) - _write_poly_clusters(stream, cfg) - _write_top(stream, cfg) - klamath.records.ENDLIB.write(stream, None) - - if write_manifest: - manifest_path = output.with_suffix(output.suffix + '.json') - manifest_path.write_text(json.dumps(asdict(manifest), indent=2, sort_keys=True) + '\n') - - return manifest - - -def build_arg_parser() -> argparse.ArgumentParser: - parser = argparse.ArgumentParser(description='Generate synthetic GDS fixtures for GDS reader/writer performance work.') - 
parser.add_argument( - 'preset', - nargs='?', - default='many_cells', - choices=sorted(PRESETS), - help='Fixture family to generate.', - ) - parser.add_argument( - 'output', - nargs='?', - help='Output .gds path. Defaults to build/gds_perf/<preset>.gds', - ) - parser.add_argument( - '--scale', - type=float, - default=1.0, - help='Scale the preset counts down or up while keeping the same shape mix. Default: 1.0', - ) - parser.add_argument( - '--no-manifest', - action='store_true', - help='Do not write the sidecar JSON manifest.', - ) - return parser - - -def main(argv: list[str] | None = None) -> int: - parser = build_arg_parser() - args = parser.parse_args(argv) - output = Path(args.output) if args.output is not None else Path('build/gds_perf') / f'{args.preset}.gds' - manifest = write_fixture(output, preset=args.preset, scale=args.scale, write_manifest=not args.no_manifest) - print(json.dumps(asdict(manifest), indent=2, sort_keys=True)) - return 0 - - -if __name__ == '__main__': - raise SystemExit(main()) diff --git a/masque/file/oasis.py b/masque/file/oasis.py index b5d0cd8..672af25 100644 --- a/masque/file/oasis.py +++ b/masque/file/oasis.py @@ -120,10 +120,10 @@ def build( layer, data_type = _mlayer2oas(layer_num) lib.layers += [ fatrec.LayerName( - nstring = name, - layer_interval = (layer, layer), - type_interval = (data_type, data_type), - is_textlayer = tt, + nstring=name, + layer_interval=(layer, layer), + type_interval=(data_type, data_type), + is_textlayer=tt, ) for tt in (True, False)] @@ -182,8 +182,8 @@ def writefile( Args: library: A {name: Pattern} mapping of patterns to write. filename: Filename to save to. - *args: passed to `oasis.build()` - **kwargs: passed to `oasis.build()` + *args: passed to `oasis.write` + **kwargs: passed to `oasis.write` """ path = pathlib.Path(filename) @@ -213,9 +213,9 @@ def readfile( Will automatically decompress gzipped files. Args: - filename: Filename to load from. 
- *args: passed to `oasis.read()` - **kwargs: passed to `oasis.read()` + filename: Filename to load from. + *args: passed to `oasis.read` + **kwargs: passed to `oasis.read` """ path = pathlib.Path(filename) if is_gzipped(path): @@ -286,11 +286,11 @@ annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings) pat.polygon( - vertices = vertices, - layer = element.get_layer_tuple(), - offset = element.get_xy(), - annotations = annotations, - repetition = repetition, + vertices=vertices, + layer=element.get_layer_tuple(), + offset=element.get_xy(), + annotations=annotations, + repetition=repetition, ) elif isinstance(element, fatrec.Path): vertices = numpy.cumsum(numpy.vstack(((0, 0), element.get_point_list())), axis=0) @@ -310,13 +310,13 @@ annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings) pat.path( - vertices = vertices, - layer = element.get_layer_tuple(), - offset = element.get_xy(), - repetition = repetition, - annotations = annotations, - width = element.get_half_width() * 2, - cap = cap, + vertices=vertices, + layer=element.get_layer_tuple(), + offset=element.get_xy(), + repetition=repetition, + annotations=annotations, + width=element.get_half_width() * 2, + cap=cap, **path_args, ) @@ -325,11 +325,11 @@ height = element.get_height() annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings) pat.polygon( - layer = element.get_layer_tuple(), - offset = element.get_xy(), - repetition = repetition, - vertices = numpy.array(((0, 0), (1, 0), (1, 1), (0, 1))) * (width, height), - annotations = annotations, + layer=element.get_layer_tuple(), + offset=element.get_xy(), + repetition=repetition, + vertices=numpy.array(((0, 0), (1, 0), (1, 1), (0, 1))) * (width, height), + annotations=annotations, ) elif isinstance(element, fatrec.Trapezoid): @@ -440,11 +440,11 @@ else: string = str_or_ref.string pat.label( - layer = 
element.get_layer_tuple(), - offset = element.get_xy(), - repetition = repetition, - annotations = annotations, - string = string, + layer=element.get_layer_tuple(), + offset=element.get_xy(), + repetition=repetition, + annotations=annotations, + string=string, ) else: @@ -549,35 +549,33 @@ def _shapes_to_elements( offset = rint_cast(shape.offset + rep_offset) radius = rint_cast(shape.radius) circle = fatrec.Circle( - layer = layer, - datatype = datatype, - radius = cast('int', radius), - x = offset[0], - y = offset[1], - properties = properties, - repetition = repetition, + layer=layer, + datatype=datatype, + radius=cast('int', radius), + x=offset[0], + y=offset[1], + properties=properties, + repetition=repetition, ) elements.append(circle) elif isinstance(shape, Path): xy = rint_cast(shape.offset + shape.vertices[0] + rep_offset) deltas = rint_cast(numpy.diff(shape.vertices, axis=0)) half_width = rint_cast(shape.width / 2) - path_type = next((k for k, v in path_cap_map.items() if v == shape.cap), None) # reverse lookup - if path_type is None: - raise PatternError(f'OASIS writer does not support path cap {shape.cap}') + path_type = next(k for k, v in path_cap_map.items() if v == shape.cap) # reverse lookup extension_start = (path_type, shape.cap_extensions[0] if shape.cap_extensions is not None else None) extension_end = (path_type, shape.cap_extensions[1] if shape.cap_extensions is not None else None) path = fatrec.Path( - layer = layer, - datatype = datatype, - point_list = cast('Sequence[Sequence[int]]', deltas), - half_width = cast('int', half_width), - x = xy[0], - y = xy[1], - extension_start = extension_start, # TODO implement multiple cap types? - extension_end = extension_end, - properties = properties, - repetition = repetition, + layer=layer, + datatype=datatype, + point_list=cast('Sequence[Sequence[int]]', deltas), + half_width=cast('int', half_width), + x=xy[0], + y=xy[1], + extension_start=extension_start, # TODO implement multiple cap types? 
+ extension_end=extension_end, + properties=properties, + repetition=repetition, ) elements.append(path) else: @@ -585,13 +583,13 @@ def _shapes_to_elements( xy = rint_cast(polygon.offset + polygon.vertices[0] + rep_offset) points = rint_cast(numpy.diff(polygon.vertices, axis=0)) elements.append(fatrec.Polygon( - layer = layer, - datatype = datatype, - x = xy[0], - y = xy[1], - point_list = cast('list[list[int]]', points), - properties = properties, - repetition = repetition, + layer=layer, + datatype=datatype, + x=xy[0], + y=xy[1], + point_list=cast('list[list[int]]', points), + properties=properties, + repetition=repetition, )) return elements @@ -608,13 +606,13 @@ def _labels_to_texts( xy = rint_cast(label.offset + rep_offset) properties = annotations_to_properties(label.annotations) texts.append(fatrec.Text( - layer = layer, - datatype = datatype, - x = xy[0], - y = xy[1], - string = label.string, - properties = properties, - repetition = repetition, + layer=layer, + datatype=datatype, + x=xy[0], + y=xy[1], + string=label.string, + properties=properties, + repetition=repetition, )) return texts @@ -624,12 +622,10 @@ def repetition_fata2masq( ) -> Repetition | None: mrep: Repetition | None if isinstance(rep, fatamorgana.GridRepetition): - mrep = Grid( - a_vector = rep.a_vector, - b_vector = rep.b_vector, - a_count = rep.a_count, - b_count = rep.b_count, - ) + mrep = Grid(a_vector=rep.a_vector, + b_vector=rep.b_vector, + a_count=rep.a_count, + b_count=rep.b_count) elif isinstance(rep, fatamorgana.ArbitraryRepetition): displacements = numpy.cumsum(numpy.column_stack(( rep.x_displacements, @@ -651,19 +647,14 @@ def repetition_masq2fata( frep: fatamorgana.GridRepetition | fatamorgana.ArbitraryRepetition | None if isinstance(rep, Grid): a_vector = rint_cast(rep.a_vector) - a_count = int(rep.a_count) - if rep.b_count > 1: - b_vector = rint_cast(rep.b_vector) - b_count = int(rep.b_count) - else: - b_vector = None - b_count = None - + b_vector = rint_cast(rep.b_vector) 
if rep.b_vector is not None else None + a_count = rint_cast(rep.a_count) + b_count = rint_cast(rep.b_count) if rep.b_count is not None else None frep = fatamorgana.GridRepetition( - a_vector = a_vector, - b_vector = b_vector, - a_count = a_count, - b_count = b_count, + a_vector=cast('list[int]', a_vector), + b_vector=cast('list[int] | None', b_vector), + a_count=cast('int', a_count), + b_count=cast('int | None', b_count), ) offset = (0, 0) elif isinstance(rep, Arbitrary): @@ -716,9 +707,13 @@ def properties_to_annotations( string = repr(value) logger.warning(f'Converting property value for key ({key}) to string ({string})') values.append(string) - annotations.setdefault(key, []).extend(values) + annotations[key] = values return annotations + properties = [fatrec.Property(key, vals, is_standard=False) + for key, vals in annotations.items()] + return properties + def check_valid_names( names: Iterable[str], diff --git a/masque/file/svg.py b/masque/file/svg.py index 772aa39..859c074 100644 --- a/masque/file/svg.py +++ b/masque/file/svg.py @@ -10,59 +10,25 @@ import svgwrite # type: ignore from .utils import mangle_name from .. 
 import Pattern -from ..utils import rotation_matrix_2d logger = logging.getLogger(__name__) -def _ref_to_svg_transform(ref) -> str: - linear = rotation_matrix_2d(ref.rotation) * ref.scale - if ref.mirrored: - linear = linear @ numpy.diag((1.0, -1.0)) - - a = linear[0, 0] - b = linear[1, 0] - c = linear[0, 1] - d = linear[1, 1] - e = ref.offset[0] - f = ref.offset[1] - return f'matrix({a:g} {b:g} {c:g} {d:g} {e:g} {f:g})' - - -def _make_svg_ids(names: Mapping[str, Pattern]) -> dict[str, str]: - svg_ids: dict[str, str] = {} - seen_ids: set[str] = set() - for name in names: - base_id = mangle_name(name) - svg_id = base_id - suffix = 1 - while svg_id in seen_ids: - suffix += 1 - svg_id = f'{base_id}_{suffix}' - seen_ids.add(svg_id) - svg_ids[name] = svg_id - return svg_ids - - -def _detached_library(library: Mapping[str, Pattern]) -> dict[str, Pattern]: - return {name: pat.deepcopy() for name, pat in library.items()} - - def writefile( library: Mapping[str, Pattern], top: str, filename: str, custom_attributes: bool = False, - annotate_ports: bool = False, ) -> None: """ - Write a Pattern to an SVG file, by first calling .polygonize() on a detached - materialized copy + Write a Pattern to an SVG file, by first calling .polygonize() on it to change the shapes into polygons, and then writing patterns as SVG groups (<g>, inside <defs>), polygons as paths (<path>), and refs as <use> elements. + Note that this function modifies the Pattern. + If `custom_attributes` is `True`, a non-standard `pattern_layer` attribute is written to the relevant elements. @@ -74,21 +40,17 @@ prior to calling this function. Args: - library: Mapping of pattern names to patterns. - top: Name of the top-level pattern to render. + pattern: Pattern to write to file. Modified by this function. filename: Filename to write to. custom_attributes: Whether to write non-standard `pattern_layer` attribute to the SVG elements. 
- annotate_ports: If True, draw an arrow for each port (similar to - `Pattern.visualize(..., ports=True)`). """ - detached = _detached_library(library) - pattern = detached[top] + pattern = library[top] # Polygonize pattern pattern.polygonize() - bounds = pattern.get_bounds(library=detached) + bounds = pattern.get_bounds(library=library) if bounds is None: bounds_min, bounds_max = numpy.array([[-1, -1], [1, 1]]) logger.warning('Pattern had no bounds (empty?); setting arbitrary viewbox', stacklevel=1) @@ -101,11 +63,10 @@ def writefile( # Create file svg = svgwrite.Drawing(filename, profile='full', viewBox=viewbox_string, debug=(not custom_attributes)) - svg_ids = _make_svg_ids(detached) # Now create a group for each pattern and add in any Boundary and Use elements - for name, pat in detached.items(): - svg_group = svg.g(id=svg_ids[name], fill='blue', stroke='red') + for name, pat in library.items(): + svg_group = svg.g(id=mangle_name(name), fill='blue', stroke='red') for layer, shapes in pat.shapes.items(): for shape in shapes: @@ -118,37 +79,16 @@ def writefile( svg_group.add(path) - if annotate_ports: - # Draw arrows for the ports, pointing into the device (per port definition) - for port_name, port in pat.ports.items(): - if port.rotation is not None: - p1 = port.offset - angle = port.rotation - size = 1.0 # arrow size - p2 = p1 + size * numpy.array([numpy.cos(angle), numpy.sin(angle)]) - - # head - head_angle = 0.5 - h1 = p1 + 0.7 * size * numpy.array([numpy.cos(angle + head_angle), numpy.sin(angle + head_angle)]) - h2 = p1 + 0.7 * size * numpy.array([numpy.cos(angle - head_angle), numpy.sin(angle - head_angle)]) - - line = svg.line(start=p1, end=p2, stroke='green', stroke_width=0.2) - head = svg.polyline(points=[h1, p1, h2], fill='none', stroke='green', stroke_width=0.2) - - svg_group.add(line) - svg_group.add(head) - svg_group.add(svg.text(port_name, insert=p2, font_size=0.5, fill='green')) - for target, refs in pat.refs.items(): if target is None: continue 
for ref in refs: - transform = _ref_to_svg_transform(ref) - use = svg.use(href='#' + svg_ids[target], transform=transform) + transform = f'scale({ref.scale:g}) rotate({ref.rotation:g}) translate({ref.offset[0]:g},{ref.offset[1]:g})' + use = svg.use(href='#' + mangle_name(target), transform=transform) svg_group.add(use) svg.defs.add(svg_group) - svg.add(svg.use(href='#' + svg_ids[top])) + svg.add(svg.use(href='#' + mangle_name(top))) svg.save() @@ -163,21 +103,21 @@ box and drawing the polygons with reverse vertex order inside it, all within one `<path>` element. + Note that this function modifies the Pattern. + If you want pattern polygonized with non-default arguments, just call `pattern.polygonize()` prior to calling this function. Args: - library: Mapping of pattern names to patterns. - top: Name of the top-level pattern to render. + pattern: Pattern to write to file. Modified by this function. filename: Filename to write to. """ - detached = _detached_library(library) - pattern = detached[top] + pattern = library[top] # Polygonize and flatten pattern - pattern.polygonize().flatten(detached) + pattern.polygonize().flatten(library) - bounds = pattern.get_bounds(library=detached) + bounds = pattern.get_bounds(library=library) if bounds is None: bounds_min, bounds_max = numpy.array([[-1, -1], [1, 1]]) logger.warning('Pattern had no bounds (empty?); setting arbitrary viewbox', stacklevel=1) diff --git a/masque/file/utils.py b/masque/file/utils.py index 58c7573..33f68d4 100644 --- a/masque/file/utils.py +++ b/masque/file/utils.py @@ -33,12 +33,6 @@ def preflight( Run a standard set of useful operations and checks, usually done immediately prior to writing to a file (or immediately after reading). - Note that this helper is not copy-isolating. 
When `sort=True`, it constructs a new - `Library` wrapper around the same `Pattern` objects after sorting them in place, so - later mutating preflight steps such as `prune_empty_patterns` and - `wrap_repeated_shapes` may still mutate caller-owned patterns. Callers that need - isolation should deep-copy the library before calling `preflight()`. - Args: sort: Whether to sort the patterns based on their names, and optionaly sort the pattern contents. Default True. Useful for reproducible builds. @@ -81,8 +75,7 @@ def preflight( raise PatternError('Non-numeric layers found:' + pformat(named_layers)) if prune_empty_patterns: - prune_dangling = 'error' if allow_dangling_refs is False else 'ignore' - pruned = lib.prune_empty(dangling=prune_dangling) + pruned = lib.prune_empty() if pruned: logger.info(f'Preflight pruned {len(pruned)} empty patterns') logger.debug('Pruned: ' + pformat(pruned)) @@ -151,11 +144,7 @@ def tmpfile(path: str | pathlib.Path) -> Iterator[IO[bytes]]: path = pathlib.Path(path) suffixes = ''.join(path.suffixes) with tempfile.NamedTemporaryFile(suffix=suffixes, delete=False) as tmp_stream: - try: - yield tmp_stream - except Exception: - pathlib.Path(tmp_stream.name).unlink(missing_ok=True) - raise + yield tmp_stream try: shutil.move(tmp_stream.name, path) diff --git a/masque/label.py b/masque/label.py index d220fee..711ef35 100644 --- a/masque/label.py +++ b/masque/label.py @@ -7,12 +7,12 @@ from numpy.typing import ArrayLike, NDArray from .repetition import Repetition from .utils import rotation_matrix_2d, annotations_t, annotations_eq, annotations_lt, rep2key -from .traits import PositionableImpl, Copyable, Pivotable, RepeatableImpl, Bounded, Flippable +from .traits import PositionableImpl, Copyable, Pivotable, RepeatableImpl, Bounded from .traits import AnnotatableImpl @functools.total_ordering -class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotable, Copyable, Flippable): +class Label(PositionableImpl, RepeatableImpl, 
AnnotatableImpl, Bounded, Pivotable, Copyable): """ A text annotation with a position (but no size; it is not drawn) """ @@ -53,36 +53,17 @@ class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotabl self.repetition = repetition self.annotations = annotations if annotations is not None else {} - @classmethod - def _from_raw( - cls, - string: str, - *, - offset: NDArray[numpy.float64], - repetition: Repetition | None = None, - annotations: annotations_t | None = None, - ) -> Self: - new = cls.__new__(cls) - new._string = string - new._offset = offset - new._repetition = repetition - new._annotations = annotations - return new - def __copy__(self) -> Self: return type(self)( string=self.string, offset=self.offset.copy(), repetition=self.repetition, - annotations=copy.copy(self.annotations), ) def __deepcopy__(self, memo: dict | None = None) -> Self: memo = {} if memo is None else memo new = copy.copy(self) new._offset = self._offset.copy() - new._repetition = copy.deepcopy(self._repetition, memo) - new._annotations = copy.deepcopy(self._annotations, memo) return new def __lt__(self, other: 'Label') -> bool: @@ -95,8 +76,6 @@ class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotabl return annotations_lt(self.annotations, other.annotations) def __eq__(self, other: Any) -> bool: - if type(self) is not type(other): - return False return ( self.string == other.string and numpy.array_equal(self.offset, other.offset) @@ -117,34 +96,10 @@ class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotabl """ pivot = numpy.asarray(pivot, dtype=float) self.translate(-pivot) - if self.repetition is not None: - self.repetition.rotate(rotation) self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset) self.translate(+pivot) return self - def flip_across(self, axis: int | None = None, *, x: float | None = None, y: float | None = None) -> Self: - """ - Extrinsic transformation: Flip the label across a line in the 
pattern's - coordinate system. This affects both the label's offset and its - repetition grid. - - Args: - axis: Axis to mirror across. 0: x-axis (flip y), 1: y-axis (flip x). - x: Vertical line x=val to mirror across. - y: Horizontal line y=val to mirror across. - - Returns: - self - """ - axis, pivot = self._check_flip_args(axis=axis, x=x, y=y) - self.translate(-pivot) - if self.repetition is not None: - self.repetition.mirror(axis) - self.offset[1 - axis] *= -1 - self.translate(+pivot) - return self - def get_bounds_single(self) -> NDArray[numpy.float64]: """ Return the bounds of the label. diff --git a/masque/library.py b/masque/library.py index e98d98d..9e7c133 100644 --- a/masque/library.py +++ b/masque/library.py @@ -22,7 +22,7 @@ import copy from pprint import pformat from collections import defaultdict from abc import ABCMeta, abstractmethod -from graphlib import TopologicalSorter, CycleError +from graphlib import TopologicalSorter import numpy from numpy.typing import ArrayLike, NDArray @@ -59,9 +59,6 @@ TreeView: TypeAlias = Mapping[str, 'Pattern'] Tree: TypeAlias = MutableMapping[str, 'Pattern'] """ A mutable name-to-`Pattern` mapping which is expected to have only one top-level cell """ -dangling_mode_t: TypeAlias = Literal['error', 'ignore', 'include'] -""" How helpers should handle refs whose targets are not present in the library. """ - SINGLE_USE_PREFIX = '_' """ @@ -144,6 +141,7 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): Args: tops: Name(s) of the pattern(s) to check. Default is all patterns in the library. + skip: Memo, set patterns which have already been traversed. 
Returns: Set of all referenced pattern names @@ -180,8 +178,6 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): if isinstance(tops, str): tops = (tops,) - tops = set(tops) - skip |= tops # don't re-visit tops # Get referenced patterns for all tops targets = set() @@ -191,9 +187,9 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): # Perform recursive lookups, but only once for each name for target in targets - skip: assert target is not None - skip.add(target) if target in self: targets |= self.referenced_patterns(target, skip=skip) + skip.add(target) return targets @@ -278,7 +274,7 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): For an in-place variant, see `Pattern.flatten`. Args: - tops: The pattern(s) to flatten. + tops: The pattern(s) to flatten. flatten_ports: If `True`, keep ports from any referenced patterns; otherwise discard them. dangling_ok: If `True`, no error will be thrown if any @@ -296,9 +292,8 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): def flatten_single(name: str) -> None: flattened[name] = None pat = self[name].deepcopy() - refs_by_target = tuple((target, tuple(refs)) for target, refs in pat.refs.items()) - for target, refs in refs_by_target: + for target in pat.refs: if target is None: continue if dangling_ok and target not in self: @@ -309,16 +304,10 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): target_pat = flattened[target] if target_pat is None: raise PatternError(f'Circular reference in {name} to {target}') - ports_only = flatten_ports and bool(target_pat.ports) - if target_pat.is_empty() and not ports_only: # avoid some extra allocations + if target_pat.is_empty(): # avoid some extra allocations continue - for ref in refs: - if flatten_ports and ref.repetition is not None and target_pat.ports: - raise PatternError( - f'Cannot flatten ports from repeated ref to {target!r}; ' - 'flatten with flatten_ports=False or expand/rename the ports
manually first.' - ) + for ref in pat.refs[target]: p = ref.as_pattern(pattern=target_pat) if not flatten_ports: p.ports.clear() @@ -424,21 +413,6 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): """ return self[self.top()] - @staticmethod - def _dangling_refs_error(dangling: set[str], context: str) -> LibraryError: - dangling_list = sorted(dangling) - return LibraryError(f'Dangling refs found while {context}: ' + pformat(dangling_list)) - - def _raw_child_graph(self) -> tuple[dict[str, set[str]], set[str]]: - existing = set(self.keys()) - graph: dict[str, set[str]] = {} - dangling: set[str] = set() - for name, pat in self.items(): - children = {child for child, refs in pat.refs.items() if child is not None and refs} - graph[name] = children - dangling |= children - existing - return graph, dangling - def dfs( self, pattern: 'Pattern', @@ -493,11 +467,9 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): memo = {} if transform is None or transform is True: - transform = numpy.array([0, 0, 0, 0, 1], dtype=float) + transform = numpy.zeros(4) elif transform is not False: transform = numpy.asarray(transform, dtype=float) - if transform.size == 4: - transform = numpy.append(transform, 1.0) original_pattern = pattern @@ -540,99 +512,50 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): raise LibraryError('visit_* functions returned a new `Pattern` object' ' but no top-level name was provided in `hierarchy`') - del cast('ILibrary', self)[name] cast('ILibrary', self)[name] = pattern return self - def child_graph( - self, - dangling: dangling_mode_t = 'error', - ) -> dict[str, set[str]]: + def child_graph(self) -> dict[str, set[str | None]]: """ Return a mapping from pattern name to a set of all child patterns (patterns it references). - Only non-empty ref lists with non-`None` targets are treated as graph edges. - - Args: - dangling: How refs to missing targets are handled. 
`'error'` raises, - `'ignore'` drops those edges, and `'include'` exposes them as - synthetic leaf nodes. - Returns: Mapping from pattern name to a set of all pattern names it references. """ - graph, dangling_refs = self._raw_child_graph() - if dangling == 'error': - if dangling_refs: - raise self._dangling_refs_error(dangling_refs, 'building child graph') - return graph - if dangling == 'ignore': - existing = set(graph) - return {name: {child for child in children if child in existing} for name, children in graph.items()} - - for target in dangling_refs: - graph.setdefault(target, set()) + graph = {name: set(pat.refs.keys()) for name, pat in self.items()} return graph - def parent_graph( - self, - dangling: dangling_mode_t = 'error', - ) -> dict[str, set[str]]: + def parent_graph(self) -> dict[str, set[str]]: """ Return a mapping from pattern name to a set of all parent patterns (patterns which reference it). - Args: - dangling: How refs to missing targets are handled. `'error'` raises, - `'ignore'` drops those targets, and `'include'` adds them as - synthetic keys whose values are their existing parents. - Returns: Mapping from pattern name to a set of all patterns which reference it. 
""" - child_graph, dangling_refs = self._raw_child_graph() - if dangling == 'error' and dangling_refs: - raise self._dangling_refs_error(dangling_refs, 'building parent graph') - - existing = set(child_graph) - igraph: dict[str, set[str]] = {name: set() for name in existing} - for parent, children in child_graph.items(): - for child in children: - if child in existing: - igraph[child].add(parent) - elif dangling == 'include': - igraph.setdefault(child, set()).add(parent) + igraph: dict[str, set[str]] = {name: set() for name in self} + for name, pat in self.items(): + for child, reflist in pat.refs.items(): + if reflist and child is not None: + igraph[child].add(name) return igraph - def child_order( - self, - dangling: dangling_mode_t = 'error', - ) -> list[str]: + def child_order(self) -> list[str]: """ - Return a topologically sorted list of graph node names. + Return a topologically sorted list of all contained pattern names. Child (referenced) patterns will appear before their parents. - Args: - dangling: Passed to `child_graph()`. - Return: Topologically sorted list of pattern names. """ - try: - return cast('list[str]', list(TopologicalSorter(self.child_graph(dangling=dangling)).static_order())) - except CycleError as exc: - cycle = exc.args[1] if len(exc.args) > 1 else None - if cycle is None: - raise LibraryError('Cycle found while building child order') from exc - raise LibraryError(f'Cycle found while building child order: {cycle}') from exc + return cast('list[str]', list(TopologicalSorter(self.child_graph()).static_order())) def find_refs_local( self, name: str, parent_graph: dict[str, set[str]] | None = None, - dangling: dangling_mode_t = 'error', ) -> dict[str, list[NDArray[numpy.float64]]]: """ Find the location and orientation of all refs pointing to `name`. @@ -645,8 +568,6 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): The provided graph may be for a superset of `self` (i.e. 
it may contain additional patterns which are not present in self; they will be ignored). - dangling: How refs to missing targets are handled if `parent_graph` - is not provided. `'include'` also allows querying missing names. Returns: Mapping of {parent_name: transform_list}, where transform_list @@ -655,18 +576,8 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): """ instances = defaultdict(list) if parent_graph is None: - graph_mode = 'ignore' if dangling == 'ignore' else 'include' - parent_graph = self.parent_graph(dangling=graph_mode) - - if name not in self: - if name not in parent_graph: - return instances - if dangling == 'error': - raise self._dangling_refs_error({name}, f'finding local refs for {name!r}') - if dangling == 'ignore': - return instances - - for parent in parent_graph.get(name, set()): + parent_graph = self.parent_graph() + for parent in parent_graph[name]: if parent not in self: # parent_graph may be for a superset of self continue for ref in self[parent].refs[name]: @@ -679,7 +590,6 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): name: str, order: list[str] | None = None, parent_graph: dict[str, set[str]] | None = None, - dangling: dangling_mode_t = 'error', ) -> dict[tuple[str, ...], NDArray[numpy.float64]]: """ Find the absolute (top-level) location and orientation of all refs (including @@ -696,28 +606,18 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): The provided graph may be for a superset of `self` (i.e. it may contain additional patterns which are not present in self; they will be ignored). - dangling: How refs to missing targets are handled if `order` or - `parent_graph` are not provided. `'include'` also allows - querying missing names.
Returns: Mapping of `{hierarchy: transform_list}`, where `hierarchy` is a tuple of the form `(toplevel_pattern, lvl1_pattern, ..., name)` and `transform_list` is an Nx4 ndarray with rows `(x_offset, y_offset, rotation_ccw_rad, mirror_across_x)`. """ - graph_mode = 'ignore' if dangling == 'ignore' else 'include' - if order is None: - order = self.child_order(dangling=graph_mode) - if parent_graph is None: - parent_graph = self.parent_graph(dangling=graph_mode) - if name not in self: - if name not in parent_graph: - return {} - if dangling == 'error': - raise self._dangling_refs_error({name}, f'finding global refs for {name!r}') - if dangling == 'ignore': - return {} + if name not in self: + return {}
+ if order is None: + order = self.child_order() + if parent_graph is None: + parent_graph = self.parent_graph() self_keys = set(self.keys()) @@ -726,16 +626,16 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta): NDArray[numpy.float64] ]]] transforms = defaultdict(list) - for parent, vals in self.find_refs_local(name, parent_graph=parent_graph, dangling=dangling).items(): + for parent, vals in self.find_refs_local(name, parent_graph=parent_graph).items(): transforms[parent] = [((name,), numpy.concatenate(vals))] for next_name in order: if next_name not in transforms: continue - if not parent_graph.get(next_name, set()) & self_keys: + if not parent_graph[next_name] & self_keys: continue - outers = self.find_refs_local(next_name, parent_graph=parent_graph, dangling=dangling) + outers = self.find_refs_local(next_name, parent_graph=parent_graph) inners = transforms.pop(next_name) for parent, outer in outers.items(): for path, inner in inners: @@ -783,33 +683,6 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): def _merge(self, key_self: str, other: Mapping[str, 'Pattern'], key_other: str) -> None: pass - def resolve( - self, - other: 'Abstract | str | Pattern | TreeView', - append: bool = False, - ) -> 'Abstract | Pattern': - """ - Resolve another
device (name, Abstract, Pattern, or TreeView) into an Abstract or Pattern. - If it is a TreeView, it is first added into this library. - - Args: - other: The device to resolve. - append: If True and `other` is an `Abstract`, returns the full `Pattern` from the library. - - Returns: - An `Abstract` or `Pattern` object. - """ - from .pattern import Pattern #noqa: PLC0415 - if not isinstance(other, (str, Abstract, Pattern)): - # We got a TreeView; add it into self and grab its topcell as an Abstract - other = self << other - - if isinstance(other, str): - other = self.abstract(other) - if append and isinstance(other, Abstract): - other = self[other.name] - return other - def rename( self, old_name: str, @@ -828,11 +701,6 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): Returns: self """ - if old_name not in self: - raise LibraryError(f'"{old_name}" does not exist in the library.') - if old_name == new_name: - return self - self[new_name] = self[old_name] del self[old_name] if move_references: @@ -857,9 +725,6 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): Returns: self """ - if old_target == new_target: - return self - for pattern in self.values(): if old_target in pattern.refs: pattern.refs[new_target].extend(pattern.refs[old_target]) @@ -899,7 +764,7 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): Returns: (name, pattern) tuple """ - from .pattern import Pattern #noqa: PLC0415 + from .pattern import Pattern pat = Pattern() self[name] = pat return name, pat @@ -933,23 +798,18 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): (default). Returns: - A mapping of `{old_name: new_name}` for all names in `other` which were - renamed while being added. Unchanged names are omitted. + A mapping of `{old_name: new_name}` for all `old_name`s in `other`. Unchanged + names map to themselves. 
Raises: `LibraryError` if a duplicate name is encountered even after applying `rename_theirs()`. """ - from .pattern import map_targets #noqa: PLC0415 + from .pattern import map_targets duplicates = set(self.keys()) & set(other.keys()) if not duplicates: - if mutate_other: - temp = other - else: - temp = Library(copy.deepcopy(dict(other))) - - for key in temp: - self._merge(key, temp, key) + for key in other: + self._merge(key, other, key) return {} if mutate_other: @@ -1050,7 +910,7 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): # This currently simplifies globally (same shape in different patterns is # merged into the same ref target). - from .pattern import Pattern #noqa: PLC0415 + from .pattern import Pattern if exclude_types is None: exclude_types = () @@ -1059,18 +919,6 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): def label2name(label: tuple) -> str: # noqa: ARG001 return self.get_name(SINGLE_USE_PREFIX + 'shape') - used_names = set(self.keys()) - - def reserve_target_name(label: tuple) -> str: - base_name = label2name(label) - name = base_name - ii = sum(1 for nn in used_names if nn.startswith(base_name)) if base_name in used_names else 0 - while name in used_names or name == '': - name = base_name + b64suffix(ii) - ii += 1 - used_names.add(name) - return name - shape_counts: MutableMapping[tuple, int] = defaultdict(int) shape_funcs = {} @@ -1087,7 +935,6 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): shape_counts[label] += 1 shape_pats = {} - target_names = {} for label, count in shape_counts.items(): if count < threshold: continue @@ -1096,7 +943,6 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): shape_pat = Pattern() shape_pat.shapes[label[-1]] += [shape_func()] shape_pats[label] = shape_pat - target_names[label] = reserve_target_name(label) # ## Second pass ## for pat in tuple(self.values()): @@ -1121,14 
+967,14 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): # For repeated shapes, create a `Pattern` holding a normalized shape object, # and add `pat.refs` entries for each occurrence in pat. Also, note down that # we should delete the `pat.shapes` entries for which we made `Ref`s. + shapes_to_remove = [] for label, shape_entries in shape_table.items(): layer = label[-1] - target = target_names[label] - shapes_to_remove = [] + target = label2name(label) for ii, values in shape_entries: offset, scale, rotation, mirror_x = values pat.ref(target=target, offset=offset, scale=scale, - rotation=rotation, mirrored=mirror_x) + rotation=rotation, mirrored=(mirror_x, False)) shapes_to_remove.append(ii) # Remove any shapes for which we have created refs. @@ -1136,7 +982,7 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): del pat.shapes[layer][ii] for ll, pp in shape_pats.items(): - self[target_names[ll]] = pp + self[label2name(ll)] = pp return self @@ -1157,7 +1003,7 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): Returns: self """ - from .pattern import Pattern #noqa: PLC0415 + from .pattern import Pattern if name_func is None: def name_func(_pat: Pattern, _shape: Shape | Label) -> str: @@ -1191,25 +1037,6 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): return self - def resolve_repeated_refs(self, name: str | None = None) -> Self: - """ - Expand all repeated references into multiple individual references. - Alters the library in-place. - - Args: - name: If specified, only resolve repeated refs in this pattern. - Otherwise, resolve in all patterns. 
- - Returns: - self - """ - if name is not None: - self[name].resolve_repeated_refs() - else: - for pat in self.values(): - pat.resolve_repeated_refs() - return self - def subtree( self, tops: str | Sequence[str], @@ -1239,19 +1066,17 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta): def prune_empty( self, repeat: bool = True, - dangling: dangling_mode_t = 'error', ) -> set[str]: """ Delete any empty patterns (i.e. where `Pattern.is_empty` returns `True`). Args: repeat: Also recursively delete any patterns which only contain(ed) empty patterns. - dangling: Passed to `parent_graph()`. Returns: A set containing the names of all deleted patterns """ - parent_graph = self.parent_graph(dangling=dangling) + parent_graph = self.parent_graph() empty = {name for name, pat in self.items() if pat.is_empty()} trimmed = set() while empty: @@ -1381,7 +1206,7 @@ class Library(ILibrary): Returns: The newly created `Library` and the newly created `Pattern` """ - from .pattern import Pattern #noqa: PLC0415 + from .pattern import Pattern tree = cls() pat = Pattern() tree[name] = pat @@ -1397,12 +1222,12 @@ class LazyLibrary(ILibrary): """ mapping: dict[str, Callable[[], 'Pattern']] cache: dict[str, 'Pattern'] - _lookups_in_progress: list[str] + _lookups_in_progress: set[str] def __init__(self) -> None: self.mapping = {} self.cache = {} - self._lookups_in_progress = [] + self._lookups_in_progress = set() def __setitem__( self, @@ -1433,20 +1258,16 @@ class LazyLibrary(ILibrary): return self.cache[key] if key in self._lookups_in_progress: - chain = ' -> '.join(self._lookups_in_progress + [key]) raise LibraryError( - f'Detected circular reference or recursive lookup of "{key}".\n' - f'Lookup chain: {chain}\n' + f'Detected multiple simultaneous lookups of "{key}".\n' 'This may be caused by an invalid (cyclical) reference, or buggy code.\n' - 'If you are lazy-loading a file, try a non-lazy load and check for reference cycles.' 
+ 'If you are lazy-loading a file, try a non-lazy load and check for reference cycles.' # TODO give advice on finding cycles ) - self._lookups_in_progress.append(key) - try: - func = self.mapping[key] - pat = func() - finally: - self._lookups_in_progress.pop() + self._lookups_in_progress.add(key) + func = self.mapping[key] + pat = func() + self._lookups_in_progress.remove(key) self.cache[key] = pat return pat @@ -1489,11 +1310,6 @@ class LazyLibrary(ILibrary): Returns: self """ - if old_name not in self.mapping: - raise LibraryError(f'"{old_name}" does not exist in the library.') - if old_name == new_name: - return self - self[new_name] = self.mapping[old_name] # copy over function if old_name in self.cache: self.cache[new_name] = self.cache[old_name] @@ -1515,9 +1331,6 @@ class LazyLibrary(ILibrary): Returns: self """ - if old_target == new_target: - return self - self.precache() for pattern in self.cache.values(): if old_target in pattern.refs: diff --git a/masque/pattern.py b/masque/pattern.py index e882795..7e0a79e 100644 --- a/masque/pattern.py +++ b/masque/pattern.py @@ -26,7 +26,6 @@ from .traits import AnnotatableImpl, Scalable, Mirrorable, Rotatable, Positionab from .ports import Port, PortList - logger = logging.getLogger(__name__) @@ -38,8 +37,8 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): or provide equivalent functions. `Pattern` also stores a dict of `Port`s, which can be used to "snap" together points. - See `Pattern.plug()` and `Pattern.place()`, as well as `builder.Pather` - and `ports.PortsList`. + See `Pattern.plug()` and `Pattern.place()`, as well as the helper classes + `builder.Builder`, `builder.Pather`, `builder.RenderPather`, and `ports.PortsList`. For convenience, ports can be read out using square brackets: - `pattern['A'] == Port((0, 0), 0)` @@ -172,8 +171,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): return s def __copy__(self) -> 'Pattern': - logger.warning('Making a shallow copy of a Pattern... 
old shapes/refs/labels are re-referenced! ' - 'Consider using .deepcopy() if this was not intended.') + logger.warning('Making a shallow copy of a Pattern... old shapes are re-referenced!') new = Pattern( annotations=copy.deepcopy(self.annotations), ports=copy.deepcopy(self.ports), @@ -200,7 +198,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): def __lt__(self, other: 'Pattern') -> bool: self_nonempty_targets = [target for target, reflist in self.refs.items() if reflist] - other_nonempty_targets = [target for target, reflist in other.refs.items() if reflist] + other_nonempty_targets = [target for target, reflist in other.refs.items() if reflist] self_tgtkeys = tuple(sorted((target is None, target) for target in self_nonempty_targets)) other_tgtkeys = tuple(sorted((target is None, target) for target in other_nonempty_targets)) @@ -214,7 +212,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): return refs_ours < refs_theirs self_nonempty_layers = [ll for ll, elems in self.shapes.items() if elems] - other_nonempty_layers = [ll for ll, elems in other.shapes.items() if elems] + other_nonempty_layers = [ll for ll, elems in other.shapes.items() if elems] self_layerkeys = tuple(sorted(layer2key(ll) for ll in self_nonempty_layers)) other_layerkeys = tuple(sorted(layer2key(ll) for ll in other_nonempty_layers)) @@ -223,21 +221,21 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): for _, _, layer in self_layerkeys: shapes_ours = tuple(sorted(self.shapes[layer])) - shapes_theirs = tuple(sorted(other.shapes[layer])) + shapes_theirs = tuple(sorted(other.shapes[layer])) if shapes_ours != shapes_theirs: return shapes_ours < shapes_theirs self_nonempty_txtlayers = [ll for ll, elems in self.labels.items() if elems] - other_nonempty_txtlayers = [ll for ll, elems in other.labels.items() if elems] + other_nonempty_txtlayers = [ll for ll, elems in other.labels.items() if elems] self_txtlayerkeys = tuple(sorted(layer2key(ll) for ll in self_nonempty_txtlayers))
other_txtlayerkeys = tuple(sorted(layer2key(ll) for ll in other_nonempty_txtlayers)) if self_txtlayerkeys != other_txtlayerkeys: return self_txtlayerkeys < other_txtlayerkeys - for _, _, layer in self_txtlayerkeys: + for _, _, layer in self_txtlayerkeys: labels_ours = tuple(sorted(self.labels[layer])) - labels_theirs = tuple(sorted(other.labels[layer])) + labels_theirs = tuple(sorted(other.labels[layer])) if labels_ours != labels_theirs: return labels_ours < labels_theirs @@ -254,7 +252,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): return False self_nonempty_targets = [target for target, reflist in self.refs.items() if reflist] - other_nonempty_targets = [target for target, reflist in other.refs.items() if reflist] + other_nonempty_targets = [target for target, reflist in other.refs.items() if reflist] self_tgtkeys = tuple(sorted((target is None, target) for target in self_nonempty_targets)) other_tgtkeys = tuple(sorted((target is None, target) for target in other_nonempty_targets)) @@ -268,7 +266,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): return False self_nonempty_layers = [ll for ll, elems in self.shapes.items() if elems] - other_nonempty_layers = [ll for ll, elems in other.shapes.items() if elems] + other_nonempty_layers = [ll for ll, elems in other.shapes.items() if elems] self_layerkeys = tuple(sorted(layer2key(ll) for ll in self_nonempty_layers)) other_layerkeys = tuple(sorted(layer2key(ll) for ll in other_nonempty_layers)) @@ -277,21 +275,21 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): for _, _, layer in self_layerkeys: shapes_ours = tuple(sorted(self.shapes[layer])) - shapes_theirs = tuple(sorted(other.shapes[layer])) + shapes_theirs = tuple(sorted(other.shapes[layer])) if shapes_ours != shapes_theirs: return False self_nonempty_txtlayers = [ll for ll, elems in self.labels.items() if elems] - other_nonempty_txtlayers = [ll for ll, elems in other.labels.items() if elems] + other_nonempty_txtlayers = [ll for ll, elems in
other.labels.items() if elems] self_txtlayerkeys = tuple(sorted(layer2key(ll) for ll in self_nonempty_txtlayers)) other_txtlayerkeys = tuple(sorted(layer2key(ll) for ll in other_nonempty_txtlayers)) if self_txtlayerkeys != other_txtlayerkeys: return False - for _, _, layer in self_txtlayerkeys: + for _, _, layer in self_txtlayerkeys: labels_ours = tuple(sorted(self.labels[layer])) - labels_theirs = tuple(sorted(other.labels[layer])) + labels_theirs = tuple(sorted(other.labels[layer])) if labels_ours != labels_theirs: return False @@ -349,16 +347,6 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): Returns: self """ - annotation_conflicts: set[str] = set() - if other_pattern.annotations is not None and self.annotations is not None: - annotation_conflicts = set(self.annotations.keys()) & set(other_pattern.annotations.keys()) - if annotation_conflicts: - raise PatternError(f'Annotation keys overlap: {annotation_conflicts}') - - port_conflicts = set(self.ports.keys()) & set(other_pattern.ports.keys()) - if port_conflicts: - raise PatternError(f'Port names overlap: {port_conflicts}') - for target, rseq in other_pattern.refs.items(): self.refs[target].extend(rseq) for layer, sseq in other_pattern.shapes.items(): @@ -369,7 +357,14 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): if other_pattern.annotations is not None: if self.annotations is None: self.annotations = {} + annotation_conflicts = set(self.annotations.keys()) & set(other_pattern.annotations.keys()) + if annotation_conflicts: + raise PatternError(f'Annotation keys overlap: {annotation_conflicts}') self.annotations.update(other_pattern.annotations) + + port_conflicts = set(self.ports.keys()) & set(other_pattern.ports.keys()) + if port_conflicts: + raise PatternError(f'Port names overlap: {port_conflicts}') self.ports.update(other_pattern.ports) return self @@ -504,61 +499,6 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): ] return polys - def layer_as_polygons( - self, - layer: layer_t, -
flatten: bool = True, - library: Mapping[str, 'Pattern'] | None = None, - ) -> list[Polygon]: - """ - Collect all geometry effectively on a given layer as a list of polygons. - - If `flatten=True`, it recursively gathers shapes on `layer` from all `self.refs`. - `Repetition` objects are expanded, and non-polygon shapes are converted - to `Polygon` approximations. - - Args: - layer: The layer to collect geometry from. - flatten: If `True`, include geometry from referenced patterns. - library: Required if `flatten=True` to resolve references. - - Returns: - A list of `Polygon` objects. - """ - if flatten and self.has_refs() and library is None: - raise PatternError("Must provide a library to layer_as_polygons() when flatten=True") - - polys: list[Polygon] = [] - - # Local shapes - for shape in self.shapes.get(layer, []): - for p in shape.to_polygons(): - # expand repetitions - if p.repetition is not None: - for offset in p.repetition.displacements: - polys.append(p.deepcopy().translate(offset).set_repetition(None)) - else: - polys.append(p.deepcopy()) - - if flatten and self.has_refs(): - assert library is not None - for target, refs in self.refs.items(): - if target is None: - continue - target_pat = library[target] - for ref in refs: - # Get polygons from target pattern on the same layer - ref_polys = target_pat.layer_as_polygons(layer, flatten=True, library=library) - # Apply ref transformations - for p in ref_polys: - p_pat = ref.as_pattern(Pattern(shapes={layer: [p]})) - # as_pattern expands repetition of the ref itself - # but we need to pull the polygons back out - for p_transformed in p_pat.shapes[layer]: - polys.append(cast('Polygon', p_transformed)) - - return polys - def referenced_patterns(self) -> set[str | None]: """ Get all pattern names referenced by this pattern. Non-recursive.
@@ -695,7 +635,6 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): """ for entry in chain(chain_elements(self.shapes, self.labels, self.refs), self.ports.values()): cast('Positionable', entry).translate(offset) - self._log_bulk_update(f"translate({offset!r})") return self def scale_elements(self, c: float) -> Self: @@ -749,9 +688,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self: """ - Extrinsic transformation: Rotate the Pattern around the a location in the - container's coordinate system. This affects all elements' offsets and - their repetition grids. + Rotate the Pattern around a location. Args: pivot: (x, y) location to rotate around @@ -765,14 +702,11 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): self.rotate_elements(rotation) self.rotate_element_centers(rotation) self.translate_elements(+pivot) - self._log_bulk_update(f"rotate_around({pivot}, {rotation})") return self def rotate_element_centers(self, rotation: float) -> Self: """ - Extrinsic transformation part: Rotate the offsets and repetition grids of all - shapes, labels, refs, and ports around (0, 0) in the container's - coordinate system. + Rotate the offsets of all shapes, labels, refs, and ports around (0, 0) Args: rotation: Angle to rotate by (counter-clockwise, radians) @@ -783,15 +717,11 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): for entry in chain(chain_elements(self.shapes, self.refs, self.labels), self.ports.values()): old_offset = cast('Positionable', entry).offset cast('Positionable', entry).offset = numpy.dot(rotation_matrix_2d(rotation), old_offset) - if isinstance(entry, Repeatable) and entry.repetition is not None: - entry.repetition.rotate(rotation) return self def rotate_elements(self, rotation: float) -> Self: """ - Intrinsic transformation part: Rotate each shape, ref, label, and port around its - origin (offset) in the container's coordinate system.
This does NOT - affect their repetition grids. + Rotate each shape, ref, and port around its origin (offset) Args: rotation: Angle to rotate by (counter-clockwise, radians) @@ -799,61 +729,54 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable): Returns: self """ - for entry in chain(chain_elements(self.shapes, self.refs, self.labels), self.ports.values()): - if isinstance(entry, Rotatable): - entry.rotate(rotation) + for entry in chain(chain_elements(self.shapes, self.refs), self.ports.values()): + cast('Rotatable', entry).rotate(rotation) return self - def mirror_element_centers(self, axis: int = 0) -> Self: + def mirror_element_centers(self, across_axis: int = 0) -> Self: """ - Extrinsic transformation part: Mirror the offsets and repetition grids of all - shapes, labels, refs, and ports relative to the container's origin. + Mirror the offsets of all shapes, labels, and refs across an axis Args: - axis: Axis to mirror across (0: x-axis, 1: y-axis) + across_axis: Axis to mirror across + (0: mirror across x axis, 1: mirror across y axis) Returns: self """ for entry in chain(chain_elements(self.shapes, self.refs, self.labels), self.ports.values()): - cast('Positionable', entry).offset[1 - axis] *= -1 - if isinstance(entry, Repeatable) and entry.repetition is not None: - entry.repetition.mirror(axis) + cast('Positionable', entry).offset[1 - across_axis] *= -1 return self - def mirror_elements(self, axis: int = 0) -> Self: + def mirror_elements(self, across_axis: int = 0) -> Self: """ - Intrinsic transformation part: Mirror each shape, ref, label, and port relative - to its offset. This does NOT affect their repetition grids. 
+        Mirror each shape, ref, and pattern across an axis, relative
+        to its offset

         Args:
-            axis: Axis to mirror across
-                0: mirror across x axis (flip y),
-                1: mirror across y axis (flip x)
+            across_axis: Axis to mirror across
+                (0: mirror across x axis, 1: mirror across y axis)

         Returns:
             self
         """
-        for entry in chain(chain_elements(self.shapes, self.refs, self.labels), self.ports.values()):
-            if isinstance(entry, Mirrorable):
-                entry.mirror(axis=axis)
-        self._log_bulk_update(f"mirror_elements({axis})")
+        for entry in chain(chain_elements(self.shapes, self.refs), self.ports.values()):
+            cast('Mirrorable', entry).mirror(across_axis)
         return self

-    def mirror(self, axis: int = 0) -> Self:
+    def mirror(self, across_axis: int = 0) -> Self:
         """
-        Extrinsic transformation: Mirror the Pattern across an axis through its origin.
-        This affects all elements' offsets and their internal orientations.
+        Mirror the Pattern across an axis

         Args:
-            axis: Axis to mirror across (0: x-axis, 1: y-axis).
+            across_axis: Axis to mirror across
+                (0: mirror across x axis, 1: mirror across y axis)

         Returns:
             self
         """
-        self.mirror_elements(axis=axis)
-        self.mirror_element_centers(axis=axis)
-        self._log_bulk_update(f"mirror({axis})")
+        self.mirror_elements(across_axis)
+        self.mirror_element_centers(across_axis)
         return self

     def copy(self) -> Self:
@@ -864,7 +787,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
         Returns:
             A deep copy of the current Pattern.
         """
-        return self.deepcopy()
+        return copy.deepcopy(self)

     def deepcopy(self) -> Self:
         """
@@ -1007,28 +930,6 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
             del self.labels[layer]
         return self

-    def resolve_repeated_refs(self) -> Self:
-        """
-        Expand all repeated references into multiple individual references.
-        Alters the current pattern in-place.
-
-        Returns:
-            self
-        """
-        new_refs: defaultdict[str | None, list[Ref]] = defaultdict(list)
-        for target, rseq in self.refs.items():
-            for ref in rseq:
-                if ref.repetition is None:
-                    new_refs[target].append(ref)
-                else:
-                    for dd in ref.repetition.displacements:
-                        new_ref = ref.deepcopy()
-                        new_ref.offset = ref.offset + dd
-                        new_ref.repetition = None
-                        new_refs[target].append(new_ref)
-        self.refs = new_refs
-        return self
-
     def prune_refs(self) -> Self:
         """
         Remove empty ref lists in `self.refs`.
@@ -1080,16 +981,10 @@
                 if target_pat is None:
                     raise PatternError(f'Circular reference in {name} to {target}')

-                ports_only = flatten_ports and bool(target_pat.ports)
-                if target_pat.is_empty() and not ports_only:    # avoid some extra allocations
+                if target_pat.is_empty():    # avoid some extra allocations
                     continue

                 for ref in refs:
-                    if flatten_ports and ref.repetition is not None and target_pat.ports:
-                        raise PatternError(
-                            f'Cannot flatten ports from repeated ref to {target!r}; '
-                            'flatten with flatten_ports=False or expand/rename the ports manually first.'
-                            )
                     p = ref.as_pattern(pattern=target_pat)
                     if not flatten_ports:
                         p.ports.clear()
@@ -1108,8 +1003,6 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
             line_color: str = 'k',
             fill_color: str = 'none',
             overdraw: bool = False,
-            filename: str | None = None,
-            ports: bool = False,
             ) -> None:
         """
         Draw a picture of the Pattern and wait for the user to inspect it
@@ -1120,18 +1013,15 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
          klayout or a different GDS viewer!

         Args:
-            library: Mapping of {name: Pattern} for resolving references. Required if `self.has_refs()`.
-            offset: Coordinates to offset by before drawing.
-            line_color: Outlines are drawn with this color.
-            fill_color: Interiors are drawn with this color.
-            overdraw: Whether to create a new figure or draw on a pre-existing one.
-            filename: If provided, save the figure to this file instead of showing it.
-            ports: If True, annotate the plot with arrows representing the ports.
+            offset: Coordinates to offset by before drawing
+            line_color: Outlines are drawn with this color (passed to `matplotlib.collections.PolyCollection`)
+            fill_color: Interiors are drawn with this color (passed to `matplotlib.collections.PolyCollection`)
+            overdraw: Whether to create a new figure or draw on a pre-existing one
         """
         # TODO: add text labels to visualize()
         try:
-            from matplotlib import pyplot       # type: ignore  #noqa: PLC0415
-            import matplotlib.collections       # type: ignore  #noqa: PLC0415
+            from matplotlib import pyplot       # type: ignore
+            import matplotlib.collections       # type: ignore
         except ImportError:
             logger.exception('Pattern.visualize() depends on matplotlib!\n'
                              + 'Make sure to install masque with the [visualize] option to pull in the needed dependencies.')
@@ -1140,155 +1030,48 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
         if self.has_refs() and library is None:
             raise PatternError('Must provide a library when visualizing a pattern with refs')

-        # Cache for {Pattern object ID: List of local polygon vertex arrays}
-        # Polygons are stored relative to the pattern's origin (offset included)
-        poly_cache: dict[int, list[NDArray[numpy.float64]]] = {}
+        offset = numpy.asarray(offset, dtype=float)

-        def get_local_polys(pat: 'Pattern') -> list[NDArray[numpy.float64]]:
-            pid = id(pat)
-            if pid not in poly_cache:
-                polys = []
-                for shape in chain.from_iterable(pat.shapes.values()):
-                    for ss in shape.to_polygons():
-                        # Shape.to_polygons() returns Polygons with their own offsets and vertices.
-                        # We need to expand any shape-level repetition here.
-                        v_base = ss.vertices + ss.offset
-                        if ss.repetition is not None:
-                            for disp in ss.repetition.displacements:
-                                polys.append(v_base + disp)
-                        else:
-                            polys.append(v_base)
-                poly_cache[pid] = polys
-            return poly_cache[pid]
-
-        all_polygons: list[NDArray[numpy.float64]] = []
-        port_info: list[tuple[str, NDArray[numpy.float64], float]] = []
-
-        def collect_polys_recursive(
-                pat: 'Pattern',
-                c_offset: NDArray[numpy.float64],
-                c_rotation: float,
-                c_mirrored: bool,
-                c_scale: float,
-                ) -> None:
-            # Current transform: T(c_offset) * R(c_rotation) * M(c_mirrored) * S(c_scale)
-
-            # 1. Transform and collect local polygons
-            local_polys = get_local_polys(pat)
-            if local_polys:
-                rot_mat = rotation_matrix_2d(c_rotation)
-                for v in local_polys:
-                    vt = v * c_scale
-                    if c_mirrored:
-                        vt = vt.copy()
-                        vt[:, 1] *= -1
-                    vt = (rot_mat @ vt.T).T + c_offset
-                    all_polygons.append(vt)
-
-            # 2. Collect ports if requested
-            if ports:
-                for name, p in pat.ports.items():
-                    pt_v = p.offset * c_scale
-                    if c_mirrored:
-                        pt_v = pt_v.copy()
-                        pt_v[1] *= -1
-                    pt_v = rotation_matrix_2d(c_rotation) @ pt_v + c_offset
-
-                    if p.rotation is not None:
-                        pt_rot = p.rotation
-                        if c_mirrored:
-                            pt_rot = -pt_rot
-                        pt_rot += c_rotation
-                        port_info.append((name, pt_v, pt_rot))
-
-            # 3. Recurse into refs
-            for target, refs in pat.refs.items():
-                if target is None:
-                    continue
-                assert library is not None
-                target_pat = library[target]
-                for ref in refs:
-                    # Ref order of operations: mirror, rotate, scale, translate, repeat
-
-                    # Combined scale and mirror
-                    r_scale = c_scale * ref.scale
-                    r_mirrored = c_mirrored ^ ref.mirrored
-
-                    # Combined rotation: push c_mirrored and c_rotation through ref.rotation
-                    r_rot_relative = -ref.rotation if c_mirrored else ref.rotation
-                    r_rotation = c_rotation + r_rot_relative
-
-                    # Offset composition helper
-                    def get_full_offset(rel_offset: NDArray[numpy.float64]) -> NDArray[numpy.float64]:
-                        o = rel_offset * c_scale
-                        if c_mirrored:
-                            o = o.copy()
-                            o[1] *= -1
-                        return rotation_matrix_2d(c_rotation) @ o + c_offset
-
-                    if ref.repetition is not None:
-                        for disp in ref.repetition.displacements:
-                            collect_polys_recursive(
-                                target_pat,
-                                get_full_offset(ref.offset + disp),
-                                r_rotation,
-                                r_mirrored,
-                                r_scale
-                                )
-                    else:
-                        collect_polys_recursive(
-                            target_pat,
-                            get_full_offset(ref.offset),
-                            r_rotation,
-                            r_mirrored,
-                            r_scale
-                            )
-
-        # Start recursive collection
-        collect_polys_recursive(self, numpy.asarray(offset, dtype=float), 0.0, False, 1.0)
-
-        # Plotting
         if not overdraw:
             figure = pyplot.figure()
+            pyplot.axis('equal')
         else:
             figure = pyplot.gcf()
         axes = figure.gca()

-        if all_polygons:
-            mpl_poly_collection = matplotlib.collections.PolyCollection(
-                all_polygons,
-                facecolors = fill_color,
-                edgecolors = line_color,
-                )
-            axes.add_collection(mpl_poly_collection)
+        polygons = []
+        for shape in chain.from_iterable(self.shapes.values()):
+            polygons += [offset + s.offset + s.vertices for s in shape.to_polygons()]

-        if ports:
-            for port_name, pt_v, pt_rot in port_info:
-                p1 = pt_v
-                angle = pt_rot
-                size = 1.0      # arrow size
-                p2 = p1 + size * numpy.array([numpy.cos(angle), numpy.sin(angle)])
+        mpl_poly_collection = matplotlib.collections.PolyCollection(
+            polygons,
+            facecolors=fill_color,
+            edgecolors=line_color,
+            )
+        axes.add_collection(mpl_poly_collection)
+        pyplot.axis('equal')

-                axes.annotate(
-                    port_name,
-                    xy = tuple(p1),
-                    xytext = tuple(p2),
-                    arrowprops = dict(arrowstyle="->", color='g', linewidth=1),
-                    color = 'g',
-                    fontsize = 8,
+        for target, refs in self.refs.items():
+            if target is None:
+                continue
+            if not refs:
+                continue
+            assert library is not None
+            target_pat = library[target]
+            for ref in refs:
+                ref.as_pattern(target_pat).visualize(
+                    library=library,
+                    offset=offset,
+                    overdraw=True,
+                    line_color=line_color,
+                    fill_color=fill_color,
                     )

-        axes.autoscale_view()
-        axes.set_aspect('equal')
-        if not overdraw:
-            axes.set_xlabel('x')
-            axes.set_ylabel('y')
-        if filename:
-            figure.savefig(filename)
-        else:
-            figure.show()
+        pyplot.xlabel('x')
+        pyplot.ylabel('y')
+        pyplot.show()

     # @overload
     # def place(
@@ -1331,7 +1114,6 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
             port_map: dict[str, str | None] | None = None,
             skip_port_check: bool = False,
             append: bool = False,
-            skip_geometry: bool = False,
             ) -> Self:
         """
         Instantiate or append the pattern `other` into the current pattern, adding its
@@ -1363,10 +1145,6 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
             append: If `True`, `other` is appended instead of being referenced.
                 Note that this does not flatten `other`, so its refs will still
                 be refs (now inside `self`).
-            skip_geometry: If `True`, the operation only updates the port list and
-                skips adding any geometry (shapes, labels, or references). This
-                allows the pattern assembly to proceed for port-tracking purposes
-                even when layout generation is suppressed.

         Returns:
             self
@@ -1381,27 +1159,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
             port_map = {}

         if not skip_port_check:
-            port_map, overwrite_targets = self._resolve_insert_mapping(
-                other.ports.keys(),
-                map_in=None,
-                map_out=port_map,
-                allow_conflicts=skip_geometry,
-                )
-            for target in overwrite_targets:
-                self.ports.pop(target, None)
-
-        if not skip_geometry:
-            if append:
-                if isinstance(other, Abstract):
-                    raise PatternError('Must provide a full `Pattern` (not an `Abstract`) when appending!')
-                if other.annotations is not None and self.annotations is not None:
-                    annotation_conflicts = set(self.annotations.keys()) & set(other.annotations.keys())
-                    if annotation_conflicts:
-                        raise PatternError(f'Annotation keys overlap: {annotation_conflicts}')
-            else:
-                if isinstance(other, Pattern):
-                    raise PatternError('Must provide an `Abstract` (not a `Pattern`) when creating a reference. '
-                                       'Use `append=True` if you intended to append the full geometry.')
+            self.check_ports(other.ports.keys(), map_in=None, map_out=port_map)

         ports = {}
         for name, port in other.ports.items():
@@ -1418,12 +1176,10 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
             pp.rotate_around(pivot, rotation)
             pp.translate(offset)
             self.ports[name] = pp
-            self._log_port_update(name)
-
-        if skip_geometry:
-            return self

         if append:
+            if isinstance(other, Abstract):
+                raise PatternError('Must provide a full `Pattern` (not an `Abstract`) when appending!')
             other_copy = other.deepcopy()
             other_copy.ports.clear()
             if mirrored:
@@ -1432,6 +1188,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
             other_copy.translate_elements(offset)
             self.append(other_copy)
         else:
+            assert not isinstance(other, Pattern)
             ref = Ref(mirrored=mirrored)
             ref.rotate_around(pivot, rotation)
             ref.translate(offset)
@@ -1477,7 +1234,6 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
             set_rotation: bool | None = None,
             append: bool = False,
             ok_connections: Iterable[tuple[str, str]] = (),
-            skip_geometry: bool = False,
             ) -> Self:
         """
         Instantiate or append a pattern into the current pattern, connecting
           ports specified by `map_out`.

         Examples:
         =========
         - `my_pat.plug(subdevice, {'A': 'C', 'B': 'B'}, map_out={'D': 'myport'})`
             instantiates `subdevice` into `my_pat`, plugging ports 'A' and 'B'
             of `my_pat` into ports 'C' and 'B' of `subdevice`. The connected ports
@@ -1532,11 +1288,6 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
                 any other ptype. Non-allowed ptype connections will emit a warning.
                 Order is ignored, i.e. `(a, b)` is equivalent to `(b, a)`.
-            skip_geometry: If `True`, only ports are updated and geometry is
-                skipped. If a valid transform cannot be found (e.g. due to
-                misaligned ports), a 'best-effort' dummy transform is used
-                to ensure new ports are still added at approximate locations,
-                allowing downstream routing to continue.

         Returns:
             self
@@ -1568,59 +1319,23 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
             out_port_name = next(iter(set(other.ports.keys()) - set(map_in.values())))
             map_out = {out_port_name: next(iter(map_in.keys()))}

-        map_out, overwrite_targets = self._resolve_insert_mapping(
-            other.ports.keys(),
+        self.check_ports(other.ports.keys(), map_in, map_out)
+        translation, rotation, pivot = self.find_transform(
+            other,
             map_in,
-            map_out,
-            allow_conflicts=skip_geometry,
+            mirrored = mirrored,
+            set_rotation = set_rotation,
+            ok_connections = ok_connections,
             )

-        if not skip_geometry:
-            if append:
-                if isinstance(other, Abstract):
-                    raise PatternError('Must provide a full `Pattern` (not an `Abstract`) when appending!')
-                if other.annotations is not None and self.annotations is not None:
-                    annotation_conflicts = set(self.annotations.keys()) & set(other.annotations.keys())
-                    if annotation_conflicts:
-                        raise PatternError(f'Annotation keys overlap: {annotation_conflicts}')
-            elif isinstance(other, Pattern):
-                raise PatternError('Must provide an `Abstract` (not a `Pattern`) when creating a reference. '
-                                   'Use `append=True` if you intended to append the full geometry.')
-        try:
-            translation, rotation, pivot = self.find_transform(
-                other,
-                map_in,
-                mirrored = mirrored,
-                set_rotation = set_rotation,
-                ok_connections = ok_connections,
-                )
-        except PortError:
-            if not skip_geometry:
-                raise
-            logger.warning("Port transform failed for dead device. Using dummy transform.")
-            if map_in:
-                ki, vi = next(iter(map_in.items()))
-                s_port = self.ports[ki]
-                o_port = other.ports[vi].deepcopy()
-                if mirrored:
-                    o_port.mirror()
-                    o_port.offset[1] *= -1
-                translation = s_port.offset - o_port.offset
-                rotation = (s_port.rotation - o_port.rotation - pi) if (s_port.rotation is not None and o_port.rotation is not None) else 0
-                pivot = o_port.offset
-            else:
-                translation = numpy.zeros(2)
-                rotation = 0.0
-                pivot = numpy.zeros(2)
-
-        for target in overwrite_targets:
-            self.ports.pop(target, None)

         # get rid of plugged ports
         for ki, vi in map_in.items():
             del self.ports[ki]
-            self._log_port_removal(ki)
             map_out[vi] = None

+        if isinstance(other, Pattern):
+            assert append, 'Got a name (not an abstract) but was asked to reference (not append)'
+
         self.place(
             other,
             offset = translation,
@@ -1630,7 +1345,6 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
             port_map = map_out,
             skip_port_check = True,
             append = append,
-            skip_geometry = skip_geometry,
             )
         return self
@@ -1664,7 +1378,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
         current device.

         Args:
-            source: A collection of ports (e.g. Pattern, Pather, or dict)
+            source: A collection of ports (e.g. Pattern, Builder, or dict)
                 from which to create the interface.
             in_prefix: Prepended to port names for newly-created ports with
                 reversed directions compared to the current device.
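The `interface()` docstring above describes creating paired ports whose directions are reversed relative to the source device. The reversal itself is just a half-turn (adding π to the port rotation); a standalone sketch, using plain tuples rather than masque's `Port` objects (the port data and helper name here are hypothetical):

```python
import math

# Minimal stand-in for the reversed-direction ports that interface()
# creates: for each source port, the "in_"-prefixed port faces the
# opposite way (rotation + pi), while the "out_"-prefixed port keeps
# the original direction. Each port is (offset, rotation_in_radians).

def make_interface(ports, in_prefix='in_', out_prefix='out_'):
    new_ports = {}
    for name, (offset, rotation) in ports.items():
        new_ports[in_prefix + name] = (offset, (rotation + math.pi) % (2 * math.pi))
        new_ports[out_prefix + name] = (offset, rotation)
    return new_ports

ports = {'A': ((0.0, 0.0), 0.0)}     # one east-facing port at the origin
iface = make_interface(ports)
```

The two generated ports coincide in position, so plugging into `in_A` leaves `out_A` available for further routing.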
@@ -1692,13 +1406,9 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
         else:
             raise PatternError(f'Unable to get ports from {type(source)}: {source}')

-        if port_map is not None:
+        if port_map:
             if isinstance(port_map, dict):
                 missing_inkeys = set(port_map.keys()) - set(orig_ports.keys())
-                port_targets = list(port_map.values())
-                duplicate_targets = {vv for vv in port_targets if port_targets.count(vv) > 1}
-                if duplicate_targets:
-                    raise PortError(f'Duplicate targets in `port_map`: {duplicate_targets}')
                 mapped_ports = {port_map[k]: v for k, v in orig_ports.items() if k in port_map}
             else:
                 port_set = set(port_map)
diff --git a/masque/ports.py b/masque/ports.py
index e745880..0211723 100644
--- a/masque/ports.py
+++ b/masque/ports.py
@@ -2,7 +2,6 @@ from typing import overload, Self, NoReturn, Any
 from collections.abc import Iterable, KeysView, ValuesView, Mapping
 import logging
 import functools
-import copy
 from collections import Counter
 from abc import ABCMeta, abstractmethod
 from itertools import chain
@@ -11,17 +10,16 @@ import numpy
 from numpy import pi
 from numpy.typing import ArrayLike, NDArray

-from .traits import PositionableImpl, PivotableImpl, Copyable, Mirrorable, Flippable
+from .traits import PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable
 from .utils import rotate_offsets_around, rotation_matrix_2d
 from .error import PortError, format_stacktrace


 logger = logging.getLogger(__name__)
-port_logger = logging.getLogger('masque.ports')


 @functools.total_ordering
-class Port(PivotableImpl, PositionableImpl, Mirrorable, Flippable, Copyable):
+class Port(PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable):
     """
     A point at which a `Device` can be snapped to another `Device`.
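As shown in the `Port.mirror()` hunk elsewhere in this diff, mirroring a port across the x axis only negates its rotation (a `None` rotation means "any direction"); the offset flip is handled separately at the `Pattern` level. A minimal standalone sketch of that rule, with a hypothetical helper name:

```python
import math

# Port.mirror() negates the rotation only; the offset flip is performed
# separately by Pattern.mirror_element_centers(). A port facing +45
# degrees ends up facing -45 degrees (i.e. 315 degrees, normalized).

def mirror_rotation(rotation):
    """Mirror a port direction across the x axis (None = unconstrained)."""
    if rotation is None:
        return None
    return (-rotation) % (2 * math.pi)

r = mirror_rotation(math.pi / 4)     # 45 degrees -> 315 degrees
```

Keeping the angle normalized to [0, 2π) matches how masque stores rotations modulo a full turn.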
@@ -93,12 +91,6 @@ class Port(PivotableImpl, PositionableImpl, Mirrorable, Flippable, Copyable):
     def copy(self) -> Self:
         return self.deepcopy()

-    def __deepcopy__(self, memo: dict | None = None) -> Self:
-        memo = {} if memo is None else memo
-        new = copy.copy(self)
-        new._offset = self._offset.copy()
-        return new
-
     def get_bounds(self) -> NDArray[numpy.float64]:
         return numpy.vstack((self.offset, self.offset))

@@ -107,27 +99,6 @@
         self.ptype = ptype
         return self

-    def flip_across(self, axis: int | None = None, *, x: float | None = None, y: float | None = None) -> Self:
-        """
-        Mirror the object across a line in the container's coordinate system.
-
-        Note this operation is performed relative to the pattern's origin and modifies the port's offset.
-
-        Args:
-            axis: Axis to mirror across. 0 mirrors across y=0. 1 mirrors across x=0.
-            x: Vertical line x=val to mirror across.
-            y: Horizontal line y=val to mirror across.
-
-        Returns:
-            self
-        """
-        axis, pivot = self._check_flip_args(axis=axis, x=x, y=y)
-        self.translate(-pivot)
-        self.mirror(axis)
-        self.offset[1 - axis] *= -1
-        self.translate(+pivot)
-        return self
-
     def mirror(self, axis: int = 0) -> Self:
         if self.rotation is not None:
             self.rotation *= -1
@@ -143,34 +114,6 @@
         self.rotation = rotation
         return self

-    def describe(self) -> str:
-        """
-        Returns a human-readable description of the port's state including cardinal directions.
-        """
-        deg = numpy.rad2deg(self.rotation) if self.rotation is not None else None
-
-        cardinal = ""
-        travel_dir = ""
-
-        if self.rotation is not None:
-            dirs = {0: "East (+x)", 90: "North (+y)", 180: "West (-x)", 270: "South (-y)"}
-            # normalize to [0, 360)
-            deg_norm = deg % 360
-
-            # Find closest cardinal
-            closest = min(dirs.keys(), key=lambda x: abs((deg_norm - x + 180) % 360 - 180))
-            if numpy.isclose((deg_norm - closest + 180) % 360 - 180, 0, atol=1e-3):
-                cardinal = f" ({dirs[closest]})"
-
-            # Travel direction (rotation + 180)
-            t_deg = (deg_norm + 180) % 360
-            closest_t = min(dirs.keys(), key=lambda x: abs((t_deg - x + 180) % 360 - 180))
-            if numpy.isclose((t_deg - closest_t + 180) % 360 - 180, 0, atol=1e-3):
-                travel_dir = f" (Travel -> {dirs[closest_t]})"
-
-        deg_text = 'any' if deg is None else f'{deg:g}'
-        return f"pos=({self.x:g}, {self.y:g}), rot={deg_text}{cardinal}{travel_dir}"
-
     def __repr__(self) -> str:
         if self.rotation is None:
             rot = 'any'
@@ -236,19 +179,6 @@ def ports(self, value: dict[str, Port]) -> None:
         pass

-    def _log_port_update(self, name: str) -> None:
-        """ Log the current state of the named port """
-        port_logger.debug("Port %s: %s", name, self.ports[name].describe())
-
-    def _log_port_removal(self, name: str) -> None:
-        """ Log that the named port has been removed """
-        port_logger.debug("Port %s: removed", name)
-
-    def _log_bulk_update(self, label: str) -> None:
-        """ Log all current ports at DEBUG level """
-        for name, port in self.ports.items():
-            port_logger.debug("%s: Port %s: %s", label, name, port)
-
     @overload
     def __getitem__(self, key: str) -> Port:
         pass
@@ -273,12 +203,6 @@ class PortList(metaclass=ABCMeta):
         else:               # noqa: RET505
             return {k: self.ports[k] for k in key}

-    def measure_travel(self, src: str, dst: str) -> tuple[NDArray[numpy.float64], float | None]:
-        """
-        Convenience wrapper for measuring travel between two named ports.
-        """
-        return self[src].measure_travel(self[dst])
-
     def __contains__(self, key: str) -> NoReturn:
         raise NotImplementedError('PortsList.__contains__ is left unimplemented. Use `key in container.ports` instead.')

@@ -308,7 +232,6 @@ class PortList(metaclass=ABCMeta):
                 raise PortError(f'Port {name} already exists.')
             assert name not in self.ports
         self.ports[name] = value
-        self._log_port_update(name)
         return self

     def rename_ports(
@@ -330,147 +253,17 @@ class PortList(metaclass=ABCMeta):
         Returns:
             self
         """
-        self._rename_ports_impl(mapping, overwrite=overwrite)
-        return self
-
-    @staticmethod
-    def _normalize_target_mapping(
-            ordered_targets: Iterable[tuple[str, str | None]],
-            explicit_map: Mapping[str, str | None] | None = None,
-            ) -> dict[str, str | None]:
-        ordered_targets = list(ordered_targets)
-        normalized = {} if explicit_map is None else copy.deepcopy(dict(explicit_map))
-        winners = {
-            target: source
-            for source, target in ordered_targets
-            if target is not None
-            }
-        for source, target in ordered_targets:
-            if target is not None and winners[target] != source:
-                normalized[source] = None
-        return normalized
-
-    def _resolve_insert_mapping(
-            self,
-            other_names: Iterable[str],
-            map_in: Mapping[str, str] | None = None,
-            map_out: Mapping[str, str | None] | None = None,
-            *,
-            allow_conflicts: bool = False,
-            ) -> tuple[dict[str, str | None], set[str]]:
-        if map_in is None:
-            map_in = {}
-
-        normalized_map_out = {} if map_out is None else copy.deepcopy(dict(map_out))
-        other_names = list(other_names)
-        other = set(other_names)
-
-        missing_inkeys = set(map_in.keys()) - set(self.ports.keys())
-        if missing_inkeys:
-            raise PortError(f'`map_in` keys not present in device: {missing_inkeys}')
-
-        missing_invals = set(map_in.values()) - other
-        if missing_invals:
-            raise PortError(f'`map_in` values not present in other device: {missing_invals}')
-
-        map_in_counts = Counter(map_in.values())
-        conflicts_in = {kk for kk, vv in map_in_counts.items() if vv > 1}
-        if conflicts_in:
-            raise PortError(f'Duplicate values in `map_in`: {conflicts_in}')
-
-        missing_outkeys = set(normalized_map_out.keys()) - other
-        if missing_outkeys:
-            raise PortError(f'`map_out` keys not present in other device: {missing_outkeys}')
-
-        connected_outkeys = set(normalized_map_out.keys()) & set(map_in.values())
-        if connected_outkeys:
-            raise PortError(f'`map_out` keys conflict with connected ports: {connected_outkeys}')
-
-        orig_remaining = set(self.ports.keys()) - set(map_in.keys())
-        connected = set(map_in.values())
-        if allow_conflicts:
-            ordered_targets = [
-                (name, normalized_map_out.get(name, name))
-                for name in other_names
-                if name not in connected
-                ]
-            normalized_map_out = self._normalize_target_mapping(ordered_targets, normalized_map_out)
-            final_targets = {
-                normalized_map_out.get(name, name)
-                for name in other_names
-                if name not in connected and normalized_map_out.get(name, name) is not None
-                }
-            overwrite_targets = {target for target in final_targets if target in orig_remaining}
-            return normalized_map_out, overwrite_targets
-
-        other_remaining = other - set(normalized_map_out.keys()) - connected
-        mapped_vals = set(normalized_map_out.values())
-        mapped_vals.discard(None)
-
-        conflicts_final = orig_remaining & (other_remaining | mapped_vals)
-        if conflicts_final:
-            raise PortError(f'Device ports conflict with existing ports: {conflicts_final}')
-
-        conflicts_partial = other_remaining & mapped_vals
-        if conflicts_partial:
-            raise PortError(f'`map_out` targets conflict with non-mapped outputs: {conflicts_partial}')
-
-        map_out_counts = Counter(normalized_map_out.values())
-        map_out_counts[None] = 0
-        conflicts_out = {kk for kk, vv in map_out_counts.items() if vv > 1}
-        if conflicts_out:
-            raise PortError(f'Duplicate targets in `map_out`: {conflicts_out}')
-        return normalized_map_out, set()
-
-    def _rename_ports_impl(
-            self,
-            mapping: Mapping[str, str | None],
-            *,
-            overwrite: bool = False,
-            allow_collisions: bool = False,
-            ) -> dict[str, str]:
         if not overwrite:
             duplicates = (set(self.ports.keys()) - set(mapping.keys())) & set(mapping.values())
             if duplicates:
                 raise PortError(f'Unrenamed ports would be overwritten: {duplicates}')

-        missing = set(mapping) - set(self.ports)
-        if missing:
-            raise PortError(f'Ports to rename were not found: {missing}')
-        renamed_targets = [vv for vv in mapping.values() if vv is not None]
-        if not allow_collisions:
-            duplicate_targets = {vv for vv in renamed_targets if renamed_targets.count(vv) > 1}
-            if duplicate_targets:
-                raise PortError(f'Renamed ports would collide: {duplicate_targets}')
-        winners = {
-            target: source
-            for source, target in mapping.items()
-            if target is not None
-            }
-        overwritten = {
-            target
-            for target, source in winners.items()
-            if target in self.ports and target not in mapping and target != source
-            }
+        renamed = {vv: self.ports.pop(kk) for kk, vv in mapping.items()}
+        if None in renamed:
+            del renamed[None]

-        for kk, vv in mapping.items():
-            if vv is None or vv != kk:
-                self._log_port_removal(kk)
-
-        source_ports = {kk: self.ports.pop(kk) for kk in mapping}
-        for target in overwritten:
-            self.ports.pop(target, None)
-
-        renamed = {
-            vv: source_ports[kk]
-            for kk, vv in mapping.items()
-            if vv is not None and winners[vv] == kk
-            }
         self.ports.update(renamed)      # type: ignore
-
-        for vv in winners:
-            self._log_port_update(vv)
-        return winners
+        return self

     def add_port_pair(
             self,
@@ -492,16 +285,12 @@ class PortList(metaclass=ABCMeta):
         Returns:
             self
         """
-        if names[0] == names[1]:
-            raise PortError(f'Port names must be distinct: {names[0]!r}')
         new_ports = {
             names[0]: Port(offset, rotation=rotation, ptype=ptype),
             names[1]: Port(offset, rotation=rotation + pi, ptype=ptype),
             }
         self.check_ports(names)
         self.ports.update(new_ports)
-        self._log_port_update(names[0])
-        self._log_port_update(names[1])
         return self

     def plugged(
@@ -524,19 +313,7 @@ class PortList(metaclass=ABCMeta):
         Raises:
             `PortError` if the ports are not properly aligned.
         """
-        if not connections:
-            raise PortError('Must provide at least one port connection')
-        missing_a = set(connections) - set(self.ports)
-        if missing_a:
-            raise PortError(f'Connection source ports were not found: {missing_a}')
-        missing_b = set(connections.values()) - set(self.ports)
-        if missing_b:
-            raise PortError(f'Connection destination ports were not found: {missing_b}')
         a_names, b_names = list(zip(*connections.items(), strict=True))
-        used_names = list(chain(a_names, b_names))
-        duplicate_names = {name for name in used_names if used_names.count(name) > 1}
-        if duplicate_names:
-            raise PortError(f'Each port may appear in at most one connection: {duplicate_names}')
         a_ports = [self.ports[pp] for pp in a_names]
         b_ports = [self.ports[pp] for pp in b_names]
@@ -583,7 +360,6 @@ class PortList(metaclass=ABCMeta):
         for pp in chain(a_names, b_names):
             del self.ports[pp]
-            self._log_port_removal(pp)
         return self

     def check_ports(
@@ -614,7 +390,45 @@ class PortList(metaclass=ABCMeta):
             `PortError` if there are any duplicate names after `map_in`
             and `map_out` are applied.
         """
-        self._resolve_insert_mapping(other_names, map_in, map_out)
+        if map_in is None:
+            map_in = {}
+
+        if map_out is None:
+            map_out = {}
+
+        other = set(other_names)
+
+        missing_inkeys = set(map_in.keys()) - set(self.ports.keys())
+        if missing_inkeys:
+            raise PortError(f'`map_in` keys not present in device: {missing_inkeys}')
+
+        missing_invals = set(map_in.values()) - other
+        if missing_invals:
+            raise PortError(f'`map_in` values not present in other device: {missing_invals}')
+
+        missing_outkeys = set(map_out.keys()) - other
+        if missing_outkeys:
+            raise PortError(f'`map_out` keys not present in other device: {missing_outkeys}')
+
+        orig_remaining = set(self.ports.keys()) - set(map_in.keys())
+        other_remaining = other - set(map_out.keys()) - set(map_in.values())
+        mapped_vals = set(map_out.values())
+        mapped_vals.discard(None)
+
+        conflicts_final = orig_remaining & (other_remaining | mapped_vals)
+        if conflicts_final:
+            raise PortError(f'Device ports conflict with existing ports: {conflicts_final}')
+
+        conflicts_partial = other_remaining & mapped_vals
+        if conflicts_partial:
+            raise PortError(f'`map_out` targets conflict with non-mapped outputs: {conflicts_partial}')
+
+        map_out_counts = Counter(map_out.values())
+        map_out_counts[None] = 0
+        conflicts_out = {kk for kk, vv in map_out_counts.items() if vv > 1}
+        if conflicts_out:
+            raise PortError(f'Duplicate targets in `map_out`: {conflicts_out}')
+
         return self

     def find_transform(
@@ -654,8 +468,6 @@ class PortList(metaclass=ABCMeta):

         The rotation should be performed before the translation.
         """
-        if not map_in:
-            raise PortError('Must provide at least one port connection')
         s_ports = self[map_in.keys()]
         o_ports = other[map_in.values()]
         return self.find_port_transform(
@@ -707,8 +519,6 @@ class PortList(metaclass=ABCMeta):

         The rotation should be performed before the translation.
         """
-        if not map_in:
-            raise PortError('Must provide at least one port connection')
         s_offsets = numpy.array([p.offset for p in s_ports.values()])
         o_offsets = numpy.array([p.offset for p in o_ports.values()])
         s_types = [p.ptype for p in s_ports.values()]
@@ -738,7 +548,7 @@ class PortList(metaclass=ABCMeta):
         rotations = numpy.mod(s_rotations - o_rotations - pi, 2 * pi)
         if not has_rot.any():
             if set_rotation is None:
                 raise PortError('Must provide set_rotation if rotation is indeterminate')
             rotations[:] = set_rotation
         else:
             rotations[~has_rot] = rotations[has_rot][0]
@@ -763,3 +573,4 @@ class PortList(metaclass=ABCMeta):
             raise PortError(msg)

         return translations[0], rotations[0], o_offsets[0]
+
diff --git a/masque/ref.py b/masque/ref.py
index 0cc911f..b3a684c 100644
--- a/masque/ref.py
+++ b/masque/ref.py
@@ -15,8 +15,7 @@ from .utils import annotations_t, rotation_matrix_2d, annotations_eq, annotation
 from .repetition import Repetition
 from .traits import (
     PositionableImpl, RotatableImpl, ScalableImpl,
-    PivotableImpl, Copyable, RepeatableImpl, AnnotatableImpl,
-    FlippableImpl,
+    Mirrorable, PivotableImpl, Copyable, RepeatableImpl, AnnotatableImpl,
     )
@@ -26,9 +25,8 @@ if TYPE_CHECKING:

 @functools.total_ordering
 class Ref(
-        FlippableImpl, PivotableImpl, RepeatableImpl, AnnotatableImpl,
-        PositionableImpl, RotatableImpl, ScalableImpl,
-        Copyable,
+        PositionableImpl, RotatableImpl, ScalableImpl, Mirrorable,
+        PivotableImpl, Copyable, RepeatableImpl, AnnotatableImpl,
         ):
     """
     `Ref` provides basic support for nesting Pattern objects within each other.
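The removed `visualize()` code above notes the `Ref` order of operations as "mirror, rotate, scale, translate, repeat". Applying that order to a single vertex can be sketched with plain arithmetic; the helper name and the numeric values here are illustrative, not masque's API:

```python
import math

# Sketch of the Ref order of operations (mirror, then rotate, then
# scale, then translate) applied to one (x, y) vertex. A uniform scale
# commutes with rotation, but mirror must come before rotation.

def ref_transform(pt, *, mirrored, rotation, scale, offset):
    x, y = pt
    if mirrored:                              # 1. mirror across the x axis
        y = -y
    c, s = math.cos(rotation), math.sin(rotation)
    x, y = c * x - s * y, s * x + c * y       # 2. rotate counter-clockwise
    x, y = x * scale, y * scale               # 3. uniform scale
    return (x + offset[0], y + offset[1])     # 4. translate

# (1, 0) rotated 90 degrees -> (0, 1), scaled x2 -> (0, 2), shifted -> (10, 2)
p = ref_transform((1.0, 0.0), mirrored=False, rotation=math.pi / 2,
                  scale=2.0, offset=(10.0, 0.0))
```

A `Repetition`, when present, would then stamp out one such transformed copy per grid displacement.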
@@ -44,7 +42,7 @@ class Ref( __slots__ = ( '_mirrored', # inherited - '_offset', '_rotation', '_scale', '_repetition', '_annotations', + '_offset', '_rotation', '_scale', '_repetition', '_annotations', ) _mirrored: bool @@ -86,48 +84,24 @@ class Ref( self.repetition = repetition self.annotations = annotations if annotations is not None else {} - @classmethod - def _from_raw( - cls, - *, - offset: NDArray[numpy.float64], - rotation: float, - mirrored: bool, - scale: float, - repetition: Repetition | None, - annotations: annotations_t | None, - ) -> Self: - new = cls.__new__(cls) - new._offset = offset - new._rotation = rotation % (2 * pi) - new._scale = scale - new._mirrored = mirrored - new._repetition = repetition - new._annotations = annotations - return new - def __copy__(self) -> 'Ref': new = Ref( offset=self.offset.copy(), rotation=self.rotation, scale=self.scale, mirrored=self.mirrored, - repetition=self.repetition, - annotations=self.annotations, + repetition=copy.deepcopy(self.repetition), + annotations=copy.deepcopy(self.annotations), ) return new def __deepcopy__(self, memo: dict | None = None) -> 'Ref': memo = {} if memo is None else memo new = copy.copy(self) - new._offset = self._offset.copy() - new.repetition = copy.deepcopy(self.repetition, memo) - new.annotations = copy.deepcopy(self.annotations, memo) + #new.repetition = copy.deepcopy(self.repetition, memo) + #new.annotations = copy.deepcopy(self.annotations, memo) return new - def copy(self) -> 'Ref': - return self.deepcopy() - def __lt__(self, other: 'Ref') -> bool: if (self.offset != other.offset).any(): return tuple(self.offset) < tuple(other.offset) @@ -142,8 +116,6 @@ class Ref( return annotations_lt(self.annotations, other.annotations) def __eq__(self, other: Any) -> bool: - if type(self) is not type(other): - return False return ( numpy.array_equal(self.offset, other.offset) and self.mirrored == other.mirrored @@ -188,16 +160,16 @@ class Ref( return pattern def rotate(self, rotation: float)
-> Self: - """ - Intrinsic transformation: Rotate the target pattern relative to this Ref's - origin. This does NOT affect the repetition grid. - """ self.rotation += rotation + if self.repetition is not None: + self.repetition.rotate(rotation) return self def mirror(self, axis: int = 0) -> Self: self.mirror_target(axis) self.rotation *= -1 + if self.repetition is not None: + self.repetition.mirror(axis) return self def mirror_target(self, axis: int = 0) -> Self: @@ -215,11 +187,10 @@ class Ref( xys = self.offset[None, :] if self.repetition is not None: xys = xys + self.repetition.displacements - transforms = numpy.empty((xys.shape[0], 5)) + transforms = numpy.empty((xys.shape[0], 4)) transforms[:, :2] = xys transforms[:, 2] = self.rotation transforms[:, 3] = self.mirrored - transforms[:, 4] = self.scale return transforms def get_bounds_single( @@ -256,10 +227,7 @@ class Ref( bounds = numpy.vstack((numpy.min(corners, axis=0), numpy.max(corners, axis=0))) * self.scale + [self.offset] return bounds - - single_ref = self.deepcopy() - single_ref.repetition = None - return single_ref.as_pattern(pattern=pattern).get_bounds(library) + return self.as_pattern(pattern=pattern).get_bounds(library) def __repr__(self) -> str: rotation = f' r{numpy.rad2deg(self.rotation):g}' if self.rotation != 0 else '' diff --git a/masque/repetition.py b/masque/repetition.py index 9e8af26..5e7a7f0 100644 --- a/masque/repetition.py +++ b/masque/repetition.py @@ -34,7 +34,7 @@ class Repetition(Copyable, Rotatable, Mirrorable, Scalable, Bounded, metaclass=A pass @abstractmethod - def __lt__(self, other: 'Repetition') -> bool: + def __le__(self, other: 'Repetition') -> bool: pass @abstractmethod @@ -64,7 +64,7 @@ class Grid(Repetition): _a_count: int """ Number of instances along the direction specified by the `a_vector` """ - _b_vector: NDArray[numpy.float64] + _b_vector: NDArray[numpy.float64] | None """ Vector `[x, y]` specifying a second lattice vector for the grid. 
Specifies center-to-center spacing between adjacent elements. Can be `None` for a 1D array. @@ -113,22 +113,6 @@ class Grid(Repetition): self.a_count = a_count self.b_count = b_count - @classmethod - def _from_raw( - cls: type[GG], - *, - a_vector: NDArray[numpy.float64], - a_count: int, - b_vector: NDArray[numpy.float64], - b_count: int, - ) -> GG: - new = cls.__new__(cls) - new._a_vector = a_vector - new._b_vector = b_vector - new._a_count = int(a_count) - new._b_count = int(b_count) - return new - @classmethod def aligned( cls: type[GG], @@ -200,8 +184,6 @@ class Grid(Repetition): def a_count(self, val: int) -> None: if val != int(val): raise PatternError('a_count must be convertable to an int!') - if int(val) < 1: - raise PatternError(f'Repetition has too-small a_count: {val}') self._a_count = int(val) # b_count property @@ -213,12 +195,13 @@ class Grid(Repetition): def b_count(self, val: int) -> None: if val != int(val): raise PatternError('b_count must be convertable to an int!') - if int(val) < 1: - raise PatternError(f'Repetition has too-small b_count: {val}') self._b_count = int(val) @property def displacements(self) -> NDArray[numpy.float64]: + if self.b_vector is None: + return numpy.arange(self.a_count)[:, None] * self.a_vector[None, :] + aa, bb = numpy.meshgrid(numpy.arange(self.a_count), numpy.arange(self.b_count), indexing='ij') return (aa.flatten()[:, None] * self.a_vector[None, :] + bb.flatten()[:, None] * self.b_vector[None, :]) # noqa @@ -308,7 +291,7 @@ class Grid(Repetition): return False return True - def __lt__(self, other: Repetition) -> bool: + def __lt__(self, other: Repetition) -> bool: if type(self) is not type(other): return repr(type(self)) < repr(type(other)) other = cast('Grid', other) @@ -318,8 +301,12 @@ if self.a_count != other.a_count: return self.a_count < other.a_count if self.b_count != other.b_count: return self.b_count < other.b_count if not numpy.array_equal(self.a_vector, other.a_vector): return tuple(self.a_vector) < tuple(other.a_vector) + if self.b_vector is None: + return other.b_vector is not
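The `displacements` property above builds the grid as an outer sum of instance indices with the two lattice vectors, falling back to a 1D array when `b_vector` is `None`. A standalone sketch of the same computation (free function and argument names are illustrative, not the library API):

```python
import numpy

def grid_displacements(a_vector, a_count, b_vector=None, b_count=1):
    """Center-to-center displacements for a (possibly 1D) lattice of instances."""
    a_vector = numpy.asarray(a_vector, dtype=float)
    if b_vector is None:
        # 1D case: just integer multiples of the single lattice vector.
        return numpy.arange(a_count)[:, None] * a_vector[None, :]
    b_vector = numpy.asarray(b_vector, dtype=float)
    aa, bb = numpy.meshgrid(numpy.arange(a_count), numpy.arange(b_count), indexing='ij')
    return (aa.flatten()[:, None] * a_vector[None, :]
            + bb.flatten()[:, None] * b_vector[None, :])

disp = grid_displacements((10, 0), 3, (0, 5), 2)
```

With `indexing='ij'` the `a` index varies slowest, so instances are emitted column-by-column along `a_vector`.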
None + if other.b_vector is None: + return False if not numpy.array_equal(self.b_vector, other.b_vector): - return tuple(self.b_vector) < tuple(other.b_vector) + return tuple(self.b_vector) < tuple(other.b_vector) return False @@ -345,22 +332,7 @@ class Arbitrary(Repetition): @displacements.setter def displacements(self, val: ArrayLike) -> None: - try: - vala = numpy.array(val, dtype=float) - except (TypeError, ValueError) as exc: - raise PatternError('displacements must be convertible to an Nx2 ndarray') from exc - - if vala.size == 0: - self._displacements = numpy.empty((0, 2), dtype=float) - return - - if vala.ndim == 1: - if vala.size != 2: - raise PatternError('displacements must be convertible to an Nx2 ndarray') - vala = vala.reshape(1, 2) - elif vala.ndim != 2 or vala.shape[1] != 2: - raise PatternError('displacements must be convertible to an Nx2 ndarray') - + vala = numpy.array(val, dtype=float) order = numpy.lexsort(vala.T[::-1]) # sortrows self._displacements = vala[order] @@ -378,11 +350,11 @@ class Arbitrary(Repetition): return (f'') def __eq__(self, other: Any) -> bool: - if type(other) is not type(self): + if type(other) is not type(self): return False return numpy.array_equal(self.displacements, other.displacements) - def __lt__(self, other: Repetition) -> bool: + def __lt__(self, other: Repetition) -> bool: if type(self) is not type(other): return repr(type(self)) < repr(type(other)) other = cast('Arbitrary', other) @@ -419,9 +391,7 @@ class Arbitrary(Repetition): Returns: self """ - new_displacements = self.displacements.copy() - new_displacements[:, 1 - axis] *= -1 - self.displacements = new_displacements + self.displacements[:, 1 - axis] *= -1 return self def get_bounds(self) -> NDArray[numpy.float64] | None: @@ -432,8 +402,6 @@ class Arbitrary(Repetition): Returns: `[[x_min, y_min], [x_max, y_max]]` or `None` """ - if self.displacements.size == 0: - return None xy_min = numpy.min(self.displacements, axis=0) xy_max =
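The `displacements` setter above canonicalizes row order with `numpy.lexsort` ("sortrows"): sort by x first, then by y. Since `lexsort` treats its *last* key as primary, the key array is built as the reversed transpose. A minimal demonstration:

```python
import numpy

# Three 2D displacements, deliberately out of order.
pts = numpy.array([[2.0, 1.0], [0.0, 3.0], [2.0, 0.0]])

# pts.T[::-1] stacks the keys as (y_column, x_column); lexsort's last key
# (here the x column) is the primary sort key, y breaks ties.
order = numpy.lexsort(pts.T[::-1])
sorted_pts = pts[order]
```

Canonical ordering makes `__eq__` and `__lt__` on `Arbitrary` repetitions independent of the order the caller supplied the points in.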
numpy.max(self.displacements, axis=0) return numpy.array((xy_min, xy_max)) @@ -448,5 +416,6 @@ class Arbitrary(Repetition): Returns: self """ - self.displacements = self.displacements * c + self.displacements *= c return self + diff --git a/masque/shapes/__init__.py b/masque/shapes/__init__.py index ac3a14b..fd66c59 100644 --- a/masque/shapes/__init__.py +++ b/masque/shapes/__init__.py @@ -11,7 +11,6 @@ from .shape import ( from .polygon import Polygon as Polygon from .poly_collection import PolyCollection as PolyCollection -from .rect_collection import RectCollection as RectCollection from .circle import Circle as Circle from .ellipse import Ellipse as Ellipse from .arc import Arc as Arc diff --git a/masque/shapes/arc.py b/masque/shapes/arc.py index 9d5f65d..480835e 100644 --- a/masque/shapes/arc.py +++ b/masque/shapes/arc.py @@ -54,8 +54,8 @@ class Arc(PositionableImpl, Shape): val = numpy.array(val, dtype=float).flatten() if not val.size == 2: raise PatternError('Radii must have length 2') - if not val.min() > 0: - raise PatternError('Radii must be positive') + if not val.min() >= 0: + raise PatternError('Radii must be non-negative') self._radii = val @property @@ -64,8 +64,8 @@ class Arc(PositionableImpl, Shape): @radius_x.setter def radius_x(self, val: float) -> None: - if not val > 0: - raise PatternError('Radius must be positive') + if not val >= 0: + raise PatternError('Radius must be non-negative') self._radii[0] = val @property @@ -74,8 +74,8 @@ class Arc(PositionableImpl, Shape): @radius_y.setter def radius_y(self, val: float) -> None: - if not val > 0: - raise PatternError('Radius must be positive') + if not val >= 0: + raise PatternError('Radius must be non-negative') self._radii[1] = val # arc start/stop angle properties @@ -159,36 +159,27 @@ class Arc(PositionableImpl, Shape): rotation: float = 0, repetition: Repetition | None = None, annotations: annotations_t = None, + raw: bool = False, ) -> None: - self.radii = radii - self.angles = angles - 
self.width = width - self.offset = offset - self.rotation = rotation - self.repetition = repetition - self.annotations = annotations - - @classmethod - def _from_raw( - cls, - *, - radii: NDArray[numpy.float64], - angles: NDArray[numpy.float64], - width: float, - offset: NDArray[numpy.float64], - rotation: float, - annotations: annotations_t = None, - repetition: Repetition | None = None, - ) -> 'Arc': - new = cls.__new__(cls) - new._radii = radii - new._angles = angles - new._width = width - new._offset = offset - new._rotation = rotation % (2 * pi) - new._repetition = repetition - new._annotations = annotations - return new + if raw: + assert isinstance(radii, numpy.ndarray) + assert isinstance(angles, numpy.ndarray) + assert isinstance(offset, numpy.ndarray) + self._radii = radii + self._angles = angles + self._width = width + self._offset = offset + self._rotation = rotation + self._repetition = repetition + self._annotations = annotations + else: + self.radii = radii + self.angles = angles + self.width = width + self.offset = offset + self.rotation = rotation + self.repetition = repetition + self.annotations = annotations def __deepcopy__(self, memo: dict | None = None) -> 'Arc': memo = {} if memo is None else memo @@ -196,7 +187,6 @@ class Arc(PositionableImpl, Shape): new._offset = self._offset.copy() new._radii = self._radii.copy() new._angles = self._angles.copy() - new._repetition = copy.deepcopy(self._repetition, memo) new._annotations = copy.deepcopy(self._annotations) return new @@ -240,8 +230,6 @@ class Arc(PositionableImpl, Shape): if (num_vertices is None) and (max_arclen is None): raise PatternError('Max number of points and arclength left unspecified' + ' (default was also overridden)') - if max_arclen is not None and (numpy.isnan(max_arclen) or max_arclen <= 0): - raise PatternError('Max arclength must be positive and not NaN') r0, r1 = self.radii @@ -268,38 +256,29 @@ class Arc(PositionableImpl, Shape): return arc_lengths, tt wh = self.width / 
2.0 - arclen_limits: list[float] = [] - if max_arclen is not None: - arclen_limits.append(max_arclen) if num_vertices is not None: n_pts = numpy.ceil(max(self.radii + wh) / min(self.radii) * num_vertices * 100).astype(int) perimeter_inner = get_arclens(n_pts, *a_ranges[0], dr=-wh)[0].sum() perimeter_outer = get_arclens(n_pts, *a_ranges[1], dr= wh)[0].sum() implied_arclen = (perimeter_outer + perimeter_inner + self.width * 2) / num_vertices - if not (numpy.isnan(implied_arclen) or implied_arclen <= 0): - arclen_limits.append(implied_arclen) - if not arclen_limits: - raise PatternError('Arc polygonization could not determine a valid max_arclen') - max_arclen = min(arclen_limits) + max_arclen = min(implied_arclen, max_arclen if max_arclen is not None else numpy.inf) + assert max_arclen is not None def get_thetas(inner: bool) -> NDArray[numpy.float64]: """ Figure out the parameter values at which we should place vertices to meet the arclength constraint""" dr = -wh if inner else wh - n_pts = max(2, int(numpy.ceil(2 * pi * max(self.radii + dr) / max_arclen))) + n_pts = numpy.ceil(2 * pi * max(self.radii + dr) / max_arclen).astype(int) arc_lengths, thetas = get_arclens(n_pts, *a_ranges[0 if inner else 1], dr=dr) keep = [0] - start = 0 + removable = (numpy.cumsum(arc_lengths) <= max_arclen) + start = 1 while start < arc_lengths.size: - removable = (numpy.cumsum(arc_lengths[start:]) <= max_arclen) - if not removable.any(): - next_to_keep = start + 1 - else: - next_to_keep = start + numpy.where(removable)[0][-1] + 1 + next_to_keep = start + numpy.where(removable)[0][-1] # TODO: any chance we haven't sampled finely enough? 
keep.append(next_to_keep) - start = next_to_keep - + removable = (numpy.cumsum(arc_lengths[next_to_keep + 1:]) <= max_arclen) + start = next_to_keep + 1 if keep[-1] != thetas.size - 1: keep.append(thetas.size - 1) @@ -331,54 +310,81 @@ class Arc(PositionableImpl, Shape): return [poly] def get_bounds_single(self) -> NDArray[numpy.float64]: + """ + Equation for rotated ellipse is + `x = x0 + a * cos(t) * cos(rot) - b * sin(t) * sin(rot)` + `y = y0 + a * cos(t) * sin(rot) + b * sin(t) * cos(rot)` + where `t` is our parameter. + + Differentiating and solving for 0 slope wrt. `t`, we find + `tan(t) = -(b/a) * tan(rot)` for the x extrema and `tan(t) = (b/a) * cot(rot)` + for the y extrema, so that's where they are. + + If the extrema are inaccessible due to arc constraints, check the arc endpoints instead. + """ a_ranges = cast('_array2x2_t', self._angles_to_parameters()) - sin_r = numpy.sin(self.rotation) - cos_r = numpy.cos(self.rotation) - def point(rx: float, ry: float, tt: float) -> NDArray[numpy.float64]: - return numpy.array(( - rx * numpy.cos(tt) * cos_r - ry * numpy.sin(tt) * sin_r, - rx * numpy.cos(tt) * sin_r + ry * numpy.sin(tt) * cos_r, - )) - - def points_in_interval(rx: float, ry: float, a0: float, a1: float) -> list[NDArray[numpy.float64]]: - candidates = [a0, a1] - if rx != 0 and ry != 0: - tx = numpy.arctan2(-ry * sin_r, rx * cos_r) - ty = numpy.arctan2(ry * cos_r, rx * sin_r) - candidates.extend((tx, tx + pi, ty, ty + pi)) - - lo = min(a0, a1) - hi = max(a0, a1) - pts = [] - for base in candidates: - k_min = int(numpy.floor((lo - base) / (2 * pi))) - 1 - k_max = int(numpy.ceil((hi - base) / (2 * pi))) + 1 - for kk in range(k_min, k_max + 1): - tt = base + kk * 2 * pi - if lo <= tt <= hi: - pts.append(point(rx, ry, tt)) - return pts - - pts = [] + mins = [] + maxs = [] for aa, sgn in zip(a_ranges, (-1, +1), strict=True): wh = sgn * self.width / 2 rx = self.radius_x + wh ry = self.radius_y + wh - if rx == 0 or ry == 0: - pts.append(numpy.zeros(2)) - continue -
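The docstring's extremum derivation can be sanity-checked numerically: writing `x(t) = a*cos(t)*cos(rot) - b*sin(t)*sin(rot)` as a single sinusoid `R*cos(t + delta)` gives the closed-form x half-extent `R = sqrt((a*cos(rot))**2 + (b*sin(rot))**2)` for a full rotated ellipse. A small check under assumed example radii and rotation (this is a verification sketch, not library code):

```python
import numpy

a, b, phi = 3.0, 1.0, 0.7          # example semi-axes and rotation (assumed values)
tt = numpy.linspace(0, 2 * numpy.pi, 100001)

# Parametric x-coordinate of the rotated ellipse, centered at the origin.
x = a * numpy.cos(tt) * numpy.cos(phi) - b * numpy.sin(tt) * numpy.sin(phi)

# Closed-form amplitude of A*cos(t) + B*sin(t) with A = a*cos(phi), B = -b*sin(phi).
x_extent = numpy.sqrt((a * numpy.cos(phi))**2 + (b * numpy.sin(phi))**2)
```

The brute-force maximum over the fine parameter grid should agree with the closed form to high precision; the arc code only has to decide whether the extremal `t` lies inside the arc's parameter range.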
pts.extend(points_in_interval(rx, ry, aa[0], aa[1])) - all_pts = numpy.asarray(pts) + self.offset - return numpy.vstack((numpy.min(all_pts, axis=0), - numpy.max(all_pts, axis=0))) + if rx == 0 or ry == 0: + # Single point, at origin + mins.append([0, 0]) + maxs.append([0, 0]) + continue + + a0, a1 = aa + a0_offset = a0 - (a0 % (2 * pi)) + + sin_r = numpy.sin(self.rotation) + cos_r = numpy.cos(self.rotation) + sin_a = numpy.sin(aa) + cos_a = numpy.cos(aa) + + # Cutoff angles + xpt = (-self.rotation) % (2 * pi) + a0_offset + ypt = (pi / 2 - self.rotation) % (2 * pi) + a0_offset + xnt = (xpt - pi) % (2 * pi) + a0_offset + ynt = (ypt - pi) % (2 * pi) + a0_offset + + # Points along coordinate axes + rx2_inv = 1 / (rx * rx) + ry2_inv = 1 / (ry * ry) + xr = numpy.abs(cos_r * cos_r * rx2_inv + sin_r * sin_r * ry2_inv) ** -0.5 + yr = numpy.abs(-sin_r * -sin_r * rx2_inv + cos_r * cos_r * ry2_inv) ** -0.5 + + # Arc endpoints + xn, xp = sorted(rx * cos_r * cos_a - ry * sin_r * sin_a) + yn, yp = sorted(rx * sin_r * cos_a + ry * cos_r * sin_a) + + # If our arc subtends a coordinate axis, use the extremum along that axis + if a0 < xpt < a1 or a0 < xpt + 2 * pi < a1: + xp = xr + + if a0 < xnt < a1 or a0 < xnt + 2 * pi < a1: + xn = -xr + + if a0 < ypt < a1 or a0 < ypt + 2 * pi < a1: + yp = yr + + if a0 < ynt < a1 or a0 < ynt + 2 * pi < a1: + yn = -yr + + mins.append([xn, yn]) + maxs.append([xp, yp]) + return numpy.vstack((numpy.min(mins, axis=0) + self.offset, + numpy.max(maxs, axis=0) + self.offset)) def rotate(self, theta: float) -> 'Arc': self.rotation += theta return self def mirror(self, axis: int = 0) -> 'Arc': + self.offset[axis - 1] *= -1 self.rotation *= -1 self.rotation += axis * pi self.angles *= -1 @@ -411,7 +417,7 @@ class Arc(PositionableImpl, Shape): rotation %= 2 * pi width = self.width - return ((type(self), tuple(radii.tolist()), norm_angles, width / norm_value), + return ((type(self), radii, norm_angles, width / norm_value), (self.offset, scale / norm_value, 
rotation, False), lambda: Arc( radii=radii * norm_value, @@ -458,18 +464,13 @@ class Arc(PositionableImpl, Shape): `[[a_min_inner, a_max_inner], [a_min_outer, a_max_outer]]` """ aa = [] - d_angle = self.angles[1] - self.angles[0] - if abs(d_angle) >= 2 * pi: - # Full ring - return numpy.tile([0, 2 * pi], (2, 1)).astype(float) - for sgn in (-1, +1): wh = sgn * self.width / 2.0 rx = self.radius_x + wh ry = self.radius_y + wh a0, a1 = (numpy.arctan2(rx * numpy.sin(ai), ry * numpy.cos(ai)) for ai in self.angles) - sign = numpy.sign(d_angle) + sign = numpy.sign(self.angles[1] - self.angles[0]) if sign != numpy.sign(a1 - a0): a1 += sign * 2 * pi diff --git a/masque/shapes/circle.py b/masque/shapes/circle.py index d7591db..b20a681 100644 --- a/masque/shapes/circle.py +++ b/masque/shapes/circle.py @@ -50,33 +50,24 @@ class Circle(PositionableImpl, Shape): offset: ArrayLike = (0.0, 0.0), repetition: Repetition | None = None, annotations: annotations_t = None, + raw: bool = False, ) -> None: - self.radius = radius - self.offset = offset - self.repetition = repetition - self.annotations = annotations - - @classmethod - def _from_raw( - cls, - *, - radius: float, - offset: NDArray[numpy.float64], - annotations: annotations_t = None, - repetition: Repetition | None = None, - ) -> 'Circle': - new = cls.__new__(cls) - new._radius = radius - new._offset = offset - new._repetition = repetition - new._annotations = annotations - return new + if raw: + assert isinstance(offset, numpy.ndarray) + self._radius = radius + self._offset = offset + self._repetition = repetition + self._annotations = annotations + else: + self.radius = radius + self.offset = offset + self.repetition = repetition + self.annotations = annotations def __deepcopy__(self, memo: dict | None = None) -> 'Circle': memo = {} if memo is None else memo new = copy.copy(self) new._offset = self._offset.copy() - new._repetition = copy.deepcopy(self._repetition, memo) new._annotations = copy.deepcopy(self._annotations) 
return new @@ -117,7 +108,7 @@ class Circle(PositionableImpl, Shape): n += [num_vertices] if max_arclen is not None: n += [2 * pi * self.radius / max_arclen] - num_vertices = max(3, int(round(max(n)))) + num_vertices = int(round(max(n))) thetas = numpy.linspace(2 * pi, 0, num_vertices, endpoint=False) xs = numpy.cos(thetas) * self.radius ys = numpy.sin(thetas) * self.radius @@ -133,6 +124,7 @@ class Circle(PositionableImpl, Shape): return self def mirror(self, axis: int = 0) -> 'Circle': # noqa: ARG002 (axis unused) + self.offset[axis - 1] *= -1 return self def scale_by(self, c: float) -> 'Circle': diff --git a/masque/shapes/ellipse.py b/masque/shapes/ellipse.py index 52a3297..6029f2f 100644 --- a/masque/shapes/ellipse.py +++ b/masque/shapes/ellipse.py @@ -42,7 +42,7 @@ class Ellipse(PositionableImpl, Shape): @radii.setter def radii(self, val: ArrayLike) -> None: - val = numpy.array(val, dtype=float).flatten() + val = numpy.array(val).flatten() if not val.size == 2: raise PatternError('Radii must have length 2') if not val.min() >= 0: @@ -95,37 +95,28 @@ class Ellipse(PositionableImpl, Shape): rotation: float = 0, repetition: Repetition | None = None, annotations: annotations_t = None, + raw: bool = False, ) -> None: - self.radii = radii - self.offset = offset - self.rotation = rotation - self.repetition = repetition - self.annotations = annotations - - @classmethod - def _from_raw( - cls, - *, - radii: NDArray[numpy.float64], - offset: NDArray[numpy.float64], - rotation: float, - annotations: annotations_t = None, - repetition: Repetition | None = None, - ) -> Self: - new = cls.__new__(cls) - new._radii = radii - new._offset = offset - new._rotation = rotation % pi - new._repetition = repetition - new._annotations = annotations - return new + if raw: + assert isinstance(radii, numpy.ndarray) + assert isinstance(offset, numpy.ndarray) + self._radii = radii + self._offset = offset + self._rotation = rotation + self._repetition = repetition + self._annotations = 
annotations + else: + self.radii = radii + self.offset = offset + self.rotation = rotation + self.repetition = repetition + self.annotations = annotations def __deepcopy__(self, memo: dict | None = None) -> Self: memo = {} if memo is None else memo new = copy.copy(self) new._offset = self._offset.copy() new._radii = self._radii.copy() - new._repetition = copy.deepcopy(self._repetition, memo) new._annotations = copy.deepcopy(self._annotations) return new @@ -177,7 +168,7 @@ class Ellipse(PositionableImpl, Shape): n += [num_vertices] if max_arclen is not None: n += [perimeter / max_arclen] - num_vertices = max(3, int(round(max(n)))) + num_vertices = int(round(max(n))) thetas = numpy.linspace(2 * pi, 0, num_vertices, endpoint=False) sin_th, cos_th = (numpy.sin(thetas), numpy.cos(thetas)) @@ -189,19 +180,16 @@ class Ellipse(PositionableImpl, Shape): return [poly] def get_bounds_single(self) -> NDArray[numpy.float64]: - cos_r = numpy.cos(self.rotation) - sin_r = numpy.sin(self.rotation) - x_extent = numpy.sqrt((self.radius_x * cos_r) ** 2 + (self.radius_y * sin_r) ** 2) - y_extent = numpy.sqrt((self.radius_x * sin_r) ** 2 + (self.radius_y * cos_r) ** 2) - extents = numpy.array((x_extent, y_extent)) - return numpy.vstack((self.offset - extents, - self.offset + extents)) + rot_radii = numpy.dot(rotation_matrix_2d(self.rotation), self.radii) + return numpy.vstack((self.offset - rot_radii[0], + self.offset + rot_radii[1])) def rotate(self, theta: float) -> Self: self.rotation += theta return self def mirror(self, axis: int = 0) -> Self: + self.offset[axis - 1] *= -1 self.rotation *= -1 self.rotation += axis * pi return self @@ -219,7 +207,7 @@ class Ellipse(PositionableImpl, Shape): radii = self.radii[::-1] / self.radius_y scale = self.radius_y angle = (self.rotation + pi / 2) % pi - return ((type(self), tuple(radii.tolist())), + return ((type(self), radii), (self.offset, scale / norm_value, angle, False), lambda: Ellipse(radii=radii * norm_value)) diff --git 
a/masque/shapes/path.py b/masque/shapes/path.py index a1e04af..7778428 100644 --- a/masque/shapes/path.py +++ b/masque/shapes/path.py @@ -24,16 +24,7 @@ class PathCap(Enum): # # defined by path.cap_extensions def __lt__(self, other: Any) -> bool: - if self.__class__ is not other.__class__: - return self.__class__.__name__ < other.__class__.__name__ - # Order: Flush, Square, Circle, SquareCustom - order = { - PathCap.Flush: 0, - PathCap.Square: 1, - PathCap.Circle: 2, - PathCap.SquareCustom: 3, - } - return order[self] < order[other] + return self.value < other.value @functools.total_ordering @@ -88,10 +79,10 @@ class Path(Shape): def cap(self, val: PathCap) -> None: self._cap = PathCap(val) if self.cap != PathCap.SquareCustom: - self._cap_extensions = None - elif self._cap_extensions is None: + self.cap_extensions = None + elif self.cap_extensions is None: # just got set to SquareCustom - self._cap_extensions = numpy.zeros(2) + self.cap_extensions = numpy.zeros(2) # cap_extensions property @property @@ -201,50 +192,37 @@ class Path(Shape): rotation: float = 0, repetition: Repetition | None = None, annotations: annotations_t = None, + raw: bool = False, ) -> None: self._cap_extensions = None # Since .cap setter might access it - self.vertices = vertices - self.repetition = repetition - self.annotations = annotations - self._cap = cap - if cap == PathCap.SquareCustom and cap_extensions is None: - self._cap_extensions = numpy.zeros(2) + if raw: + assert isinstance(vertices, numpy.ndarray) + assert isinstance(cap_extensions, numpy.ndarray) or cap_extensions is None + self._vertices = vertices + self._repetition = repetition + self._annotations = annotations + self._width = width + self._cap = cap + self._cap_extensions = cap_extensions else: + self.vertices = vertices + self.repetition = repetition + self.annotations = annotations + self.width = width + self.cap = cap self.cap_extensions = cap_extensions - self.width = width if rotation: self.rotate(rotation) if
numpy.any(offset): self.translate(offset) - @classmethod - def _from_raw( - cls, - *, - vertices: NDArray[numpy.float64], - width: float, - cap: PathCap, - cap_extensions: NDArray[numpy.float64] | None = None, - annotations: annotations_t = None, - repetition: Repetition | None = None, - ) -> Self: - new = cls.__new__(cls) - new._vertices = vertices - new._width = width - new._cap = cap - new._cap_extensions = cap_extensions - new._repetition = repetition - new._annotations = annotations - return new - def __deepcopy__(self, memo: dict | None = None) -> 'Path': memo = {} if memo is None else memo new = copy.copy(self) new._vertices = self._vertices.copy() new._cap = copy.deepcopy(self._cap, memo) new._cap_extensions = copy.deepcopy(self._cap_extensions, memo) - new._repetition = copy.deepcopy(self._repetition, memo) new._annotations = copy.deepcopy(self._annotations) return new @@ -275,14 +253,6 @@ class Path(Shape): if self.cap_extensions is None: return True return tuple(self.cap_extensions) < tuple(other.cap_extensions) - if not numpy.array_equal(self.vertices, other.vertices): - min_len = min(self.vertices.shape[0], other.vertices.shape[0]) - eq_mask = self.vertices[:min_len] != other.vertices[:min_len] - eq_lt = self.vertices[:min_len] < other.vertices[:min_len] - eq_lt_masked = eq_lt[eq_mask] - if eq_lt_masked.size > 0: - return eq_lt_masked.flat[0] - return self.vertices.shape[0] < other.vertices.shape[0] if self.repetition != other.repetition: return rep2key(self.repetition) < rep2key(other.repetition) return annotations_lt(self.annotations, other.annotations) @@ -333,30 +303,9 @@ class Path(Shape): ) -> list['Polygon']: extensions = self._calculate_cap_extensions() - v = remove_colinear_vertices(self.vertices, closed_path=False, preserve_uturns=True) + v = remove_colinear_vertices(self.vertices, closed_path=False) dv = numpy.diff(v, axis=0) - norms = numpy.sqrt((dv * dv).sum(axis=1)) - - # Filter out zero-length segments if any remained after 
remove_colinear_vertices - valid = (norms > 1e-18) - if not numpy.all(valid): - # This shouldn't happen much if remove_colinear_vertices is working - v = v[numpy.append(valid, True)] - dv = numpy.diff(v, axis=0) - norms = norms[valid] - - if dv.shape[0] == 0: - # All vertices were the same. It's a point. - if self.width == 0: - return [Polygon(vertices=numpy.zeros((3, 2)))] # Area-less degenerate - if self.cap == PathCap.Circle: - return Circle(radius=self.width / 2, offset=v[0]).to_polygons(num_vertices=num_vertices, max_arclen=max_arclen) - if self.cap == PathCap.Square: - return [Polygon.square(side_length=self.width, offset=v[0])] - # Flush or CustomSquare - return [Polygon(vertices=numpy.zeros((3, 2)))] - - dvdir = dv / norms[:, None] + dvdir = dv / numpy.sqrt((dv * dv).sum(axis=1))[:, None] if self.width == 0: verts = numpy.vstack((v, v[::-1])) @@ -375,21 +324,11 @@ class Path(Shape): bs = v[1:-1] - v[:-2] + perp[1:] - perp[:-1] ds = v[1:-1] - v[:-2] - perp[1:] + perp[:-1] - try: - # Vectorized solve for all intersections - # solve supports broadcasting: As (N-2, 2, 2), bs (N-2, 2, 1) - rp = numpy.linalg.solve(As, bs[:, :, None])[:, 0, 0] - rn = numpy.linalg.solve(As, ds[:, :, None])[:, 0, 0] - except numpy.linalg.LinAlgError: - # Fallback to slower lstsq if some segments are parallel (singular matrix) - rp = numpy.zeros(As.shape[0]) - rn = numpy.zeros(As.shape[0]) - for ii in range(As.shape[0]): - rp[ii] = numpy.linalg.lstsq(As[ii], bs[ii, :, None], rcond=1e-12)[0][0, 0] - rn[ii] = numpy.linalg.lstsq(As[ii], ds[ii, :, None], rcond=1e-12)[0][0, 0] + rp = numpy.linalg.solve(As, bs[:, :, None])[:, 0] + rn = numpy.linalg.solve(As, ds[:, :, None])[:, 0] - intersection_p = v[:-2] + rp[:, None] * dv[:-1] + perp[:-1] - intersection_n = v[:-2] + rn[:, None] * dv[:-1] - perp[:-1] + intersection_p = v[:-2] + rp * dv[:-1] + perp[:-1] + intersection_n = v[:-2] + rn * dv[:-1] - perp[:-1] towards_perp = (dv[1:] * perp[:-1]).sum(axis=1) > 0 # path bends towards previous 
perp? # straight = (dv[1:] * perp[:-1]).sum(axis=1) == 0 # path is straight @@ -457,14 +396,12 @@ class Path(Shape): return self def mirror(self, axis: int = 0) -> 'Path': - self.vertices[:, 1 - axis] *= -1 + self.vertices[:, axis - 1] *= -1 return self def scale_by(self, c: float) -> 'Path': self.vertices *= c self.width *= c - if self.cap_extensions is not None: - self.cap_extensions *= c return self def normalized_form(self, norm_value: float) -> normalized_shape_tuple: @@ -481,22 +418,21 @@ class Path(Shape): rotated_vertices = numpy.vstack([numpy.dot(rotation_matrix_2d(-rotation), v) for v in normed_vertices]) - # Canonical ordering for open paths: pick whichever of (v) or (v[::-1]) is smaller - if tuple(rotated_vertices.flat) > tuple(rotated_vertices[::-1].flat): - reordered_vertices = rotated_vertices[::-1] - else: - reordered_vertices = rotated_vertices + # Reorder the vertices so that the one with lowest x, then y, comes first. + x_min = rotated_vertices[:, 0].argmin() + if not is_scalar(x_min): + y_min = rotated_vertices[x_min, 1].argmin() + x_min = cast('Sequence', x_min)[y_min] + reordered_vertices = numpy.roll(rotated_vertices, -x_min, axis=0) width0 = self.width / norm_value - cap_extensions0 = None if self.cap_extensions is None else tuple(float(v) / norm_value for v in self.cap_extensions) - return ((type(self), reordered_vertices.data.tobytes(), width0, self.cap, cap_extensions0), + return ((type(self), reordered_vertices.data.tobytes(), width0, self.cap), (offset, scale / norm_value, rotation, False), lambda: Path( reordered_vertices * norm_value, - width=width0 * norm_value, + width=width0 * norm_value, cap=self.cap, - cap_extensions=None if cap_extensions0 is None else tuple(v * norm_value for v in cap_extensions0), )) def clean_vertices(self) -> 'Path': @@ -526,7 +462,7 @@ class Path(Shape): Returns: self """ - self.vertices = remove_colinear_vertices(self.vertices, closed_path=False, preserve_uturns=True) + self.vertices =
remove_colinear_vertices(self.vertices, closed_path=False) return self def _calculate_cap_extensions(self) -> NDArray[numpy.float64]: diff --git a/masque/shapes/poly_collection.py b/masque/shapes/poly_collection.py index f1c840a..6048f24 100644 --- a/masque/shapes/poly_collection.py +++ b/masque/shapes/poly_collection.py @@ -34,7 +34,7 @@ class PolyCollection(Shape): _vertex_lists: NDArray[numpy.float64] """ 2D NDArray ((N+M+...) x 2) of vertices `[[xa0, ya0], [xa1, ya1], ..., [xb0, yb0], [xb1, yb1], ... ]` """ - _vertex_offsets: NDArray[numpy.integer[Any]] + _vertex_offsets: NDArray[numpy.intp] """ 1D NDArray specifying the starting offset for each polygon """ @property @@ -45,7 +45,7 @@ class PolyCollection(Shape): return self._vertex_lists @property - def vertex_offsets(self) -> NDArray[numpy.integer[Any]]: + def vertex_offsets(self) -> NDArray[numpy.intp]: """ Starting offset (in `vertex_lists`) for each polygon """ @@ -56,14 +56,12 @@ class PolyCollection(Shape): """ Iterator which provides slices which index vertex_lists """ - if self._vertex_offsets.size == 0: - return for ii, ff in zip( self._vertex_offsets, - chain(self._vertex_offsets[1:], [self._vertex_lists.shape[0]]), + chain(self._vertex_offsets[1:], (self._vertex_lists.shape[0],)), strict=True, ): - yield slice(int(ii), int(ff)) + yield slice(ii, ff) @property def polygon_vertices(self) -> Iterator[NDArray[numpy.float64]]: @@ -84,7 +82,7 @@ class PolyCollection(Shape): def set_offset(self, val: ArrayLike) -> Self: if numpy.any(val): - raise PatternError('PolyCollection offset is forced to (0, 0)') + raise PatternError('PolyCollection offset is forced to (0, 0)') return self def translate(self, offset: ArrayLike) -> Self: @@ -100,38 +98,30 @@ class PolyCollection(Shape): rotation: float = 0.0, repetition: Repetition | None = None, annotations: annotations_t = None, + raw: bool = False, ) -> None: - self._vertex_lists = numpy.asarray(vertex_lists, dtype=float) - self._vertex_offsets = numpy.asarray(vertex_offsets,
dtype=numpy.intp) - self.repetition = repetition - self.annotations = annotations + if raw: + assert isinstance(vertex_lists, numpy.ndarray) + assert isinstance(vertex_offsets, numpy.ndarray) + self._vertex_lists = vertex_lists + self._vertex_offsets = vertex_offsets + self._repetition = repetition + self._annotations = annotations + else: + self._vertex_lists = numpy.asarray(vertex_lists, dtype=float) + self._vertex_offsets = numpy.asarray(vertex_offsets, dtype=numpy.intp) + self.repetition = repetition + self.annotations = annotations if rotation: self.rotate(rotation) if numpy.any(offset): self.translate(offset) - @classmethod - def _from_raw( - cls, - *, - vertex_lists: NDArray[numpy.float64], - vertex_offsets: NDArray[numpy.integer[Any]], - annotations: annotations_t = None, - repetition: Repetition | None = None, - ) -> Self: - new = cls.__new__(cls) - new._vertex_lists = vertex_lists - new._vertex_offsets = vertex_offsets - new._repetition = repetition - new._annotations = annotations - return new - def __deepcopy__(self, memo: dict | None = None) -> Self: memo = {} if memo is None else memo new = copy.copy(self) new._vertex_lists = self._vertex_lists.copy() new._vertex_offsets = self._vertex_offsets.copy() - new._repetition = copy.deepcopy(self._repetition, memo) new._annotations = copy.deepcopy(self._annotations) return new @@ -139,7 +129,7 @@ class PolyCollection(Shape): return ( type(self) is type(other) and numpy.array_equal(self._vertex_lists, other._vertex_lists) - and numpy.array_equal(self.vertex_offsets, other.vertex_offsets) + and numpy.array_equal(self._vertex_offsets, other._vertex_offsets) and self.repetition == other.repetition and annotations_eq(self.annotations, other.annotations) ) @@ -178,9 +168,7 @@ class PolyCollection(Shape): annotations = copy.deepcopy(self.annotations), ) for vv in self.polygon_vertices] - def get_bounds_single(self) -> NDArray[numpy.float64] | None: # TODO note shape get_bounds doesn't include repetition - if 
self._vertex_lists.size == 0: - return None + def get_bounds_single(self) -> NDArray[numpy.float64]: # TODO note shape get_bounds doesn't include repetition return numpy.vstack((numpy.min(self._vertex_lists, axis=0), numpy.max(self._vertex_lists, axis=0))) @@ -191,7 +179,7 @@ class PolyCollection(Shape): return self def mirror(self, axis: int = 0) -> Self: - self._vertex_lists[:, 1 - axis] *= -1 + self._vertex_lists[:, axis - 1] *= -1 return self def scale_by(self, c: float) -> Self: @@ -222,11 +210,11 @@ class PolyCollection(Shape): # TODO: normalize mirroring? - return ((type(self), rotated_vertices.data.tobytes() + self.vertex_offsets.tobytes()), + return ((type(self), rotated_vertices.data.tobytes() + self._vertex_offsets.tobytes()), (offset, scale / norm_value, rotation, False), lambda: PolyCollection( vertex_lists=rotated_vertices * norm_value, - vertex_offsets=self.vertex_offsets.copy(), + vertex_offsets=self._vertex_offsets, ), ) diff --git a/masque/shapes/polygon.py b/masque/shapes/polygon.py index 06e5c2b..c8c3ddd 100644 --- a/masque/shapes/polygon.py +++ b/masque/shapes/polygon.py @@ -1,4 +1,4 @@ -from typing import Any, cast, TYPE_CHECKING, Self, Literal +from typing import Any, cast, TYPE_CHECKING, Self import copy import functools @@ -96,11 +96,11 @@ class Polygon(Shape): @offset.setter def offset(self, val: ArrayLike) -> None: if numpy.any(val): - raise PatternError('Polygon offset is forced to (0, 0)') + raise PatternError('Path offset is forced to (0, 0)') def set_offset(self, val: ArrayLike) -> Self: if numpy.any(val): - raise PatternError('Polygon offset is forced to (0, 0)') + raise PatternError('Path offset is forced to (0, 0)') return self def translate(self, offset: ArrayLike) -> Self: @@ -115,34 +115,26 @@ class Polygon(Shape): rotation: float = 0.0, repetition: Repetition | None = None, annotations: annotations_t = None, + raw: bool = False, ) -> None: - self.vertices = vertices - self.repetition = repetition - self.annotations = 
annotations + if raw: + assert isinstance(vertices, numpy.ndarray) + self._vertices = vertices + self._repetition = repetition + self._annotations = annotations + else: + self.vertices = vertices + self.repetition = repetition + self.annotations = annotations if rotation: self.rotate(rotation) if numpy.any(offset): self.translate(offset) - @classmethod - def _from_raw( - cls, - *, - vertices: NDArray[numpy.float64], - annotations: annotations_t = None, - repetition: Repetition | None = None, - ) -> Self: - new = cls.__new__(cls) - new._vertices = vertices - new._repetition = repetition - new._annotations = annotations - return new - def __deepcopy__(self, memo: dict | None = None) -> 'Polygon': memo = {} if memo is None else memo new = copy.copy(self) new._vertices = self._vertices.copy() - new._repetition = copy.deepcopy(self._repetition, memo) new._annotations = copy.deepcopy(self._annotations) return new @@ -329,7 +321,7 @@ class Polygon(Shape): else: raise PatternError('Two of ymin, yctr, ymax, ly must be None!') - poly = Polygon.rectangle(abs(lx), abs(ly), offset=(xctr, yctr), repetition=repetition) + poly = Polygon.rectangle(lx, ly, offset=(xctr, yctr), repetition=repetition) return poly @staticmethod @@ -402,7 +394,7 @@ class Polygon(Shape): return self def mirror(self, axis: int = 0) -> 'Polygon': - self.vertices[:, 1 - axis] *= -1 + self.vertices[:, axis - 1] *= -1 return self def scale_by(self, c: float) -> 'Polygon': @@ -425,15 +417,11 @@ class Polygon(Shape): for v in normed_vertices]) # Reorder the vertices so that the one with lowest x, then y, comes first. 
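The hunk below canonicalizes a polygon's starting vertex by lowest x, with ties broken by lowest y. Worth noting when reading it: `numpy.argmin` on a 1-D array always returns a scalar, so an `is_scalar` tie-break branch can never fire. A compact standalone sketch of the intended ordering using `lexsort` (illustrative only, not code from this diff; the helper name is made up):

```python
import numpy

def canonical_roll(vertices):
    """Roll polygon vertices so the one with lowest x (ties broken by lowest y) comes first."""
    # lexsort treats its LAST key as the primary sort key, so pass (y, x) to sort by x, then y.
    start = numpy.lexsort((vertices[:, 1], vertices[:, 0]))[0]
    return numpy.roll(vertices, -start, axis=0)

square = numpy.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.0, 0.0]])
rolled = canonical_roll(square)  # first row is now [0., 0.]
```

Because `lexsort` handles the tie-break in one call, no separate "multiple minima" branch is needed.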
- x_min_val = rotated_vertices[:, 0].min() - x_min_inds = numpy.where(rotated_vertices[:, 0] == x_min_val)[0] - if x_min_inds.size > 1: - y_min_val = rotated_vertices[x_min_inds, 1].min() - tie_breaker = numpy.where(rotated_vertices[x_min_inds, 1] == y_min_val)[0][0] - start_ind = x_min_inds[tie_breaker] - else: - start_ind = x_min_inds[0] - reordered_vertices = numpy.roll(rotated_vertices, -start_ind, axis=0) + x_min = rotated_vertices[:, 0].argmin() + if not is_scalar(x_min): + y_min = rotated_vertices[x_min, 1].argmin() + x_min = cast('Sequence', x_min)[y_min] + reordered_vertices = numpy.roll(rotated_vertices, -x_min, axis=0) # TODO: normalize mirroring? @@ -474,23 +462,3 @@ class Polygon(Shape): def __repr__(self) -> str: centroid = self.vertices.mean(axis=0) return f'' - - def boolean( - self, - other: Any, - operation: Literal['union', 'intersection', 'difference', 'xor'] = 'union', - scale: float = 1e6, - ) -> list['Polygon']: - """ - Perform a boolean operation using this polygon as the subject. - - Args: - other: Polygon, Iterable[Polygon], or raw vertices acting as the CLIP. - operation: 'union', 'intersection', 'difference', 'xor'. - scale: Scaling factor for integer conversion. - - Returns: - A list of resulting Polygons. - """ - from ..utils.boolean import boolean #noqa: PLC0415 - return boolean([self], other, operation=operation, scale=scale) diff --git a/masque/shapes/rect_collection.py b/masque/shapes/rect_collection.py deleted file mode 100644 index eaf028f..0000000 --- a/masque/shapes/rect_collection.py +++ /dev/null @@ -1,249 +0,0 @@ -from typing import Any, cast, Self -from collections.abc import Iterator -import copy -import functools - -import numpy -from numpy import pi -from numpy.typing import NDArray, ArrayLike - -from . 
import Shape, normalized_shape_tuple -from .polygon import Polygon -from ..error import PatternError -from ..repetition import Repetition -from ..utils import annotations_lt, annotations_eq, rep2key, annotations_t - - -def _normalize_rects(rects: ArrayLike) -> NDArray[numpy.float64]: - arr = numpy.asarray(rects, dtype=float) - if arr.ndim != 2 or arr.shape[1] != 4: - raise PatternError('Rectangles must be an Nx4 array of [xmin, ymin, xmax, ymax]') - if numpy.any(arr[:, 0] > arr[:, 2]) or numpy.any(arr[:, 1] > arr[:, 3]): - raise PatternError('Rectangles must satisfy xmin <= xmax and ymin <= ymax') - if arr.shape[0] <= 1: - return arr - order = numpy.lexsort((arr[:, 3], arr[:, 2], arr[:, 1], arr[:, 0])) - return arr[order] - - -def _renormalize_rects_in_place(rects: NDArray[numpy.float64]) -> None: - x0 = numpy.minimum(rects[:, 0], rects[:, 2]) - x1 = numpy.maximum(rects[:, 0], rects[:, 2]) - y0 = numpy.minimum(rects[:, 1], rects[:, 3]) - y1 = numpy.maximum(rects[:, 1], rects[:, 3]) - rects[:, 0] = x0 - rects[:, 1] = y0 - rects[:, 2] = x1 - rects[:, 3] = y1 - - -@functools.total_ordering -class RectCollection(Shape): - """ - A collection of axis-aligned rectangles, stored as an Nx4 array of - `[xmin, ymin, xmax, ymax]` rows. 
- """ - __slots__ = ( - '_rects', - '_repetition', '_annotations', - ) - - _rects: NDArray[numpy.float64] - - @property - def rects(self) -> NDArray[numpy.float64]: - return self._rects - - @rects.setter - def rects(self, val: ArrayLike) -> None: - self._rects = _normalize_rects(val) - - @property - def offset(self) -> NDArray[numpy.float64]: - return numpy.zeros(2) - - @offset.setter - def offset(self, val: ArrayLike) -> None: - if numpy.any(val): - raise PatternError('RectCollection offset is forced to (0, 0)') - - def set_offset(self, val: ArrayLike) -> Self: - if numpy.any(val): - raise PatternError('RectCollection offset is forced to (0, 0)') - return self - - def translate(self, offset: ArrayLike) -> Self: - delta = numpy.asarray(offset, dtype=float).reshape(2) - self._rects[:, [0, 2]] += delta[0] - self._rects[:, [1, 3]] += delta[1] - return self - - def __init__( - self, - rects: ArrayLike, - *, - offset: ArrayLike = (0.0, 0.0), - rotation: float = 0.0, - repetition: Repetition | None = None, - annotations: annotations_t = None, - ) -> None: - self.rects = rects - self.repetition = repetition - self.annotations = annotations - if rotation: - self.rotate(rotation) - if numpy.any(offset): - self.translate(offset) - - @classmethod - def _from_raw( - cls, - *, - rects: NDArray[numpy.float64], - annotations: annotations_t = None, - repetition: Repetition | None = None, - ) -> Self: - new = cls.__new__(cls) - new._rects = rects - new._repetition = repetition - new._annotations = annotations - return new - - @property - def polygon_vertices(self) -> Iterator[NDArray[numpy.float64]]: - for rect in self._rects: - xmin, ymin, xmax, ymax = rect - yield numpy.array([ - [xmin, ymin], - [xmin, ymax], - [xmax, ymax], - [xmax, ymin], - ], dtype=float) - - def __deepcopy__(self, memo: dict | None = None) -> Self: - memo = {} if memo is None else memo - new = copy.copy(self) - new._rects = self._rects.copy() - new._repetition = copy.deepcopy(self._repetition, memo) - 
new._annotations = copy.deepcopy(self._annotations) - return new - - def _sorted_rects(self) -> NDArray[numpy.float64]: - if self._rects.shape[0] <= 1: - return self._rects - order = numpy.lexsort((self._rects[:, 3], self._rects[:, 2], self._rects[:, 1], self._rects[:, 0])) - return self._rects[order] - - def __eq__(self, other: Any) -> bool: - return ( - type(self) is type(other) - and numpy.array_equal(self._sorted_rects(), other._sorted_rects()) - and self.repetition == other.repetition - and annotations_eq(self.annotations, other.annotations) - ) - - def __lt__(self, other: Shape) -> bool: - if type(self) is not type(other): - if repr(type(self)) != repr(type(other)): - return repr(type(self)) < repr(type(other)) - return id(type(self)) < id(type(other)) - - other = cast('RectCollection', other) - self_rects = self._sorted_rects() - other_rects = other._sorted_rects() - if not numpy.array_equal(self_rects, other_rects): - min_len = min(self_rects.shape[0], other_rects.shape[0]) - eq_mask = self_rects[:min_len] != other_rects[:min_len] - eq_lt = self_rects[:min_len] < other_rects[:min_len] - eq_lt_masked = eq_lt[eq_mask] - if eq_lt_masked.size > 0: - return bool(eq_lt_masked.flat[0]) - return self_rects.shape[0] < other_rects.shape[0] - if self.repetition != other.repetition: - return rep2key(self.repetition) < rep2key(other.repetition) - return annotations_lt(self.annotations, other.annotations) - - def to_polygons( - self, - num_vertices: int | None = None, # unused # noqa: ARG002 - max_arclen: float | None = None, # unused # noqa: ARG002 - ) -> list[Polygon]: - return [ - Polygon( - vertices=vertices, - repetition=copy.deepcopy(self.repetition), - annotations=copy.deepcopy(self.annotations), - ) - for vertices in self.polygon_vertices - ] - - def get_bounds_single(self) -> NDArray[numpy.float64] | None: - if self._rects.size == 0: - return None - mins = self._rects[:, :2].min(axis=0) - maxs = self._rects[:, 2:].max(axis=0) - return numpy.vstack((mins, maxs)) 
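The `rotate` method below materializes all four corners of every rectangle and then re-derives the min/max bounds. For a pure quarter turn on axis-aligned boxes, the bounds can also be rotated directly, since 90 degrees CCW maps (x, y) to (-y, x). A minimal sketch of that shortcut (the identity is standard geometry, but the helper itself is hypothetical, not part of this diff):

```python
import numpy

def rotate_rects_90ccw(rects):
    """Rotate [xmin, ymin, xmax, ymax] rows a quarter turn CCW: (x, y) -> (-y, x)."""
    xmin, ymin, xmax, ymax = rects.T
    # The negated y-extent becomes the new x-extent, and the old x-extent becomes y.
    return numpy.column_stack((-ymax, xmin, -ymin, xmax))

box = numpy.array([[0.0, 1.0, 2.0, 3.0]])
rotate_rects_90ccw(box)  # -> [[-3., 0., -1., 2.]]
```

The corner-based version in the diff is more general (it handles all four Manhattan turns with one code path), at the cost of the intermediate `Nx4x2` corner array.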
- - def rotate(self, theta: float) -> Self: - quarter_turns = int(numpy.rint(theta / (pi / 2))) - if not numpy.isclose(theta, quarter_turns * (pi / 2)): - raise PatternError('RectCollection only supports Manhattan rotations') - turns = quarter_turns % 4 - if turns == 0 or self._rects.size == 0: - return self - - corners = numpy.stack(( - self._rects[:, [0, 1]], - self._rects[:, [0, 3]], - self._rects[:, [2, 3]], - self._rects[:, [2, 1]], - ), axis=1) - flat = corners.reshape(-1, 2) - if turns == 1: - rotated = numpy.column_stack((-flat[:, 1], flat[:, 0])) - elif turns == 2: - rotated = -flat - else: - rotated = numpy.column_stack((flat[:, 1], -flat[:, 0])) - corners = rotated.reshape(corners.shape) - self._rects[:, 0] = corners[:, :, 0].min(axis=1) - self._rects[:, 1] = corners[:, :, 1].min(axis=1) - self._rects[:, 2] = corners[:, :, 0].max(axis=1) - self._rects[:, 3] = corners[:, :, 1].max(axis=1) - return self - - def mirror(self, axis: int = 0) -> Self: - if axis not in (0, 1): - raise PatternError('Axis must be 0 or 1') - if axis == 0: - self._rects[:, [1, 3]] *= -1 - else: - self._rects[:, [0, 2]] *= -1 - _renormalize_rects_in_place(self._rects) - return self - - def scale_by(self, c: float) -> Self: - self._rects *= c - _renormalize_rects_in_place(self._rects) - return self - - def normalized_form(self, norm_value: float) -> normalized_shape_tuple: - rects = self._sorted_rects() - centers = 0.5 * (rects[:, :2] + rects[:, 2:]) - offset = centers.mean(axis=0) - zeroed = rects.copy() - zeroed[:, [0, 2]] -= offset[0] - zeroed[:, [1, 3]] -= offset[1] - normed = zeroed / norm_value - return ( - (type(self), normed.data.tobytes()), - (offset, 1.0, 0.0, False), - lambda: RectCollection(rects=normed * norm_value), - ) - - def __repr__(self) -> str: - if self._rects.size == 0: - return '' - centers = 0.5 * (self._rects[:, :2] + self._rects[:, 2:]) - centroid = centers.mean(axis=0) - return f'' diff --git a/masque/shapes/shape.py b/masque/shapes/shape.py index 
efc0859..90bca2b 100644 --- a/masque/shapes/shape.py +++ b/masque/shapes/shape.py @@ -6,8 +6,8 @@ import numpy from numpy.typing import NDArray, ArrayLike from ..traits import ( - Copyable, Scalable, FlippableImpl, - PivotableImpl, RepeatableImpl, AnnotatableImpl, + Rotatable, Mirrorable, Copyable, Scalable, + Positionable, PivotableImpl, RepeatableImpl, AnnotatableImpl, ) if TYPE_CHECKING: @@ -26,9 +26,8 @@ normalized_shape_tuple = tuple[ DEFAULT_POLY_NUM_VERTICES = 24 -class Shape(FlippableImpl, PivotableImpl, RepeatableImpl, AnnotatableImpl, - Copyable, Scalable, - metaclass=ABCMeta): +class Shape(Positionable, Rotatable, Mirrorable, Copyable, Scalable, + PivotableImpl, RepeatableImpl, AnnotatableImpl, metaclass=ABCMeta): """ Class specifying functions common to all shapes. """ @@ -74,7 +73,7 @@ class Shape(FlippableImpl, PivotableImpl, RepeatableImpl, AnnotatableImpl, pass @abstractmethod - def normalized_form(self, norm_value: float) -> normalized_shape_tuple: + def normalized_form(self, norm_value: int) -> normalized_shape_tuple: """ Writes the shape in a standardized notation, with offset, scale, and rotation information separated out from the remaining values. @@ -121,7 +120,7 @@ class Shape(FlippableImpl, PivotableImpl, RepeatableImpl, AnnotatableImpl, Returns: List of `Polygon` objects with grid-aligned edges. """ - from . import Polygon #noqa: PLC0415 + from . 
import Polygon gx = numpy.unique(grid_x) gy = numpy.unique(grid_y) @@ -139,24 +138,22 @@ class Shape(FlippableImpl, PivotableImpl, RepeatableImpl, AnnotatableImpl, for v, v_next in zip(p_verts, numpy.roll(p_verts, -1, axis=0), strict=True): dv = v_next - v - # Find x-index bounds for the line + # Find x-index bounds for the line # TODO: fix this and err_xmin/xmax for grids smaller than the line / shape gxi_range = numpy.digitize([v[0], v_next[0]], gx) - gxi_min = int(numpy.min(gxi_range - 1).clip(0, len(gx) - 1)) - gxi_max = int(numpy.max(gxi_range).clip(0, len(gx))) + gxi_min = numpy.min(gxi_range - 1).clip(0, len(gx) - 1) + gxi_max = numpy.max(gxi_range).clip(0, len(gx)) - if gxi_min < len(gx) - 1: - err_xmin = (min(v[0], v_next[0]) - gx[gxi_min]) / (gx[gxi_min + 1] - gx[gxi_min]) - if err_xmin >= 0.5: - gxi_min += 1 + err_xmin = (min(v[0], v_next[0]) - gx[gxi_min]) / (gx[gxi_min + 1] - gx[gxi_min]) + err_xmax = (max(v[0], v_next[0]) - gx[gxi_max - 1]) / (gx[gxi_max] - gx[gxi_max - 1]) - if gxi_max > 0 and gxi_max < len(gx): - err_xmax = (max(v[0], v_next[0]) - gx[gxi_max - 1]) / (gx[gxi_max] - gx[gxi_max - 1]) - if err_xmax >= 0.5: - gxi_max += 1 + if err_xmin >= 0.5: + gxi_min += 1 + if err_xmax >= 0.5: + gxi_max += 1 if abs(dv[0]) < 1e-20: # Vertical line, don't calculate slope - xi = [gxi_min, max(gxi_min, gxi_max - 1)] + xi = [gxi_min, gxi_max - 1] ys = numpy.array([v[1], v_next[1]]) yi = numpy.digitize(ys, gy).clip(1, len(gy) - 1) err_y = (ys - gy[yi]) / (gy[yi] - gy[yi - 1]) @@ -252,9 +249,9 @@ class Shape(FlippableImpl, PivotableImpl, RepeatableImpl, AnnotatableImpl, Returns: List of `Polygon` objects with grid-aligned edges. """ - from . import Polygon #noqa: PLC0415 - import skimage.measure #noqa: PLC0415 - import float_raster #noqa: PLC0415 + from . 
import Polygon + import skimage.measure # type: ignore + import float_raster grx = numpy.unique(grid_x) gry = numpy.unique(grid_y) diff --git a/masque/shapes/text.py b/masque/shapes/text.py index c078879..78632f6 100644 --- a/masque/shapes/text.py +++ b/masque/shapes/text.py @@ -70,48 +70,31 @@ class Text(PositionableImpl, RotatableImpl, Shape): *, offset: ArrayLike = (0.0, 0.0), rotation: float = 0.0, - mirrored: bool = False, repetition: Repetition | None = None, annotations: annotations_t = None, + raw: bool = False, ) -> None: - self.offset = offset - self.string = string - self.height = height - self.rotation = rotation - self.mirrored = mirrored - self.repetition = repetition - self.annotations = annotations + if raw: + assert isinstance(offset, numpy.ndarray) + self._offset = offset + self._string = string + self._height = height + self._rotation = rotation + self._repetition = repetition + self._annotations = annotations + else: + self.offset = offset + self.string = string + self.height = height + self.rotation = rotation + self.repetition = repetition + self.annotations = annotations self.font_path = font_path - @classmethod - def _from_raw( - cls, - *, - string: str, - height: float, - font_path: str, - offset: NDArray[numpy.float64], - rotation: float, - mirrored: bool, - annotations: annotations_t = None, - repetition: Repetition | None = None, - ) -> Self: - new = cls.__new__(cls) - new._offset = offset - new._string = string - new._height = height - new._rotation = rotation % (2 * pi) - new._mirrored = mirrored - new._repetition = repetition - new._annotations = annotations - new.font_path = font_path - return new - def __deepcopy__(self, memo: dict | None = None) -> Self: memo = {} if memo is None else memo new = copy.copy(self) new._offset = self._offset.copy() - new._repetition = copy.deepcopy(self._repetition, memo) new._annotations = copy.deepcopy(self._annotations) return new @@ -122,7 +105,6 @@ class Text(PositionableImpl, RotatableImpl, 
Shape): and self.string == other.string and self.height == other.height and self.font_path == other.font_path - and self.mirrored == other.mirrored and self.rotation == other.rotation and self.repetition == other.repetition and annotations_eq(self.annotations, other.annotations) @@ -142,8 +124,6 @@ class Text(PositionableImpl, RotatableImpl, Shape): return self.font_path < other.font_path if not numpy.array_equal(self.offset, other.offset): return tuple(self.offset) < tuple(other.offset) - if self.mirrored != other.mirrored: - return self.mirrored < other.mirrored if self.rotation != other.rotation: return self.rotation < other.rotation if self.repetition != other.repetition: @@ -166,7 +146,7 @@ class Text(PositionableImpl, RotatableImpl, Shape): if self.mirrored: poly.mirror() poly.scale_by(self.height) - poly.translate(self.offset + [total_advance, 0]) + poly.offset = self.offset + [total_advance, 0] poly.rotate_around(self.offset, self.rotation) all_polygons += [poly] @@ -191,25 +171,22 @@ class Text(PositionableImpl, RotatableImpl, Shape): (self.offset, self.height / norm_value, rotation, bool(self.mirrored)), lambda: Text( string=self.string, - height=norm_value, + height=self.height * norm_value, font_path=self.font_path, rotation=rotation, ).mirror2d(across_x=self.mirrored), ) - def get_bounds_single(self) -> NDArray[numpy.float64] | None: + def get_bounds_single(self) -> NDArray[numpy.float64]: # rotation makes this a huge pain when using slot.advance and glyph.bbox(), so # just convert to polygons instead polys = self.to_polygons() - if not polys: - return None - pbounds = numpy.full((len(polys), 2, 2), nan) for pp, poly in enumerate(polys): pbounds[pp] = poly.get_bounds_nonempty() bounds = numpy.vstack(( - numpy.min(pbounds[:, 0, :], axis=0), - numpy.max(pbounds[:, 1, :], axis=0), + numpy.min(pbounds[: 0, :], axis=0), + numpy.max(pbounds[: 1, :], axis=0), )) return bounds @@ -225,8 +202,8 @@ def get_char_as_polygons( char: str, resolution: float = 48 * 
64, ) -> tuple[list[NDArray[numpy.float64]], float]: - from freetype import Face # type: ignore #noqa: PLC0415 - from matplotlib.path import Path # type: ignore #noqa: PLC0415 + from freetype import Face # type: ignore + from matplotlib.path import Path # type: ignore """ Get a list of polygons representing a single character. diff --git a/masque/test/__init__.py b/masque/test/__init__.py deleted file mode 100644 index e02b636..0000000 --- a/masque/test/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -""" -Tests (run with `python3 -m pytest -rxPXs | tee results.txt`) -""" diff --git a/masque/test/conftest.py b/masque/test/conftest.py deleted file mode 100644 index 3116ee2..0000000 --- a/masque/test/conftest.py +++ /dev/null @@ -1,13 +0,0 @@ -""" - -Test fixtures - -""" - -# ruff: noqa: ARG001 -from typing import Any -import numpy - - -FixtureRequest = Any -PRNG = numpy.random.RandomState(12345) diff --git a/masque/test/test_abstract.py b/masque/test/test_abstract.py deleted file mode 100644 index d2f54ed..0000000 --- a/masque/test/test_abstract.py +++ /dev/null @@ -1,85 +0,0 @@ -from numpy.testing import assert_allclose -from numpy import pi - -from ..abstract import Abstract -from ..ports import Port -from ..ref import Ref - - -def test_abstract_init() -> None: - ports = {"A": Port((0, 0), 0), "B": Port((10, 0), pi)} - abs_obj = Abstract("test", ports) - assert abs_obj.name == "test" - assert len(abs_obj.ports) == 2 - assert abs_obj.ports["A"] is not ports["A"] # Should be deepcopied - - -def test_abstract_transform() -> None: - abs_obj = Abstract("test", {"A": Port((10, 0), 0)}) - # Rotate 90 deg around (0,0) - abs_obj.rotate_around((0, 0), pi / 2) - # (10, 0) rot 0 -> (0, 10) rot pi/2 - assert_allclose(abs_obj.ports["A"].offset, [0, 10], atol=1e-10) - assert abs_obj.ports["A"].rotation is not None - assert_allclose(abs_obj.ports["A"].rotation, pi / 2, atol=1e-10) - - # Mirror across x axis (axis 0): flips y-offset - abs_obj.mirror(0) - # (0, 10) mirrored(0) -> (0, 
-10) - # rotation pi/2 mirrored(0) -> -pi/2 == 3pi/2 - assert_allclose(abs_obj.ports["A"].offset, [0, -10], atol=1e-10) - assert abs_obj.ports["A"].rotation is not None - assert_allclose(abs_obj.ports["A"].rotation, 3 * pi / 2, atol=1e-10) - - -def test_abstract_ref_transform() -> None: - abs_obj = Abstract("test", {"A": Port((10, 0), 0)}) - ref = Ref(offset=(100, 100), rotation=pi / 2, mirrored=True) - - # Apply ref transform - abs_obj.apply_ref_transform(ref) - # Ref order: mirror, rotate, scale, translate - - # 1. mirror (across x: y -> -y) - # (10, 0) rot 0 -> (10, 0) rot 0 - - # 2. rotate pi/2 around (0,0) - # (10, 0) rot 0 -> (0, 10) rot pi/2 - - # 3. translate (100, 100) - # (0, 10) -> (100, 110) - - assert_allclose(abs_obj.ports["A"].offset, [100, 110], atol=1e-10) - assert abs_obj.ports["A"].rotation is not None - assert_allclose(abs_obj.ports["A"].rotation, pi / 2, atol=1e-10) - - -def test_abstract_ref_transform_scales_offsets() -> None: - abs_obj = Abstract("test", {"A": Port((10, 0), 0)}) - ref = Ref(offset=(100, 100), rotation=pi / 2, mirrored=True, scale=2) - - abs_obj.apply_ref_transform(ref) - - assert_allclose(abs_obj.ports["A"].offset, [100, 120], atol=1e-10) - assert abs_obj.ports["A"].rotation is not None - assert_allclose(abs_obj.ports["A"].rotation, pi / 2, atol=1e-10) - - -def test_abstract_undo_transform() -> None: - abs_obj = Abstract("test", {"A": Port((100, 110), pi / 2)}) - ref = Ref(offset=(100, 100), rotation=pi / 2, mirrored=True) - - abs_obj.undo_ref_transform(ref) - assert_allclose(abs_obj.ports["A"].offset, [10, 0], atol=1e-10) - assert abs_obj.ports["A"].rotation is not None - assert_allclose(abs_obj.ports["A"].rotation, 0, atol=1e-10) - - -def test_abstract_undo_transform_scales_offsets() -> None: - abs_obj = Abstract("test", {"A": Port((100, 120), pi / 2)}) - ref = Ref(offset=(100, 100), rotation=pi / 2, mirrored=True, scale=2) - - abs_obj.undo_ref_transform(ref) - assert_allclose(abs_obj.ports["A"].offset, [10, 0], atol=1e-10) 
- assert abs_obj.ports["A"].rotation is not None - assert_allclose(abs_obj.ports["A"].rotation, 0, atol=1e-10) diff --git a/masque/test/test_advanced_routing.py b/masque/test/test_advanced_routing.py deleted file mode 100644 index 0008172..0000000 --- a/masque/test/test_advanced_routing.py +++ /dev/null @@ -1,77 +0,0 @@ -import pytest -from numpy.testing import assert_equal -from numpy import pi - -from ..builder import Pather -from ..builder.tools import PathTool -from ..library import Library -from ..ports import Port - - -@pytest.fixture -def advanced_pather() -> tuple[Pather, PathTool, Library]: - lib = Library() - # Simple PathTool: 2um width on layer (1,0) - tool = PathTool(layer=(1, 0), width=2, ptype="wire") - p = Pather(lib, tools=tool, auto_render=True, auto_render_append=False) - return p, tool, lib - - -def test_path_into_straight(advanced_pather: tuple[Pather, PathTool, Library]) -> None: - p, _tool, _lib = advanced_pather - # Facing ports - p.ports["src"] = Port((0, 0), 0, ptype="wire") # Facing East (into device) - # Forward (+pi relative to port) is West (-x). - # Put destination at (-20, 0) pointing East (pi). - p.ports["dst"] = Port((-20, 0), pi, ptype="wire") - - p.trace_into("src", "dst") - - assert "src" not in p.ports - assert "dst" not in p.ports - # Pather._traceL adds a Reference to the generated pattern - assert len(p.pattern.refs) == 1 - - -def test_path_into_bend(advanced_pather: tuple[Pather, PathTool, Library]) -> None: - p, _tool, _lib = advanced_pather - # Source at (0,0) rot 0 (facing East). Forward is West (-x). - p.ports["src"] = Port((0, 0), 0, ptype="wire") - # Destination at (-20, -20) rot pi (facing West). Forward is East (+x). - # Wait, src forward is -x. dst is at -20, -20. - # To use a single bend, dst should be at some -x, -y and its rotation should be 3pi/2 (facing South). - # Forward for South is North (+y). 
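The orientation comments in these routing tests all follow the same convention: a route leaves a port along `rotation + pi` (a port "faces" along its rotation, and traffic departs in the opposite direction). A throwaway helper, not part of masque, makes that arithmetic explicit:

```python
from numpy import pi, cos, sin

def forward_direction(port_rotation):
    """Unit vector along which a route leaves a port: rotation + pi, per the test comments."""
    theta = port_rotation + pi
    # Round away floating-point dust so axis-aligned cases come out as exact unit vectors.
    return (round(float(cos(theta)), 12), round(float(sin(theta)), 12))

forward_direction(0)           # port faces +x (East); the route heads -x (West)
forward_direction(3 * pi / 2)  # port faces -y (South); the route heads +y (North)
```

This matches the inline reasoning, e.g. "rot 0, facing East ... Forward is West (-x)" and "3pi/2 (facing South). Forward ... is North (+y)".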
- p.ports["dst"] = Port((-20, -20), 3 * pi / 2, ptype="wire") - - p.trace_into("src", "dst") - - assert "src" not in p.ports - assert "dst" not in p.ports - # `trace_into()` now batches its internal legs before auto-rendering so the operation - # can roll back cleanly on later failures. - assert len(p.pattern.refs) == 1 - - -def test_path_into_sbend(advanced_pather: tuple[Pather, PathTool, Library]) -> None: - p, _tool, _lib = advanced_pather - # Facing but offset ports - p.ports["src"] = Port((0, 0), 0, ptype="wire") # Forward is West (-x) - p.ports["dst"] = Port((-20, -10), pi, ptype="wire") # Facing East (rot pi) - - p.trace_into("src", "dst") - - assert "src" not in p.ports - assert "dst" not in p.ports - - -def test_path_into_thru(advanced_pather: tuple[Pather, PathTool, Library]) -> None: - p, _tool, _lib = advanced_pather - p.ports["src"] = Port((0, 0), 0, ptype="wire") - p.ports["dst"] = Port((-20, 0), pi, ptype="wire") - p.ports["other"] = Port((10, 10), 0) - - p.trace_into("src", "dst", thru="other") - - assert "src" in p.ports - assert_equal(p.ports["src"].offset, [10, 10]) - assert "other" not in p.ports diff --git a/masque/test/test_autotool.py b/masque/test/test_autotool.py deleted file mode 100644 index e03994e..0000000 --- a/masque/test/test_autotool.py +++ /dev/null @@ -1,81 +0,0 @@ -import pytest -from numpy.testing import assert_allclose -from numpy import pi - -from ..builder import Pather -from ..builder.tools import AutoTool -from ..library import Library -from ..pattern import Pattern -from ..ports import Port - - -def make_straight(length: float, width: float = 2, ptype: str = "wire") -> Pattern: - pat = Pattern() - pat.rect((1, 0), xmin=0, xmax=length, yctr=0, ly=width) - pat.ports["in"] = Port((0, 0), 0, ptype=ptype) - pat.ports["out"] = Port((length, 0), pi, ptype=ptype) - return pat - - -@pytest.fixture -def autotool_setup() -> tuple[Pather, AutoTool, Library]: - lib = Library() - - # Define a simple bend - bend_pat = Pattern() - # 2x2 
bend from (0,0) rot 0 to (2, -2) rot pi/2 (Clockwise) - bend_pat.ports["in"] = Port((0, 0), 0, ptype="wire") - bend_pat.ports["out"] = Port((2, -2), pi / 2, ptype="wire") - lib["bend"] = bend_pat - lib.abstract("bend") - - # Define a transition (e.g., via) - via_pat = Pattern() - via_pat.ports["m1"] = Port((0, 0), 0, ptype="wire_m1") - via_pat.ports["m2"] = Port((1, 0), pi, ptype="wire_m2") - lib["via"] = via_pat - via_abs = lib.abstract("via") - - tool_m1 = AutoTool( - straights=[ - AutoTool.Straight(ptype="wire_m1", fn=lambda length: make_straight(length, ptype="wire_m1"), in_port_name="in", out_port_name="out") - ], - bends=[], - sbends=[], - transitions={("wire_m2", "wire_m1"): AutoTool.Transition(via_abs, "m2", "m1")}, - default_out_ptype="wire_m1", - ) - - p = Pather(lib, tools=tool_m1) - # Start with an m2 port - p.ports["start"] = Port((0, 0), pi, ptype="wire_m2") - - return p, tool_m1, lib - - -def test_autotool_transition(autotool_setup: tuple[Pather, AutoTool, Library]) -> None: - p, _tool, _lib = autotool_setup - - # Route m1 from an m2 port. Should trigger via. - # length 10. Via length is 1. So straight m1 should be 9. - p.straight("start", 10) - - # Start at (0,0) rot pi (facing West). - # Forward (+pi relative to port) is East (+x). - # Via: m2(1,0)pi -> m1(0,0)0. - # Plug via m2 into start(0,0)pi: transformation rot=mod(pi-pi-pi, 2pi)=pi. - # rotate via by pi: m2 at (0,0), m1 at (-1, 0) rot pi. - # Then straight m1 of length 9 from (-1, 0) rot pi -> ends at (8, 0) rot pi. - # Wait, (length, 0) relative to (-1, 0) rot pi: - # transform (9, 0) by pi: (-9, 0). - # (-1, 0) + (-9, 0) = (-10, 0)? No. - # Let's re-calculate. - # start (0,0) rot pi. Direction East. - # via m2 is at (0,0), m1 is at (1,0). - # When via is plugged into start: m2 goes to (0,0). - # since start is pi and m2 is pi, rotation is 0. - # so via m1 is at (1,0) rot 0. - # then straight m1 length 9 from (1,0) rot 0: ends at (10, 0) rot 0. 
- - assert_allclose(p.ports["start"].offset, [10, 0], atol=1e-10) - assert p.ports["start"].ptype == "wire_m1" diff --git a/masque/test/test_autotool_refactor.py b/masque/test/test_autotool_refactor.py deleted file mode 100644 index 3109447..0000000 --- a/masque/test/test_autotool_refactor.py +++ /dev/null @@ -1,306 +0,0 @@ -import pytest -from numpy.testing import assert_allclose -from numpy import pi - -from masque.builder.tools import AutoTool -from masque.pattern import Pattern -from masque.ports import Port -from masque.library import Library -from masque.builder.pather import Pather - -def make_straight(length, width=2, ptype="wire"): - pat = Pattern() - pat.rect((1, 0), xmin=0, xmax=length, yctr=0, ly=width) - pat.ports["A"] = Port((0, 0), 0, ptype=ptype) - pat.ports["B"] = Port((length, 0), pi, ptype=ptype) - return pat - -def make_bend(R, width=2, ptype="wire", clockwise=True): - pat = Pattern() - # 90 degree arc approximation (just two rects for start and end) - if clockwise: - # (0,0) rot 0 to (R, -R) rot pi/2 - pat.rect((1, 0), xmin=0, xmax=R, yctr=0, ly=width) - pat.rect((1, 0), xctr=R, lx=width, ymin=-R, ymax=0) - pat.ports["A"] = Port((0, 0), 0, ptype=ptype) - pat.ports["B"] = Port((R, -R), pi/2, ptype=ptype) - else: - # (0,0) rot 0 to (R, R) rot -pi/2 - pat.rect((1, 0), xmin=0, xmax=R, yctr=0, ly=width) - pat.rect((1, 0), xctr=R, lx=width, ymin=0, ymax=R) - pat.ports["A"] = Port((0, 0), 0, ptype=ptype) - pat.ports["B"] = Port((R, R), -pi/2, ptype=ptype) - return pat - -@pytest.fixture -def multi_bend_tool(): - lib = Library() - - # Bend 1: R=2 - lib["b1"] = make_bend(2, ptype="wire") - b1_abs = lib.abstract("b1") - # Bend 2: R=5 - lib["b2"] = make_bend(5, ptype="wire") - b2_abs = lib.abstract("b2") - - tool = AutoTool( - straights=[ - # Straight 1: only for length < 10 - AutoTool.Straight(ptype="wire", fn=make_straight, in_port_name="A", out_port_name="B", length_range=(0, 10)), - # Straight 2: for length >= 10 - AutoTool.Straight(ptype="wire", 
fn=lambda l: make_straight(l, width=4), in_port_name="A", out_port_name="B", length_range=(10, 1e8)) - ], - bends=[ - AutoTool.Bend(b1_abs, "A", "B", clockwise=True, mirror=True), - AutoTool.Bend(b2_abs, "A", "B", clockwise=True, mirror=True) - ], - sbends=[], - transitions={}, - default_out_ptype="wire" - ) - return tool, lib - - -@pytest.fixture -def asymmetric_transition_tool() -> AutoTool: - lib = Library() - - bend_pat = Pattern() - bend_pat.ports["in"] = Port((0, 0), 0, ptype="core") - bend_pat.ports["out"] = Port((2, -2), pi / 2, ptype="core") - lib["core_bend"] = bend_pat - - trans_pat = Pattern() - trans_pat.ports["CORE"] = Port((0, 0), 0, ptype="core") - trans_pat.ports["MID"] = Port((3, 1), pi, ptype="mid") - lib["core_mid"] = trans_pat - - return AutoTool( - straights=[ - AutoTool.Straight( - ptype="core", - fn=lambda length: make_straight(length, ptype="core"), - in_port_name="A", - out_port_name="B", - length_range=(0, 3), - ), - AutoTool.Straight( - ptype="mid", - fn=lambda length: make_straight(length, ptype="mid"), - in_port_name="A", - out_port_name="B", - length_range=(0, 1e8), - ), - ], - bends=[ - AutoTool.Bend(lib.abstract("core_bend"), "in", "out", clockwise=True, mirror=True), - ], - sbends=[], - transitions={ - ("mid", "core"): AutoTool.Transition(lib.abstract("core_mid"), "MID", "CORE"), - }, - default_out_ptype="core", - ).add_complementary_transitions() - - -def assert_trace_matches_plan(plan_port: Port, tree: Library, port_names: tuple[str, str] = ("A", "B")) -> None: - pat = tree.top_pattern() - out_port = pat[port_names[1]] - dxy, rot = pat[port_names[0]].measure_travel(out_port) - assert_allclose(dxy, plan_port.offset) - assert rot is not None - assert plan_port.rotation is not None - assert_allclose(rot, plan_port.rotation) - assert out_port.ptype == plan_port.ptype - - -def test_autotool_planL_selection(multi_bend_tool) -> None: - tool, _ = multi_bend_tool - - # Small length: should pick straight 1 and bend 1 (R=2) - # L = straight 
+ R. If L=5, straight=3. - p, data = tool.planL(True, 5) - assert data.straight.length_range == (0, 10) - assert data.straight_length == 3 - assert data.bend.abstract.name == "b1" - assert_allclose(p.offset, [5, 2]) - - # Large length: should pick straight 2 and bend 1 (R=2) - # If L=15, straight=13. - p, data = tool.planL(True, 15) - assert data.straight.length_range == (10, 1e8) - assert data.straight_length == 13 - assert_allclose(p.offset, [15, 2]) - -def test_autotool_planU_consistency(multi_bend_tool) -> None: - tool, lib = multi_bend_tool - - # length=10, jog=20. - # U-turn: Straight1 -> Bend1 -> Straight_mid -> Straight3(0) -> Bend2 - # X = L1_total - R2 = length - # Y = R1 + L2_mid + R2 = jog - - p, data = tool.planU(20, length=10) - assert data.ldata0.straight_length == 7 - assert data.ldata0.bend.abstract.name == "b2" - assert data.l2_length == 13 - assert data.ldata1.straight_length == 0 - assert data.ldata1.bend.abstract.name == "b1" - - -def test_autotool_traceU_matches_plan_with_asymmetric_transition(asymmetric_transition_tool: AutoTool) -> None: - tool = asymmetric_transition_tool - - plan_port, data = tool.planU(12, length=0, in_ptype="core") - - assert data.ldata1.in_transition is not None - assert data.ldata1.b_transition is not None - - tree = tool.traceU(12, length=0, in_ptype="core") - assert_trace_matches_plan(plan_port, tree) - - -def test_autotool_planS_double_L(multi_bend_tool) -> None: - tool, lib = multi_bend_tool - - # length=20, jog=10. 
S-bend (ccw1, cw2) - # X = L1_total + R2 = length - # Y = R1 + L2_mid + R2 = jog - - p, data = tool.planS(20, 10) - assert_allclose(p.offset, [20, 10]) - assert_allclose(p.rotation, pi) - - assert data.ldata0.straight_length == 16 - assert data.ldata1.straight_length == 0 - assert data.l2_length == 6 - - -def test_autotool_traceS_double_l_matches_plan_with_asymmetric_transition(asymmetric_transition_tool: AutoTool) -> None: - tool = asymmetric_transition_tool - - plan_port, data = tool.planS(4, 10, in_ptype="core") - - assert isinstance(data, AutoTool.UData) - assert data.ldata1.in_transition is not None - assert data.ldata1.b_transition is not None - - tree = tool.traceS(4, 10, in_ptype="core") - assert_trace_matches_plan(plan_port, tree) - - -def test_autotool_planS_pure_sbend_with_transition_dx() -> None: - lib = Library() - - def make_straight(length: float) -> Pattern: - pat = Pattern() - pat.ports["A"] = Port((0, 0), 0, ptype="core") - pat.ports["B"] = Port((length, 0), pi, ptype="core") - return pat - - def make_sbend(jog: float) -> Pattern: - pat = Pattern() - pat.ports["A"] = Port((0, 0), 0, ptype="core") - pat.ports["B"] = Port((10, jog), pi, ptype="core") - return pat - - trans_pat = Pattern() - trans_pat.ports["EXT"] = Port((0, 0), 0, ptype="ext") - trans_pat.ports["CORE"] = Port((5, 0), pi, ptype="core") - lib["xin"] = trans_pat - - tool = AutoTool( - straights=[ - AutoTool.Straight( - ptype="core", - fn=make_straight, - in_port_name="A", - out_port_name="B", - length_range=(1, 1e8), - ) - ], - bends=[], - sbends=[ - AutoTool.SBend( - ptype="core", - fn=make_sbend, - in_port_name="A", - out_port_name="B", - jog_range=(0, 1e8), - ) - ], - transitions={ - ("ext", "core"): AutoTool.Transition(lib.abstract("xin"), "EXT", "CORE"), - }, - default_out_ptype="core", - ) - - p, data = tool.planS(15, 4, in_ptype="ext") - - assert_allclose(p.offset, [15, 4]) - assert_allclose(p.rotation, pi) - assert data.straight_length == 0 - assert data.jog_remaining == 4 - 
assert data.in_transition is not None
-
-
-def test_renderpather_autotool_double_L(multi_bend_tool) -> None:
-    tool, lib = multi_bend_tool
-    rp = Pather(lib, tools=tool, auto_render=False)
-    rp.ports["A"] = Port((0,0), 0, ptype="wire")
-
-    # This should trigger the double-L fallback in planS
-    rp.jog("A", 10, length=20)
-
-    # port_rot=0 -> forward is -x. jog=10 (left) is -y.
-    assert_allclose(rp.ports["A"].offset, [-20, -10])
-    assert_allclose(rp.ports["A"].rotation, 0)
-    # planS returns out_port at (length, jog) rot pi relative to the input at (0,0) rot 0.
-    # The input is rot pi relative to the port, so rotating (length, jog) rot pi by pi
-    # gives (-length, -jog) rot 0, matching the asserts above.
-
-    rp.render()
-    assert len(rp.pattern.refs) > 0
-
-def test_pather_uturn_fallback_no_heuristic(multi_bend_tool) -> None:
-    tool, lib = multi_bend_tool
-
-    class BasicTool(AutoTool):
-        def planU(self, *args, **kwargs):
-            raise NotImplementedError()
-
-    tool_basic = BasicTool(
-        straights=tool.straights,
-        bends=tool.bends,
-        sbends=tool.sbends,
-        transitions=tool.transitions,
-        default_out_ptype=tool.default_out_ptype
-    )
-
-    p = Pather(lib, tools=tool_basic)
-    p.ports["A"] = Port((0,0), 0, ptype="wire")  # extension direction is West (-x); the port rotation points inward (East)
-
-    # uturn jog=10, length=5.
-    # R=2. L1 = 5+2=7. L2 = 10-2=8.
-    p.uturn("A", 10, length=5)
-
-    # port_rot=0 -> forward is -x. jog=10 (left) is -y.
-    # Facing -x, a CCW turn faces -y; facing -y, a second CCW turn faces +x.
-    # Port rot 0. Wire input rot pi. Wire output relative to input:
-    # L1=7, R1=2, CCW=True. Output (7, 2) rot pi/2.
-    # Rotate wire by pi: output (-7, -2) rot 3pi/2.
-    # Second turn relative to (-7, -2) rot 3pi/2:
-    # local output (8, 2) rot pi/2.
-    # global: (-7, -2) + 8*rot(3pi/2)*x + 2*rot(3pi/2)*y
-    # = (-7, -2) + 8*(0, -1) + 2*(1, 0) = (-7, -2) + (0, -8) + (2, 0) = (-5, -10).
-    # This matches the actual result.
-    assert_allclose(p.ports["A"].offset, [-5, -10])
-    assert_allclose(p.ports["A"].rotation, pi)
diff --git a/masque/test/test_boolean.py b/masque/test/test_boolean.py
deleted file mode 100644
index 0249c64..0000000
--- a/masque/test/test_boolean.py
+++ /dev/null
@@ -1,247 +0,0 @@
-# ruff: noqa: PLC0415
-import pytest
-import numpy
-from numpy.testing import assert_allclose
-from masque.pattern import Pattern
-from masque.shapes.polygon import Polygon
-from masque.repetition import Grid
-from masque.library import Library
-from masque.error import PatternError
-
-
-def _poly_area(poly: Polygon) -> float:
-    verts = poly.vertices
-    x = verts[:, 0]
-    y = verts[:, 1]
-    return 0.5 * abs(numpy.dot(x, numpy.roll(y, -1)) - numpy.dot(y, numpy.roll(x, -1)))
-
-def test_layer_as_polygons_basic() -> None:
-    pat = Pattern()
-    pat.polygon((1, 0), [[0, 0], [1, 0], [1, 1], [0, 1]])
-
-    polys = pat.layer_as_polygons((1, 0), flatten=False)
-    assert len(polys) == 1
-    assert isinstance(polys[0], Polygon)
-    assert_allclose(polys[0].vertices, [[0, 0], [1, 0], [1, 1], [0, 1]])
-
-def test_layer_as_polygons_repetition() -> None:
-    pat = Pattern()
-    rep = Grid(a_vector=(2, 0), a_count=2)
-    pat.polygon((1, 0), [[0, 0], [1, 0], [1, 1], [0, 1]], repetition=rep)
-
-    polys = pat.layer_as_polygons((1, 0), flatten=False)
-    assert len(polys) == 2
-    # First polygon at (0,0)
-    assert_allclose(polys[0].vertices, [[0, 0], [1, 0], [1, 1], [0, 1]])
-    # Second polygon at (2,0)
-    assert_allclose(polys[1].vertices, [[2, 0], [3, 0], [3, 1], [2, 1]])
-
-def test_layer_as_polygons_flatten() -> None:
-    lib = Library()
-
-    child = Pattern()
-    child.polygon((1, 0), [[0, 0], [1, 0], [1, 1]])
-    lib['child'] = child
- - parent = Pattern() - parent.ref('child', offset=(10, 10), rotation=numpy.pi/2) - - polys = parent.layer_as_polygons((1, 0), flatten=True, library=lib) - assert len(polys) == 1 - # Original child at (0,0) with rot pi/2 is still at (0,0) in its own space? - # No, ref.as_pattern(child) will apply the transform. - # Child (0,0), (1,0), (1,1) rotated pi/2 around (0,0) -> (0,0), (0,1), (-1,1) - # Then offset by (10,10) -> (10,10), (10,11), (9,11) - - # Let's verify the vertices - expected = numpy.array([[10, 10], [10, 11], [9, 11]]) - assert_allclose(polys[0].vertices, expected, atol=1e-10) - -def test_boolean_import_error() -> None: - from masque import boolean - # If pyclipper is not installed, this should raise ImportError - try: - import pyclipper # noqa: F401 - pytest.skip("pyclipper is installed, cannot test ImportError") - except ImportError: - with pytest.raises(ImportError, match="Boolean operations require 'pyclipper'"): - boolean([], [], operation='union') - -def test_polygon_boolean_shortcut() -> None: - poly = Polygon([[0, 0], [1, 0], [1, 1]]) - # This should also raise ImportError if pyclipper is missing - try: - import pyclipper # noqa: F401 - pytest.skip("pyclipper is installed") - except ImportError: - with pytest.raises(ImportError, match="Boolean operations require 'pyclipper'"): - poly.boolean(poly) - - -def test_boolean_intersection_with_pyclipper() -> None: - pytest.importorskip("pyclipper") - from masque.utils.boolean import boolean - - result = boolean( - [Polygon([[0, 0], [2, 0], [2, 2], [0, 2]])], - [Polygon([[1, 1], [3, 1], [3, 3], [1, 3]])], - operation='intersection', - ) - - assert len(result) == 1 - assert_allclose(result[0].get_bounds_single(), [[1, 1], [2, 2]], atol=1e-10) - - -def test_polygon_boolean_shortcut_with_pyclipper() -> None: - pytest.importorskip("pyclipper") - - poly = Polygon([[0, 0], [2, 0], [2, 2], [0, 2]]) - result = poly.boolean( - Polygon([[1, 1], [3, 1], [3, 3], [1, 3]]), - operation='intersection', - ) - - assert 
len(result) == 1 - assert_allclose(result[0].get_bounds_single(), [[1, 1], [2, 2]], atol=1e-10) - - -def test_boolean_union_difference_and_xor_with_pyclipper() -> None: - pytest.importorskip("pyclipper") - from masque.utils.boolean import boolean - - rect_a = Polygon([[0, 0], [2, 0], [2, 2], [0, 2]]) - rect_b = Polygon([[1, 1], [3, 1], [3, 3], [1, 3]]) - - union = boolean([rect_a], [rect_b], operation='union') - assert len(union) == 1 - assert_allclose(union[0].get_bounds_single(), [[0, 0], [3, 3]], atol=1e-10) - assert_allclose(_poly_area(union[0]), 7, atol=1e-10) - - difference = boolean([rect_a], [rect_b], operation='difference') - assert len(difference) == 1 - assert_allclose(difference[0].get_bounds_single(), [[0, 0], [2, 2]], atol=1e-10) - assert_allclose(_poly_area(difference[0]), 3, atol=1e-10) - - xor = boolean([rect_a], [rect_b], operation='xor') - assert len(xor) == 2 - assert_allclose(sorted(_poly_area(poly) for poly in xor), [3, 3], atol=1e-10) - xor_bounds = sorted(tuple(map(tuple, poly.get_bounds_single())) for poly in xor) - assert xor_bounds == [((0.0, 0.0), (2.0, 2.0)), ((1.0, 1.0), (3.0, 3.0))] - - -def test_boolean_accepts_raw_vertices_and_single_shape_inputs() -> None: - pytest.importorskip("pyclipper") - from masque.utils.boolean import boolean - - raw_result = boolean( - [numpy.array([[0, 0], [2, 0], [2, 2], [0, 2]])], - numpy.array([[1, 1], [3, 1], [3, 3], [1, 3]]), - operation='intersection', - ) - assert len(raw_result) == 1 - assert_allclose(raw_result[0].get_bounds_single(), [[1, 1], [2, 2]], atol=1e-10) - assert_allclose(_poly_area(raw_result[0]), 1, atol=1e-10) - - single_shape_result = boolean( - Polygon([[0, 0], [2, 0], [2, 2], [0, 2]]), - Polygon([[1, 1], [3, 1], [3, 3], [1, 3]]), - operation='intersection', - ) - assert len(single_shape_result) == 1 - assert_allclose(single_shape_result[0].get_bounds_single(), [[1, 1], [2, 2]], atol=1e-10) - - -def test_boolean_handles_multi_polygon_inputs() -> None: - 
pytest.importorskip("pyclipper") - from masque.utils.boolean import boolean - - result = boolean( - [ - Polygon([[0, 0], [2, 0], [2, 2], [0, 2]]), - Polygon([[10, 0], [12, 0], [12, 2], [10, 2]]), - ], - [ - Polygon([[1, 1], [3, 1], [3, 3], [1, 3]]), - Polygon([[11, 1], [13, 1], [13, 3], [11, 3]]), - ], - operation='intersection', - ) - assert len(result) == 2 - assert_allclose(sorted(_poly_area(poly) for poly in result), [1, 1], atol=1e-10) - result_bounds = sorted(tuple(map(tuple, poly.get_bounds_single())) for poly in result) - assert result_bounds == [((1.0, 1.0), (2.0, 2.0)), ((11.0, 1.0), (12.0, 2.0))] - - -def test_boolean_difference_preserves_hole_area_via_bridged_polygon() -> None: - pytest.importorskip("pyclipper") - from masque.utils.boolean import boolean - - outer = Polygon([[0, 0], [10, 0], [10, 10], [0, 10]]) - hole = Polygon([[2, 2], [8, 2], [8, 8], [2, 8]]) - result = boolean([outer], [hole], operation='difference') - - assert len(result) == 1 - assert_allclose(result[0].get_bounds_single(), [[0, 0], [10, 10]], atol=1e-10) - assert_allclose(_poly_area(result[0]), 64, atol=1e-10) - - -def test_boolean_nested_hole_and_island_case() -> None: - pytest.importorskip("pyclipper") - from masque.utils.boolean import boolean - - outer = Polygon([[0, 0], [10, 0], [10, 10], [0, 10]]) - hole = Polygon([[2, 2], [8, 2], [8, 8], [2, 8]]) - island = Polygon([[4, 4], [6, 4], [6, 6], [4, 6]]) - - result = boolean([outer, island], [hole], operation='union') - - assert len(result) == 1 - assert_allclose(result[0].get_bounds_single(), [[0, 0], [10, 10]], atol=1e-10) - assert_allclose(_poly_area(result[0]), 100, atol=1e-10) - - -def test_boolean_empty_inputs_follow_set_semantics() -> None: - pytest.importorskip("pyclipper") - from masque.utils.boolean import boolean - - rect = Polygon([[1, 1], [3, 1], [3, 3], [1, 3]]) - - union = boolean([], [rect], operation='union') - assert len(union) == 1 - assert_allclose(union[0].get_bounds_single(), [[1, 1], [3, 3]], atol=1e-10) - 
- intersection = boolean([], [rect], operation='intersection') - assert intersection == [] - - difference = boolean([], [rect], operation='difference') - assert difference == [] - - xor = boolean([], [rect], operation='xor') - assert len(xor) == 1 - assert_allclose(xor[0].get_bounds_single(), [[1, 1], [3, 3]], atol=1e-10) - - clip_empty_union = boolean([rect], [], operation='union') - assert len(clip_empty_union) == 1 - assert_allclose(clip_empty_union[0].get_bounds_single(), [[1, 1], [3, 3]], atol=1e-10) - - clip_empty_intersection = boolean([rect], [], operation='intersection') - assert clip_empty_intersection == [] - - clip_empty_difference = boolean([rect], [], operation='difference') - assert len(clip_empty_difference) == 1 - assert_allclose(clip_empty_difference[0].get_bounds_single(), [[1, 1], [3, 3]], atol=1e-10) - - clip_empty_xor = boolean([rect], [], operation='xor') - assert len(clip_empty_xor) == 1 - assert_allclose(clip_empty_xor[0].get_bounds_single(), [[1, 1], [3, 3]], atol=1e-10) - - -def test_boolean_invalid_inputs_raise_pattern_error() -> None: - pytest.importorskip("pyclipper") - from masque.utils.boolean import boolean - - rect = Polygon([[0, 0], [1, 0], [1, 1], [0, 1]]) - - for bad in (123, object(), [123]): - with pytest.raises(PatternError, match='Unsupported type'): - boolean([rect], bad, operation='intersection') diff --git a/masque/test/test_builder.py b/masque/test/test_builder.py deleted file mode 100644 index 9f73d2b..0000000 --- a/masque/test/test_builder.py +++ /dev/null @@ -1,163 +0,0 @@ -import numpy -import pytest -from numpy.testing import assert_equal, assert_allclose -from numpy import pi - -from ..builder import Pather -from ..builder.utils import ell -from ..error import BuildError -from ..library import Library -from ..pattern import Pattern -from ..ports import Port - - -def test_builder_init() -> None: - lib = Library() - b = Pather(lib, name="mypat") - assert b.pattern is lib["mypat"] - assert b.library is lib - - -def 
test_builder_place() -> None: - lib = Library() - child = Pattern() - child.ports["A"] = Port((0, 0), 0) - lib["child"] = child - - b = Pather(lib) - b.place("child", offset=(10, 20), port_map={"A": "child_A"}) - - assert "child_A" in b.ports - assert_equal(b.ports["child_A"].offset, [10, 20]) - assert "child" in b.pattern.refs - - -def test_builder_plug() -> None: - lib = Library() - - wire = Pattern() - wire.ports["in"] = Port((0, 0), 0) - wire.ports["out"] = Port((10, 0), pi) - lib["wire"] = wire - - b = Pather(lib) - b.ports["start"] = Port((100, 100), 0) - - # Plug wire's "in" port into builder's "start" port - # Wire's "out" port should be renamed to "start" because thru=True (default) and wire has 2 ports - # builder start: (100, 100) rotation 0 - # wire in: (0, 0) rotation 0 - # wire out: (10, 0) rotation pi - # Plugging wire in (rot 0) to builder start (rot 0) means wire is rotated by pi (180 deg) - # so wire in is at (100, 100), wire out is at (100 - 10, 100) = (90, 100) - b.plug("wire", map_in={"start": "in"}) - - assert "start" in b.ports - assert_equal(b.ports["start"].offset, [90, 100]) - assert b.ports["start"].rotation is not None - assert_allclose(b.ports["start"].rotation, 0, atol=1e-10) - - -def test_builder_interface() -> None: - lib = Library() - source = Pattern() - source.ports["P1"] = Port((0, 0), 0) - lib["source"] = source - - b = Pather.interface("source", library=lib, name="iface") - assert "in_P1" in b.ports - assert "P1" in b.ports - assert b.pattern is lib["iface"] - - -def test_builder_set_dead() -> None: - lib = Library() - lib["sub"] = Pattern() - b = Pather(lib) - b.set_dead() - - b.place("sub") - assert not b.pattern.has_refs() - - -def test_builder_dead_ports() -> None: - lib = Library() - pat = Pattern() - pat.ports['A'] = Port((0, 0), 0) - b = Pather(lib, pattern=pat) - b.set_dead() - - # Attempt to plug a device where ports don't line up - # A has rotation 0, C has rotation 0. 
plug() expects opposing rotations (pi difference). - other = Pattern(ports={'C': Port((10, 10), 0), 'D': Port((20, 20), 0)}) - - # This should NOT raise PortError because b is dead - b.plug(other, map_in={'A': 'C'}, map_out={'D': 'B'}) - - # Port A should be removed, and Port B (renamed from D) should be added - assert 'A' not in b.ports - assert 'B' in b.ports - - # Verify geometry was not added - assert not b.pattern.has_refs() - assert not b.pattern.has_shapes() - - -def test_dead_plug_best_effort() -> None: - lib = Library() - pat = Pattern() - pat.ports['A'] = Port((0, 0), 0) - b = Pather(lib, pattern=pat) - b.set_dead() - - # Device with multiple ports, none of which line up correctly - other = Pattern(ports={ - 'P1': Port((10, 10), 0), # Wrong rotation (0 instead of pi) - 'P2': Port((20, 20), pi) # Correct rotation but wrong offset - }) - - # Try to plug. find_transform will fail. - # It should fall back to aligning the first pair ('A' and 'P1'). - b.plug(other, map_in={'A': 'P1'}, map_out={'P2': 'B'}) - - assert 'A' not in b.ports - assert 'B' in b.ports - - # Dummy transform aligns A (0,0) with P1 (10,10) - # A rotation 0, P1 rotation 0 -> rotation = (0 - 0 - pi) = -pi - # P2 (20,20) rotation pi: - # 1. Translate P2 so P1 is at origin: (20,20) - (10,10) = (10,10) - # 2. Rotate (10,10) by -pi: (-10,-10) - # 3. 
Translate by s_port.offset (0,0): (-10,-10) - assert_allclose(b.ports['B'].offset, [-10, -10], atol=1e-10) - # P2 rot pi + transform rot -pi = 0 - assert b.ports['B'].rotation is not None - assert_allclose(b.ports['B'].rotation, 0, atol=1e-10) - - -def test_ell_validates_spacing_length() -> None: - ports = { - 'A': Port((0, 0), 0), - 'B': Port((0, 1), 0), - 'C': Port((0, 2), 0), - } - - with pytest.raises(BuildError, match='spacing must be scalar or have length 2'): - ell(ports, True, 'min_extension', 5, spacing=[1, 2, 3]) - - with pytest.raises(BuildError, match='spacing must be scalar or have length 2'): - ell(ports, True, 'min_extension', 5, spacing=[]) - - -def test_ell_handles_array_spacing_when_ccw_none() -> None: - ports = { - 'A': Port((0, 0), 0), - 'B': Port((0, 1), 0), - } - - scalar = ell(ports, None, 'min_extension', 5, spacing=0) - array_zero = ell(ports, None, 'min_extension', 5, spacing=numpy.array([0, 0])) - assert scalar == array_zero - - with pytest.raises(BuildError, match='Spacing must be 0 or None'): - ell(ports, None, 'min_extension', 5, spacing=numpy.array([1, 0])) diff --git a/masque/test/test_dxf.py b/masque/test/test_dxf.py deleted file mode 100644 index f6dd177..0000000 --- a/masque/test/test_dxf.py +++ /dev/null @@ -1,184 +0,0 @@ -import io -import numpy -import ezdxf -from numpy.testing import assert_allclose -from pathlib import Path - -from ..pattern import Pattern -from ..library import Library -from ..shapes import Path as MPath, Polygon -from ..repetition import Grid -from ..file import dxf - - -def _matches_open_path(actual: numpy.ndarray, expected: numpy.ndarray) -> bool: - return bool( - numpy.allclose(actual, expected) - or numpy.allclose(actual, expected[::-1]) - ) - - -def _matches_closed_vertices(actual: numpy.ndarray, expected: numpy.ndarray) -> bool: - return {tuple(row) for row in actual.tolist()} == {tuple(row) for row in expected.tolist()} - - -def test_dxf_roundtrip(tmp_path: Path): - lib = Library() - pat = Pattern() 
- - # 1. Polygon (closed) - poly_verts = numpy.array([[0, 0], [10, 0], [10, 10], [0, 10]]) - pat.polygon("1", vertices=poly_verts) - - # 2. Path (open, 3 points) - path_verts = numpy.array([[20, 0], [30, 0], [30, 10]]) - pat.path("2", vertices=path_verts, width=2) - - # 3. Path (open, 2 points) - Testing the fix for 2-point polylines - path2_verts = numpy.array([[40, 0], [50, 10]]) - pat.path("3", vertices=path2_verts, width=0) # width 0 to be sure it's not a polygonized path if we're not careful - - # 4. Ref with Grid repetition (Manhattan) - subpat = Pattern() - subpat.polygon("sub", vertices=[[0, 0], [1, 0], [1, 1]]) - lib["sub"] = subpat - - pat.ref("sub", offset=(100, 100), repetition=Grid(a_vector=(10, 0), a_count=2, b_vector=(0, 10), b_count=3)) - - lib["top"] = pat - - dxf_file = tmp_path / "test.dxf" - dxf.writefile(lib, "top", dxf_file) - - read_lib, _ = dxf.readfile(dxf_file) - - # In DXF read, the top level is usually called "Model" - top_pat = read_lib.get("Model") or read_lib.get("top") or list(read_lib.values())[0] - - # Verify Polygon - polys = [s for s in top_pat.shapes["1"] if isinstance(s, Polygon)] - assert len(polys) >= 1 - poly_read = polys[0] - assert _matches_closed_vertices(poly_read.vertices, poly_verts) - - # Verify 3-point Path - paths = [s for s in top_pat.shapes["2"] if isinstance(s, MPath)] - assert len(paths) >= 1 - path_read = paths[0] - assert _matches_open_path(path_read.vertices, path_verts) - assert path_read.width == 2 - - # Verify 2-point Path - paths2 = [s for s in top_pat.shapes["3"] if isinstance(s, MPath)] - assert len(paths2) >= 1 - path2_read = paths2[0] - assert _matches_open_path(path2_read.vertices, path2_verts) - assert path2_read.width == 0 - - # Verify Ref with Grid - # Finding the sub pattern name might be tricky because of how DXF stores blocks - # but "sub" should be in read_lib - assert "sub" in read_lib - - # Check refs in the top pattern - found_grid = False - for target, reflist in top_pat.refs.items(): - # 
DXF names might be case-insensitive or modified, but ezdxf usually preserves them - if target.upper() == "SUB": - for ref in reflist: - if isinstance(ref.repetition, Grid): - assert ref.repetition.a_count == 2 - assert ref.repetition.b_count == 3 - assert_allclose(ref.repetition.a_vector, (10, 0)) - assert_allclose(ref.repetition.b_vector, (0, 10)) - found_grid = True - assert found_grid, f"Manhattan Grid repetition should have been preserved. Targets: {list(top_pat.refs.keys())}" - -def test_dxf_manhattan_precision(tmp_path: Path): - # Test that float precision doesn't break Manhattan grid detection - lib = Library() - sub = Pattern() - sub.polygon("1", vertices=[[0, 0], [1, 0], [1, 1]]) - lib["sub"] = sub - - top = Pattern() - # 90 degree rotation: in masque the grid is NOT rotated, so it stays [[10,0],[0,10]] - # In DXF, an array with rotation 90 has basis vectors [[0,10],[-10,0]]. - # So a masque grid [[10,0],[0,10]] with ref rotation 90 matches a DXF array. - angle = numpy.pi / 2 # 90 degrees - top.ref("sub", offset=(0, 0), rotation=angle, - repetition=Grid(a_vector=(10, 0), a_count=2, b_vector=(0, 10), b_count=2)) - - lib["top"] = top - - dxf_file = tmp_path / "precision.dxf" - dxf.writefile(lib, "top", dxf_file) - - # If the isclose() fix works, this should still be a Grid when read back - read_lib, _ = dxf.readfile(dxf_file) - read_top = read_lib.get("Model") or read_lib.get("top") or list(read_lib.values())[0] - - target_name = next(k for k in read_top.refs if k.upper() == "SUB") - ref = read_top.refs[target_name][0] - assert isinstance(ref.repetition, Grid), "Grid should be preserved for 90-degree rotation" - - -def test_dxf_rotated_grid_roundtrip_preserves_basis_and_counts(tmp_path: Path): - lib = Library() - sub = Pattern() - sub.polygon("1", vertices=[[0, 0], [1, 0], [1, 1]]) - lib["sub"] = sub - - top = Pattern() - top.ref( - "sub", - offset=(0, 0), - rotation=numpy.pi / 2, - repetition=Grid(a_vector=(10, 0), a_count=3, b_vector=(0, 20), b_count=2), - 
) - lib["top"] = top - - dxf_file = tmp_path / "rotated_grid.dxf" - dxf.writefile(lib, "top", dxf_file) - - read_lib, _ = dxf.readfile(dxf_file) - read_top = read_lib.get("Model") or read_lib.get("top") or list(read_lib.values())[0] - - target_name = next(k for k in read_top.refs if k.upper() == "SUB") - ref = read_top.refs[target_name][0] - assert isinstance(ref.repetition, Grid) - actual = ref.repetition.displacements - expected = Grid(a_vector=(10, 0), a_count=3, b_vector=(0, 20), b_count=2).displacements - assert_allclose( - actual[numpy.lexsort((actual[:, 1], actual[:, 0]))], - expected[numpy.lexsort((expected[:, 1], expected[:, 0]))], - ) - - -def test_dxf_read_legacy_polyline() -> None: - doc = ezdxf.new() - msp = doc.modelspace() - msp.add_polyline2d([(0, 0), (10, 0), (10, 10)], dxfattribs={"layer": "legacy"}).close(True) - - stream = io.StringIO() - doc.write(stream) - stream.seek(0) - - read_lib, _ = dxf.read(stream) - top_pat = read_lib.get("Model") or list(read_lib.values())[0] - - polys = [shape for shape in top_pat.shapes["legacy"] if isinstance(shape, Polygon)] - assert len(polys) == 1 - assert _matches_closed_vertices(polys[0].vertices, numpy.array([[0, 0], [10, 0], [10, 10]])) - - -def test_dxf_read_ignores_unreferenced_setup_blocks() -> None: - lib = Library({"top": Pattern()}) - stream = io.StringIO() - - dxf.write(lib, "top", stream) - stream.seek(0) - - read_lib, _ = dxf.read(stream) - - assert set(read_lib) == {"Model"} diff --git a/masque/test/test_fdfd.py b/masque/test/test_fdfd.py deleted file mode 100644 index 2b4f3d3..0000000 --- a/masque/test/test_fdfd.py +++ /dev/null @@ -1,24 +0,0 @@ -# ruff: noqa -# ruff: noqa: ARG001 - - -import dataclasses -import pytest # type: ignore -import numpy -from numpy import pi -from numpy.typing import NDArray -# from numpy.testing import assert_allclose, assert_array_equal - -from .. 
import Pattern, Arc, Circle - - -def test_circle_mirror(): - cc = Circle(radius=4, offset=(10, 20)) - cc.flip_across(axis=0) # flip across y=0 - assert cc.offset[0] == 10 - assert cc.offset[1] == -20 - assert cc.radius == 4 - cc.flip_across(axis=1) # flip across x=0 - assert cc.offset[0] == -10 - assert cc.offset[1] == -20 - assert cc.radius == 4 diff --git a/masque/test/test_file_roundtrip.py b/masque/test/test_file_roundtrip.py deleted file mode 100644 index 283a863..0000000 --- a/masque/test/test_file_roundtrip.py +++ /dev/null @@ -1,179 +0,0 @@ -from pathlib import Path -from typing import cast -import pytest -from numpy.testing import assert_allclose - -from ..pattern import Pattern -from ..library import Library -from ..shapes import Path as MPath, Circle, Polygon, RectCollection -from ..repetition import Grid, Arbitrary - -def create_test_library(for_gds: bool = False) -> Library: - lib = Library() - - # 1. Polygons - pat_poly = Pattern() - pat_poly.polygon((1, 0), vertices=[[0, 0], [10, 0], [5, 10]]) - lib["polygons"] = pat_poly - - # 2. Paths with different endcaps - pat_paths = Pattern() - # Flush - pat_paths.path((2, 0), vertices=[[0, 0], [20, 0]], width=2, cap=MPath.Cap.Flush) - # Square - pat_paths.path((2, 1), vertices=[[0, 10], [20, 10]], width=2, cap=MPath.Cap.Square) - # Circle (Only for GDS) - if for_gds: - pat_paths.path((2, 2), vertices=[[0, 20], [20, 20]], width=2, cap=MPath.Cap.Circle) - # SquareCustom - pat_paths.path((2, 3), vertices=[[0, 30], [20, 30]], width=2, cap=MPath.Cap.SquareCustom, cap_extensions=(1, 5)) - lib["paths"] = pat_paths - - # 3. Circles (only for OASIS or polygonized for GDS) - pat_circles = Pattern() - if for_gds: - # GDS writer calls to_polygons() for non-supported shapes, - # but we can also pre-polygonize - pat_circles.shapes[(3, 0)].append(Circle(radius=5, offset=(10, 10)).to_polygons()[0]) - else: - pat_circles.shapes[(3, 0)].append(Circle(radius=5, offset=(10, 10))) - lib["circles"] = pat_circles - - # 4. 
Refs with repetitions - pat_refs = Pattern() - # Simple Ref - pat_refs.ref("polygons", offset=(0, 0)) - # Ref with Grid repetition - pat_refs.ref("polygons", offset=(100, 0), repetition=Grid(a_vector=(20, 0), a_count=3, b_vector=(0, 20), b_count=2)) - # Ref with Arbitrary repetition - pat_refs.ref("polygons", offset=(0, 100), repetition=Arbitrary(displacements=[[0, 0], [10, 20], [30, -10]])) - lib["refs"] = pat_refs - - # 5. Shapes with repetitions (OASIS only, must be wrapped for GDS) - pat_rep_shapes = Pattern() - poly_rep = Polygon(vertices=[[0, 0], [5, 0], [5, 5], [0, 5]], repetition=Grid(a_vector=(10, 0), a_count=5)) - pat_rep_shapes.shapes[(4, 0)].append(poly_rep) - lib["rep_shapes"] = pat_rep_shapes - - if for_gds: - lib.wrap_repeated_shapes() - - return lib - -def test_gdsii_full_roundtrip(tmp_path: Path) -> None: - from ..file import gdsii - lib = create_test_library(for_gds=True) - gds_file = tmp_path / "full_test.gds" - gdsii.writefile(lib, gds_file, meters_per_unit=1e-9) - - read_lib, _ = gdsii.readfile(gds_file) - - # Check existence - for name in lib: - assert name in read_lib - - # Check Paths - read_paths = read_lib["paths"] - # Check caps (GDS stores them as path_type) - # Order might be different depending on how they were written, - # but here they should match the order they were added if dict order is preserved. - # Actually, they are grouped by layer. 
- p_flush = cast("MPath", read_paths.shapes[(2, 0)][0]) - assert p_flush.cap == MPath.Cap.Flush - - p_square = cast("MPath", read_paths.shapes[(2, 1)][0]) - assert p_square.cap == MPath.Cap.Square - - p_circle = cast("MPath", read_paths.shapes[(2, 2)][0]) - assert p_circle.cap == MPath.Cap.Circle - - p_custom = cast("MPath", read_paths.shapes[(2, 3)][0]) - assert p_custom.cap == MPath.Cap.SquareCustom - assert p_custom.cap_extensions is not None - assert_allclose(p_custom.cap_extensions, (1, 5)) - - # Check Refs with repetitions - read_refs = read_lib["refs"] - assert len(read_refs.refs["polygons"]) >= 3 # Simple, Grid (becomes 1 AREF), Arbitrary (becomes 3 SREFs) - - # AREF check - arefs = [r for r in read_refs.refs["polygons"] if r.repetition is not None] - assert len(arefs) == 1 - assert isinstance(arefs[0].repetition, Grid) - assert arefs[0].repetition.a_count == 3 - assert arefs[0].repetition.b_count == 2 - - # Check wrapped shapes - # lib.wrap_repeated_shapes() created new patterns - # Original pattern "rep_shapes" should now have a Ref - assert len(read_lib["rep_shapes"].refs) > 0 - -def test_oasis_full_roundtrip(tmp_path: Path) -> None: - pytest.importorskip("fatamorgana") - from ..file import oasis - lib = create_test_library(for_gds=False) - oas_file = tmp_path / "full_test.oas" - oasis.writefile(lib, oas_file, units_per_micron=1000) - - read_lib, _ = oasis.readfile(oas_file) - - # Check existence - for name in lib: - assert name in read_lib - - # Check Circle - read_circles = read_lib["circles"] - assert isinstance(read_circles.shapes[(3, 0)][0], Circle) - assert read_circles.shapes[(3, 0)][0].radius == 5 - - # Check Path caps - read_paths = read_lib["paths"] - assert cast("MPath", read_paths.shapes[(2, 0)][0]).cap == MPath.Cap.Flush - assert cast("MPath", read_paths.shapes[(2, 1)][0]).cap == MPath.Cap.Square - # OASIS's HalfWidth extension scheme corresponds to - # masque's Square cap. 
- # masque/file/oasis.py: - # path_cap_map = { - # PathExtensionScheme.Flush: Path.Cap.Flush, - # PathExtensionScheme.HalfWidth: Path.Cap.Square, - # PathExtensionScheme.Arbitrary: Path.Cap.SquareCustom, - # } - # Path.Cap.Circle has no entry in path_cap_map, so masque cannot write - # Circle-capped paths to OASIS: _shapes_to_elements looks up - # path_type = next(k for k, v in path_cap_map.items() if v == shape.cap), - # which raises StopIteration for any unmapped cap. The Circle-cap path - # is therefore omitted from the OASIS fixture (for_gds=False). - - # Check Shape repetition - read_rep_shapes = read_lib["rep_shapes"] - poly = read_rep_shapes.shapes[(4, 0)][0] - assert poly.repetition is not None - assert isinstance(poly.repetition, Grid) - assert poly.repetition.a_count == 5 - - -def test_gdsii_rect_collection_roundtrip(tmp_path: Path) -> None: - from ..file import gdsii - - lib = Library() - pat = Pattern() - pat.shapes[(5, 0)].append( - RectCollection( - rects=[[0, 0, 10, 5], [20, -5, 30, 10]], - annotations={'1': ['rects']}, - ) - ) - lib['rects'] = pat - - gds_file = tmp_path / 'rect_collection.gds' - gdsii.writefile(lib, gds_file, meters_per_unit=1e-9) - - read_lib, _ = gdsii.readfile(gds_file) - polys = read_lib['rects'].shapes[(5, 0)] - - assert len(polys) == 2 - assert all(isinstance(poly, Polygon) for poly in polys) - assert_allclose(polys[0].vertices, [[0, 0], [0, 5], [10, 5], [10, 0]]) - assert_allclose(polys[1].vertices, [[20, -5], [20, 10], [30, 10], [30, -5]]) - assert polys[0].annotations == {'1': ['rects']} - assert polys[1].annotations == {'1': ['rects']} diff --git a/masque/test/test_gdsii.py b/masque/test/test_gdsii.py deleted file mode 100644 index 7a2f5b1..0000000 --- a/masque/test/test_gdsii.py +++ /dev/null @@ -1,80 +0,0 @@ -from pathlib import Path -from typing import cast -import numpy -import pytest -from numpy.testing import assert_equal, assert_allclose - -from ..error import LibraryError -from ..pattern import Pattern -from ..library import Library -from ..file 
import gdsii -from ..shapes import Path as MPath, Polygon - - -def test_gdsii_roundtrip(tmp_path: Path) -> None: - lib = Library() - - # Simple polygon cell - pat1 = Pattern() - pat1.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 10], [0, 10]]) - lib["poly_cell"] = pat1 - - # Path cell - pat2 = Pattern() - pat2.path((2, 5), vertices=[[0, 0], [100, 0]], width=10) - lib["path_cell"] = pat2 - - # Cell with Ref - pat3 = Pattern() - pat3.ref("poly_cell", offset=(50, 50), rotation=numpy.pi / 2) - lib["ref_cell"] = pat3 - - gds_file = tmp_path / "test.gds" - gdsii.writefile(lib, gds_file, meters_per_unit=1e-9) - - read_lib, info = gdsii.readfile(gds_file) - - assert "poly_cell" in read_lib - assert "path_cell" in read_lib - assert "ref_cell" in read_lib - - # Check polygon - read_poly = cast("Polygon", read_lib["poly_cell"].shapes[(1, 0)][0]) - # GDSII closes polygons, so it might have an extra vertex or different order - assert len(read_poly.vertices) >= 4 - # Check bounds as a proxy for geometry correctness - assert_equal(read_lib["poly_cell"].get_bounds(), [[0, 0], [10, 10]]) - - # Check path - read_path = cast("MPath", read_lib["path_cell"].shapes[(2, 5)][0]) - assert isinstance(read_path, MPath) - assert read_path.width == 10 - assert_equal(read_path.vertices, [[0, 0], [100, 0]]) - - # Check Ref - read_ref = read_lib["ref_cell"].refs["poly_cell"][0] - assert_equal(read_ref.offset, [50, 50]) - assert_allclose(read_ref.rotation, numpy.pi / 2, atol=1e-5) - - -def test_gdsii_annotations(tmp_path: Path) -> None: - lib = Library() - pat = Pattern() - # GDS only supports integer keys in range [1, 126] for properties - pat.polygon((1, 0), vertices=[[0, 0], [1, 0], [1, 1]], annotations={"1": ["hello"]}) - lib["cell"] = pat - - gds_file = tmp_path / "test_ann.gds" - gdsii.writefile(lib, gds_file, meters_per_unit=1e-9) - - read_lib, _ = gdsii.readfile(gds_file) - read_ann = read_lib["cell"].shapes[(1, 0)][0].annotations - assert read_ann is not None - assert read_ann["1"] == 
["hello"] - - -def test_gdsii_check_valid_names_validates_generator_lengths() -> None: - names = (name for name in ("a" * 40,)) - - with pytest.raises(LibraryError, match="invalid names"): - gdsii.check_valid_names(names) diff --git a/masque/test/test_gdsii_arrow.py b/masque/test/test_gdsii_arrow.py deleted file mode 100644 index f3f4f6a..0000000 --- a/masque/test/test_gdsii_arrow.py +++ /dev/null @@ -1,507 +0,0 @@ -from pathlib import Path - -import numpy -import pytest - -pytest.importorskip('pyarrow') - -from .. import Ref, Label -from ..library import Library -from ..pattern import Pattern -from ..repetition import Grid -from ..shapes import Path as MPath, Polygon, PolyCollection, RectCollection -from ..file import gdsii, gdsii_arrow -from ..file.gdsii_perf import write_fixture - - -if not gdsii_arrow.is_available(): - pytest.skip('klamath_rs_ext shared library is not available', allow_module_level=True) - - -def _annotations_key(annotations: dict[str, list[object]] | None) -> tuple[tuple[str, tuple[object, ...]], ...] | None: - if not annotations: - return None - return tuple(sorted((key, tuple(values)) for key, values in annotations.items())) - - -def _coord_key(values: object) -> tuple[int, ...] 
| tuple[tuple[int, int], ...]: - arr = numpy.rint(numpy.asarray(values, dtype=float)).astype(int) - if arr.ndim == 1: - return tuple(arr.tolist()) - return tuple(tuple(row.tolist()) for row in arr) - - -def _canonical_polygon_key(vertices: object) -> tuple[tuple[int, int], ...]: - arr = numpy.rint(numpy.asarray(vertices, dtype=float)).astype(int) - rows = [tuple(tuple(row.tolist()) for row in numpy.roll(arr, -shift, axis=0)) for shift in range(arr.shape[0])] - rev = arr[::-1] - rows.extend(tuple(tuple(row.tolist()) for row in numpy.roll(rev, -shift, axis=0)) for shift in range(rev.shape[0])) - return min(rows) - - -def _shape_key(shape: object, layer: tuple[int, int]) -> list[tuple[object, ...]]: - if isinstance(shape, MPath): - cap_extensions = None if shape.cap_extensions is None else _coord_key(shape.cap_extensions) - return [( - 'path', - layer, - _coord_key(shape.vertices), - _coord_key(shape.offset), - int(round(float(shape.width))), - shape.cap.name, - cap_extensions, - _annotations_key(shape.annotations), - )] - - keys = [] - for poly in shape.to_polygons(): - keys.append(( - 'polygon', - layer, - _canonical_polygon_key(poly.vertices), - _coord_key(poly.offset), - _annotations_key(poly.annotations), - )) - return keys - - -def _ref_keys(target: str, ref: object) -> list[tuple[object, ...]]: - keys = [] - for transform in ref.as_transforms(): - keys.append(( - target, - _coord_key(transform[:2]), - round(float(transform[2]), 8), - round(float(transform[4]), 8), - bool(int(round(float(transform[3])))), - _annotations_key(ref.annotations), - )) - return keys - - -def _label_key(layer: tuple[int, int], label: object) -> tuple[object, ...]: - return ( - layer, - label.string, - _coord_key(label.offset), - _annotations_key(label.annotations), - ) - - -def _pattern_summary(pattern: Pattern) -> dict[str, object]: - shape_keys: list[tuple[object, ...]] = [] - for layer, shapes in pattern.shapes.items(): - for shape in shapes: - shape_keys.extend(_shape_key(shape, 
layer)) - - ref_keys: list[tuple[object, ...]] = [] - for target, refs in pattern.refs.items(): - for ref in refs: - ref_keys.extend(_ref_keys(target, ref)) - - label_keys = [ - _label_key(layer, label) - for layer, labels in pattern.labels.items() - for label in labels - ] - - return { - 'shapes': sorted(shape_keys), - 'refs': sorted(ref_keys), - 'labels': sorted(label_keys), - } - - -def _library_summary(lib: Library) -> dict[str, dict[str, object]]: - return {name: _pattern_summary(pattern) for name, pattern in lib.items()} - - -def _make_arrow_test_library() -> Library: - lib = Library() - - leaf = Pattern() - leaf.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 10], [0, 10]], annotations={'1': ['leaf-poly']}) - leaf.polygon((2, 0), vertices=[[40, 0], [50, 0], [50, 10], [40, 10]]) - leaf.polygon((1, 0), vertices=[[20, 0], [30, 0], [30, 10], [20, 10]]) - leaf.polygon((1, 0), vertices=[[80, 0], [90, 0], [90, 10], [80, 10]]) - leaf.polygon((2, 0), vertices=[[60, 0], [70, 0], [70, 10], [60, 10]], annotations={'18': ['leaf-poly-2']}) - leaf.label((10, 0), string='LEAF', offset=(3, 4), annotations={'10': ['leaf-label']}) - lib['leaf'] = leaf - - child = Pattern() - child.path( - (2, 0), - vertices=[[0, 0], [15, 5], [30, 5]], - width=6, - cap=MPath.Cap.SquareCustom, - cap_extensions=(2, 4), - annotations={'2': ['child-path']}, - ) - child.label((11, 0), string='CHILD', offset=(7, 8), annotations={'11': ['child-label']}) - child.ref('leaf', offset=(100, 200), rotation=numpy.pi / 2, mirrored=True, scale=1.25, annotations={'12': ['child-ref']}) - lib['child'] = child - - sibling = Pattern() - sibling.polygon((3, 0), vertices=[[0, 0], [5, 0], [5, 6], [0, 6]]) - sibling.label((12, 0), string='SIB', offset=(1, 2), annotations={'13': ['sib-label']}) - sibling.ref( - 'leaf', - offset=(-50, 60), - repetition=Grid(a_vector=(20, 0), a_count=3, b_vector=(0, 30), b_count=2), - annotations={'14': ['sib-ref']}, - ) - lib['sibling'] = sibling - - fanout = Pattern() - 
fanout.ref('leaf', offset=(0, 0)) - fanout.ref('child', offset=(10, 0), mirrored=True, rotation=numpy.pi / 6, scale=1.1) - fanout.ref('leaf', offset=(20, 0)) - fanout.ref('leaf', offset=(30, 0), repetition=Grid(a_vector=(5, 0), a_count=2, b_vector=(0, 7), b_count=3)) - fanout.ref('child', offset=(40, 0), mirrored=True, rotation=numpy.pi / 4, scale=1.2, - repetition=Grid(a_vector=(9, 0), a_count=2, b_vector=(0, 11), b_count=2)) - fanout.ref('leaf', offset=(50, 0), repetition=Grid(a_vector=(6, 0), a_count=3, b_vector=(0, 8), b_count=2)) - fanout.ref('leaf', offset=(60, 0), annotations={'19': ['fanout-sref']}) - fanout.ref('child', offset=(70, 0), repetition=Grid(a_vector=(4, 0), a_count=2, b_vector=(0, 5), b_count=2), - annotations={'20': ['fanout-aref']}) - lib['fanout'] = fanout - - top = Pattern() - top.ref('child', offset=(500, 600), annotations={'15': ['top-child-ref']}) - top.ref('sibling', offset=(-100, 50), rotation=numpy.pi, annotations={'16': ['top-sibling-ref']}) - top.ref('fanout', offset=(250, -75)) - top.label((13, 0), string='TOP', offset=(0, 0), annotations={'17': ['top-label']}) - lib['top'] = top - - return lib - - -def test_gdsii_arrow_matches_gdsii_readfile(tmp_path: Path) -> None: - lib = _make_arrow_test_library() - gds_file = tmp_path / 'arrow_roundtrip.gds' - gdsii.writefile(lib, gds_file, meters_per_unit=1e-9) - - canonical_lib, canonical_info = gdsii.readfile(gds_file) - arrow_lib, arrow_info = gdsii_arrow.readfile(gds_file) - - assert canonical_info == arrow_info - assert _library_summary(canonical_lib) == _library_summary(arrow_lib) - - -def test_gdsii_arrow_readfile_arrow_returns_native_payload(tmp_path: Path) -> None: - gds_file = tmp_path / 'many_cells_native.gds' - manifest = write_fixture(gds_file, preset='many_cells', scale=0.001) - - libarr, info = gdsii_arrow.readfile_arrow(gds_file) - - assert info['name'] == manifest.library_name - assert libarr['lib_name'].as_py() == manifest.library_name - assert len(libarr['cells']) == 
manifest.cells - assert 0 < len(libarr['layers']) <= manifest.layers - - -def test_gdsii_arrow_reads_small_perf_fixture(tmp_path: Path) -> None: - gds_file = tmp_path / 'many_cells_smoke.gds' - manifest = write_fixture(gds_file, preset='many_cells', scale=0.001) - - lib, info = gdsii_arrow.readfile(gds_file) - - assert info['name'] == manifest.library_name - assert len(lib) == manifest.cells - assert 'TOP' in lib - assert sum(len(refs) for refs in lib['TOP'].refs.values()) > 0 - - -def test_gdsii_arrow_degenerate_aref_decodes_as_single_transform(tmp_path: Path) -> None: - lib = Library() - leaf = Pattern() - leaf.polygon((1, 0), vertices=[[0, 0], [5, 0], [5, 5], [0, 5]]) - lib['leaf'] = leaf - - top = Pattern() - top.ref('leaf', offset=(100, 200), repetition=Grid(a_vector=(7, 0), a_count=1, b_vector=(0, 9), b_count=1)) - lib['top'] = top - - gds_file = tmp_path / 'degenerate_aref.gds' - gdsii.writefile(lib, gds_file, meters_per_unit=1e-9) - - canonical_lib, _ = gdsii.readfile(gds_file) - arrow_lib, _ = gdsii_arrow.readfile(gds_file) - assert _library_summary(arrow_lib) == _library_summary(canonical_lib) - - decoded_ref = arrow_lib['top'].refs['leaf'][0] - assert decoded_ref.repetition is None - - -def test_gdsii_arrow_plain_srefs_decode_without_arbitrary(tmp_path: Path) -> None: - lib = _make_arrow_test_library() - gds_file = tmp_path / 'plain_srefs.gds' - gdsii.writefile(lib, gds_file, meters_per_unit=1e-9) - - arrow_lib, _ = gdsii_arrow.readfile(gds_file) - fanout = arrow_lib['fanout'] - - plain_leaf_refs = [ - ref - for ref in fanout.refs['leaf'] - if ref.annotations is None and ref.repetition is None - ] - assert len(plain_leaf_refs) == 2 - assert all(type(ref.repetition) is not Grid for ref in plain_leaf_refs) - - -def test_gdsii_arrow_degenerate_aref_schema_normalizes_to_sref(tmp_path: Path) -> None: - lib = Library() - leaf = Pattern() - leaf.polygon((1, 0), vertices=[[0, 0], [5, 0], [5, 5], [0, 5]]) - lib['leaf'] = leaf - - top = Pattern() - top.ref('leaf', 
offset=(100, 200), repetition=Grid(a_vector=(7, 0), a_count=1, b_vector=(0, 9), b_count=1)) - lib['top'] = top - - gds_file = tmp_path / 'degenerate_aref_schema.gds' - gdsii.writefile(lib, gds_file, meters_per_unit=1e-9) - - libarr = gdsii_arrow._read_to_arrow(gds_file)[0] - cells = libarr['cells'].values - cell_ids = cells.field('id').to_numpy() - cell_names = libarr['cell_names'].as_py() - top_index = next(ii for ii, cell_id in enumerate(cell_ids) if cell_names[cell_id] == 'top') - - srefs = cells.field('srefs')[top_index].as_py() - arefs = cells.field('arefs')[top_index].as_py() - - assert len(srefs) == 1 - assert len(arefs) == 0 - assert cell_names[srefs[0]['target']] == 'leaf' - - -def test_gdsii_arrow_boundary_batch_schema(tmp_path: Path) -> None: - lib = _make_arrow_test_library() - gds_file = tmp_path / 'arrow_batches.gds' - gdsii.writefile(lib, gds_file, meters_per_unit=1e-9) - - libarr = gdsii_arrow._read_to_arrow(gds_file)[0] - cells = libarr['cells'].values - cell_ids = cells.field('id').to_numpy() - cell_names = libarr['cell_names'].as_py() - layer_table = [ - ((int(layer) >> 16) & 0xFFFF, int(layer) & 0xFFFF) - for layer in libarr['layers'].values.to_numpy() - ] - - leaf_index = next(ii for ii, cell_id in enumerate(cell_ids) if cell_names[cell_id] == 'leaf') - - rect_batches = cells.field('rect_batches')[leaf_index].as_py() - boundary_batches = cells.field('boundary_batches')[leaf_index].as_py() - boundary_props = cells.field('boundary_props')[leaf_index].as_py() - - assert len(rect_batches) == 2 - assert len(boundary_batches) == 0 - assert len(boundary_props) == 2 - - rects_by_layer = {tuple(layer_table[entry['layer']]): entry for entry in rect_batches} - assert rects_by_layer[(1, 0)]['rects'] == [20, 0, 30, 10, 80, 0, 90, 10] - assert rects_by_layer[(2, 0)]['rects'] == [40, 0, 50, 10] - - props_by_layer = {tuple(layer_table[entry['layer']]): entry for entry in boundary_props} - assert sorted(props_by_layer) == [(1, 0), (2, 0)] - assert 
props_by_layer[(1, 0)]['properties'][0]['value'] == 'leaf-poly' - assert props_by_layer[(2, 0)]['properties'][0]['value'] == 'leaf-poly-2' - - -def test_gdsii_arrow_rect_batch_schema_for_mixed_layer(tmp_path: Path) -> None: - lib = Library() - top = Pattern() - top.shapes[(1, 0)].append(RectCollection(rects=[[0, 0, 10, 10], [20, 0, 30, 10], [40, 0, 50, 10], [60, 0, 70, 10]])) - top.polygon((1, 0), vertices=[[80, 0], [85, 10], [90, 0]]) - top.polygon((1, 0), vertices=[[100, 0], [105, 10], [110, 0]]) - lib['top'] = top - - gds_file = tmp_path / 'arrow_rect_batches.gds' - gdsii.writefile(lib, gds_file, meters_per_unit=1e-9) - - libarr = gdsii_arrow._read_to_arrow(gds_file)[0] - cells = libarr['cells'].values - cell_ids = cells.field('id').to_numpy() - cell_names = libarr['cell_names'].as_py() - layer_table = [ - ((int(layer) >> 16) & 0xFFFF, int(layer) & 0xFFFF) - for layer in libarr['layers'].values.to_numpy() - ] - top_index = next(ii for ii, cell_id in enumerate(cell_ids) if cell_names[cell_id] == 'top') - - rect_batches = cells.field('rect_batches')[top_index].as_py() - boundary_batches = cells.field('boundary_batches')[top_index].as_py() - - assert len(rect_batches) == 1 - assert tuple(layer_table[rect_batches[0]['layer']]) == (1, 0) - assert rect_batches[0]['rects'] == [ - 0, 0, 10, 10, - 20, 0, 30, 10, - 40, 0, 50, 10, - 60, 0, 70, 10, - ] - - assert len(boundary_batches) == 1 - assert tuple(layer_table[boundary_batches[0]['layer']]) == (1, 0) - assert boundary_batches[0]['vertex_offsets'] == [0, 3] - - -def test_gdsii_arrow_ref_schema(tmp_path: Path) -> None: - lib = _make_arrow_test_library() - gds_file = tmp_path / 'arrow_ref_batches.gds' - gdsii.writefile(lib, gds_file, meters_per_unit=1e-9) - - libarr = gdsii_arrow._read_to_arrow(gds_file)[0] - cells = libarr['cells'].values - cell_ids = cells.field('id').to_numpy() - cell_names = libarr['cell_names'].as_py() - - fanout_index = next(ii for ii, cell_id in enumerate(cell_ids) if cell_names[cell_id] == 
'fanout') - - srefs = cells.field('srefs')[fanout_index].as_py() - arefs = cells.field('arefs')[fanout_index].as_py() - sref_props = cells.field('sref_props')[fanout_index].as_py() - aref_props = cells.field('aref_props')[fanout_index].as_py() - - sref_target_ids = [entry['target'] for entry in srefs] - sref_targets = [cell_names[target] for target in sref_target_ids] - assert sorted(sref_targets) == ['child', 'leaf', 'leaf'] - assert sref_target_ids == sorted(sref_target_ids) - sref_by_target = {} - for entry in srefs: - sref_by_target.setdefault(cell_names[entry['target']], []).append(entry) - assert [entry['invert_y'] for entry in sref_by_target['child']] == [True] - assert [entry['scale'] for entry in sref_by_target['child']] == pytest.approx([1.1]) - assert len(sref_by_target['leaf']) == 2 - - aref_target_ids = [entry['target'] for entry in arefs] - aref_targets = [cell_names[target] for target in aref_target_ids] - assert sorted(aref_targets) == ['child', 'leaf', 'leaf'] - assert aref_target_ids == sorted(aref_target_ids) - aref_by_target = {} - for entry in arefs: - aref_by_target.setdefault(cell_names[entry['target']], []).append(entry) - assert [entry['invert_y'] for entry in aref_by_target['child']] == [True] - assert [entry['scale'] for entry in aref_by_target['child']] == pytest.approx([1.2]) - assert len(aref_by_target['leaf']) == 2 - - assert len(sref_props) == 1 - assert cell_names[sref_props[0]['target']] == 'leaf' - assert sref_props[0]['properties'][0]['value'] == 'fanout-sref' - - assert len(aref_props) == 1 - assert cell_names[aref_props[0]['target']] == 'child' - assert aref_props[0]['properties'][0]['value'] == 'fanout-aref' - - -def test_raw_ref_grid_label_constructors_match_public() -> None: - raw_grid = Grid._from_raw( - a_vector=numpy.array([20, 0]), - a_count=3, - b_vector=numpy.array([0, 30]), - b_count=2, - ) - public_grid = Grid(a_vector=(20, 0), a_count=3, b_vector=(0, 30), b_count=2) - assert raw_grid == public_grid - - raw_poly = 
Polygon._from_raw( - vertices=numpy.array([[0.0, 0.0], [5.0, 0.0], [5.0, 5.0], [0.0, 5.0]]), - annotations={'1': ['poly']}, - ) - public_poly = Polygon( - vertices=[[0, 0], [5, 0], [5, 5], [0, 5]], - annotations={'1': ['poly']}, - ) - assert raw_poly == public_poly - - raw_poly_collection = PolyCollection._from_raw( - vertex_lists=numpy.array([ - [0.0, 0.0], [2.0, 0.0], [2.0, 2.0], - [10.0, 10.0], [12.0, 10.0], [12.0, 12.0], - ]), - vertex_offsets=numpy.array([0, 3], dtype=numpy.uint32), - annotations={'2': ['pc']}, - ) - public_poly_collection = PolyCollection( - vertex_lists=[[0, 0], [2, 0], [2, 2], [10, 10], [12, 10], [12, 12]], - vertex_offsets=[0, 3], - annotations={'2': ['pc']}, - ) - assert raw_poly_collection == public_poly_collection - assert [tuple(s.indices(len(raw_poly_collection.vertex_lists))) for s in raw_poly_collection.vertex_slices] == [(0, 3, 1), (3, 6, 1)] - - raw_rect_collection = RectCollection._from_raw( - rects=numpy.array([[10.0, 10.0, 12.0, 12.0], [0.0, 0.0, 5.0, 5.0]]), - annotations={'3': ['rects']}, - ) - public_rect_collection = RectCollection( - rects=[[0, 0, 5, 5], [10, 10, 12, 12]], - annotations={'3': ['rects']}, - ) - assert raw_rect_collection == public_rect_collection - - raw_ref_empty = Ref._from_raw( - offset=numpy.array([100, 200]), - rotation=numpy.pi / 2, - mirrored=False, - scale=1.0, - repetition=None, - annotations=None, - ) - public_ref_empty = Ref( - offset=(100, 200), - rotation=numpy.pi / 2, - mirrored=False, - scale=1.0, - repetition=None, - annotations=None, - ) - assert raw_ref_empty.annotations is None - assert raw_ref_empty == public_ref_empty - - raw_ref = Ref._from_raw( - offset=numpy.array([100, 200]), - rotation=numpy.pi / 2, - mirrored=True, - scale=1.25, - repetition=raw_grid, - annotations={'12': ['child-ref']}, - ) - public_ref = Ref( - offset=(100, 200), - rotation=numpy.pi / 2, - mirrored=True, - scale=1.25, - repetition=public_grid, - annotations={'12': ['child-ref']}, - ) - assert raw_ref == 
public_ref - assert numpy.array_equal(raw_ref.as_transforms(), public_ref.as_transforms()) - - raw_label_empty = Label._from_raw( - 'LEAF', - offset=numpy.array([3, 4]), - annotations=None, - ) - public_label_empty = Label( - 'LEAF', - offset=(3, 4), - annotations=None, - ) - assert raw_label_empty.annotations is None - assert raw_label_empty == public_label_empty - - raw_label = Label._from_raw( - 'LEAF', - offset=numpy.array([3, 4]), - annotations={'10': ['leaf-label']}, - ) - public_label = Label( - 'LEAF', - offset=(3, 4), - annotations={'10': ['leaf-label']}, - ) - assert raw_label == public_label - assert numpy.array_equal(raw_label.get_bounds_single(), public_label.get_bounds_single()) diff --git a/masque/test/test_gdsii_lazy_arrow.py b/masque/test/test_gdsii_lazy_arrow.py deleted file mode 100644 index 61a99af..0000000 --- a/masque/test/test_gdsii_lazy_arrow.py +++ /dev/null @@ -1,174 +0,0 @@ -from pathlib import Path - -import numpy -import pytest - -pytest.importorskip('pyarrow') - -from ..library import Library -from ..pattern import Pattern -from ..repetition import Grid -from ..file import gdsii, gdsii_lazy_arrow -from ..file.gdsii_perf import write_fixture - - -if not gdsii_lazy_arrow.is_available(): - pytest.skip('klamath_rs_ext shared library is not available', allow_module_level=True) - - -def _make_small_library() -> Library: - lib = Library() - - leaf = Pattern() - leaf.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 5], [0, 5]]) - lib['leaf'] = leaf - - mid = Pattern() - mid.ref('leaf', offset=(10, 20)) - mid.ref('leaf', offset=(40, 0), repetition=Grid(a_vector=(12, 0), a_count=2, b_vector=(0, 9), b_count=2)) - lib['mid'] = mid - - top = Pattern() - top.ref('mid', offset=(100, 200)) - lib['top'] = top - return lib - - -def test_gdsii_lazy_arrow_loads_perf_fixture(tmp_path: Path) -> None: - gds_file = tmp_path / 'many_cells_lazy.gds' - manifest = write_fixture(gds_file, preset='many_cells', scale=0.001) - - lib, info = 
gdsii_lazy_arrow.readfile(gds_file) - - assert info['name'] == manifest.library_name - assert len(lib) == manifest.cells - assert lib.top() == 'TOP' - assert 'TOP' in lib.child_graph(dangling='ignore') - - -def test_gdsii_lazy_arrow_local_and_global_refs(tmp_path: Path) -> None: - gds_file = tmp_path / 'refs.gds' - src = _make_small_library() - gdsii.writefile(src, gds_file, meters_per_unit=1e-9, library_name='lazy-refs') - - lib, _ = gdsii_lazy_arrow.readfile(gds_file) - - local = lib.find_refs_local('leaf') - assert set(local) == {'mid'} - assert sum(arr.shape[0] for arr in local['mid']) == 5 - - global_refs = lib.find_refs_global('leaf') - assert {path for path in global_refs} == {('top', 'mid', 'leaf')} - assert global_refs[('top', 'mid', 'leaf')].shape[0] == 5 - - -def test_gdsii_lazy_arrow_untouched_write_is_copy_through(tmp_path: Path) -> None: - gds_file = tmp_path / 'copy_source.gds' - src = _make_small_library() - gdsii.writefile(src, gds_file, meters_per_unit=1e-9, library_name='copy-through') - - lib, info = gdsii_lazy_arrow.readfile(gds_file) - out_file = tmp_path / 'copy_out.gds' - gdsii_lazy_arrow.writefile( - lib, - out_file, - meters_per_unit=info['meters_per_unit'], - logical_units_per_unit=info['logical_units_per_unit'], - library_name=info['name'], - ) - - assert out_file.read_bytes() == gds_file.read_bytes() - - -def test_gdsii_lazy_overlay_merge_and_write(tmp_path: Path) -> None: - base_a = Library() - leaf_a = Pattern() - leaf_a.polygon((1, 0), vertices=[[0, 0], [8, 0], [8, 8], [0, 8]]) - base_a['leaf'] = leaf_a - top_a = Pattern() - top_a.ref('leaf', offset=(0, 0)) - base_a['top_a'] = top_a - - base_b = Library() - leaf_b = Pattern() - leaf_b.polygon((2, 0), vertices=[[0, 0], [5, 0], [5, 5], [0, 5]]) - base_b['leaf'] = leaf_b - top_b = Pattern() - top_b.ref('leaf', offset=(20, 30)) - base_b['top_b'] = top_b - - gds_a = tmp_path / 'a.gds' - gds_b = tmp_path / 'b.gds' - gdsii.writefile(base_a, gds_a, meters_per_unit=1e-9, 
library_name='overlay') - gdsii.writefile(base_b, gds_b, meters_per_unit=1e-9, library_name='overlay') - - lib_a, _ = gdsii_lazy_arrow.readfile(gds_a) - lib_b, _ = gdsii_lazy_arrow.readfile(gds_b) - - overlay = gdsii_lazy_arrow.OverlayLibrary() - overlay.add_source(lib_a) - rename_map = overlay.add_source(lib_b, rename_theirs=lambda lib, name: lib.get_name(name)) - renamed_leaf = rename_map['leaf'] - - assert rename_map == {'leaf': renamed_leaf} - assert renamed_leaf != 'leaf' - assert len(lib_a._cache) == 0 - assert len(lib_b._cache) == 0 - - overlay.move_references('leaf', renamed_leaf) - - out_file = tmp_path / 'overlay_out.gds' - gdsii_lazy_arrow.writefile(overlay, out_file) - - roundtrip, _ = gdsii.readfile(out_file) - assert set(roundtrip.keys()) == {'leaf', renamed_leaf, 'top_a', 'top_b'} - assert 'top_b' in roundtrip - assert list(roundtrip['top_b'].refs.keys()) == [renamed_leaf] - - -def test_gdsii_writer_accepts_overlay_library(tmp_path: Path) -> None: - gds_file = tmp_path / 'overlay_source.gds' - src = _make_small_library() - gdsii.writefile(src, gds_file, meters_per_unit=1e-9, library_name='overlay-src') - - lib, info = gdsii_lazy_arrow.readfile(gds_file) - - overlay = gdsii_lazy_arrow.OverlayLibrary() - overlay.add_source(lib) - overlay.rename('leaf', 'leaf_copy', move_references=True) - - out_file = tmp_path / 'overlay_via_eager_writer.gds' - gdsii.writefile( - overlay, - out_file, - meters_per_unit=info['meters_per_unit'], - logical_units_per_unit=info['logical_units_per_unit'], - library_name=info['name'], - ) - - roundtrip, _ = gdsii.readfile(out_file) - assert set(roundtrip.keys()) == {'leaf_copy', 'mid', 'top'} - assert list(roundtrip['mid'].refs.keys()) == ['leaf_copy'] - - -def test_svg_writer_uses_detached_materialized_copy(tmp_path: Path) -> None: - pytest.importorskip('svgwrite') - from ..file import svg - from ..shapes import Path as MPath - - gds_file = tmp_path / 'svg_source.gds' - src = _make_small_library() - src['top'].path((3, 0), 
vertices=[[0, 0], [0, 20]], width=4) - gdsii.writefile(src, gds_file, meters_per_unit=1e-9, library_name='svg-src') - - lib, _ = gdsii_lazy_arrow.readfile(gds_file) - top_pat = lib['top'] - assert list(top_pat.refs.keys()) == ['mid'] - assert any(isinstance(shape, MPath) for shape in top_pat.shapes[(3, 0)]) - - svg_path = tmp_path / 'lazy.svg' - svg.writefile(lib, 'top', str(svg_path)) - - assert svg_path.exists() - assert list(top_pat.refs.keys()) == ['mid'] - assert any(isinstance(shape, MPath) for shape in top_pat.shapes[(3, 0)]) diff --git a/masque/test/test_gdsii_perf.py b/masque/test/test_gdsii_perf.py deleted file mode 100644 index a595fe8..0000000 --- a/masque/test/test_gdsii_perf.py +++ /dev/null @@ -1,24 +0,0 @@ -from dataclasses import asdict -import json -from pathlib import Path - -from ..file import gdsii -from ..file.gdsii_perf import fixture_manifest, write_fixture - - -def test_gdsii_perf_fixture_smoke(tmp_path: Path) -> None: - output = tmp_path / 'many_cells.gds' - manifest = write_fixture(output, preset='many_cells', scale=0.002) - expected = fixture_manifest(output, preset='many_cells', scale=0.002) - - assert output.exists() - assert manifest == expected - - sidecar = json.loads(output.with_suffix('.gds.json').read_text()) - assert sidecar == asdict(manifest) - - read_lib, info = gdsii.readfile(output) - assert info['name'] == manifest.library_name - assert len(read_lib) == manifest.cells - assert 'TOP' in read_lib - assert len(read_lib['TOP'].refs) > 0 diff --git a/masque/test/test_label.py b/masque/test/test_label.py deleted file mode 100644 index f4f364b..0000000 --- a/masque/test/test_label.py +++ /dev/null @@ -1,54 +0,0 @@ -import copy -from numpy.testing import assert_equal, assert_allclose -from numpy import pi - -from ..label import Label -from ..repetition import Grid -from ..utils import annotations_eq - - -def test_label_init() -> None: - lbl = Label("test", offset=(10, 20)) - assert lbl.string == "test" - assert_equal(lbl.offset, 
[10, 20]) - - -def test_label_transform() -> None: - lbl = Label("test", offset=(10, 0)) - # Rotate 90 deg CCW around (0,0) - lbl.rotate_around((0, 0), pi / 2) - assert_allclose(lbl.offset, [0, 10], atol=1e-10) - - # Translate - lbl.translate((5, 5)) - assert_allclose(lbl.offset, [5, 15], atol=1e-10) - - -def test_label_repetition() -> None: - rep = Grid(a_vector=(10, 0), a_count=3) - lbl = Label("rep", offset=(0, 0), repetition=rep) - assert lbl.repetition is rep - assert_equal(lbl.get_bounds_single(), [[0, 0], [0, 0]]) - # Note: get_bounds_single() does not account for the repetition; - # repetition displacements are applied when bounds are computed - # at the Pattern level. - - -def test_label_copy() -> None: - l1 = Label("test", offset=(1, 2), annotations={"a": [1]}) - l2 = copy.deepcopy(l1) - - assert annotations_eq(l1.annotations, l2.annotations) - assert l1.string == l2.string - assert_equal(l1.offset, l2.offset) - - assert l1 == l2 - assert l1 is not l2 - l2.offset[0] = 100 - assert l1.offset[0] == 1 - - -def test_label_eq_unrelated_objects_is_false() -> None: - lbl = Label("test") - assert not (lbl == None) # noqa: E711 -- exercise __eq__ with None explicitly - assert not (lbl == object()) diff --git a/masque/test/test_library.py b/masque/test/test_library.py deleted file mode 100644 index ce564aa..0000000 --- a/masque/test/test_library.py +++ /dev/null @@ -1,483 +0,0 @@ -import pytest -from typing import cast, TYPE_CHECKING -from numpy.testing import assert_allclose -from ..library import Library, LazyLibrary -from ..pattern import Pattern -from ..error import LibraryError, PatternError -from ..ports import Port -from ..repetition import Grid -from ..shapes import Arc, Ellipse, Path, Text -from ..file.utils import preflight - -if TYPE_CHECKING: - from ..shapes import Polygon - - -def 
test_library_basic() -> None: - lib = Library() - pat = Pattern() - lib["cell1"] = pat - - assert "cell1" in lib - assert lib["cell1"] is pat - assert len(lib) == 1 - - with pytest.raises(LibraryError): - lib["cell1"] = Pattern() # Overwriting not allowed - - -def test_library_tops() -> None: - lib = Library() - lib["child"] = Pattern() - lib["parent"] = Pattern() - lib["parent"].ref("child") - - assert set(lib.tops()) == {"parent"} - assert lib.top() == "parent" - - -def test_library_dangling() -> None: - lib = Library() - lib["parent"] = Pattern() - lib["parent"].ref("missing") - - assert lib.dangling_refs() == {"missing"} - - -def test_library_dangling_graph_modes() -> None: - lib = Library() - lib["parent"] = Pattern() - lib["parent"].ref("missing") - - with pytest.raises(LibraryError, match="Dangling refs found"): - lib.child_graph() - with pytest.raises(LibraryError, match="Dangling refs found"): - lib.parent_graph() - with pytest.raises(LibraryError, match="Dangling refs found"): - lib.child_order() - - assert lib.child_graph(dangling="ignore") == {"parent": set()} - assert lib.parent_graph(dangling="ignore") == {"parent": set()} - assert lib.child_order(dangling="ignore") == ["parent"] - - assert lib.child_graph(dangling="include") == {"parent": {"missing"}, "missing": set()} - assert lib.parent_graph(dangling="include") == {"parent": set(), "missing": {"parent"}} - assert lib.child_order(dangling="include") == ["missing", "parent"] - - -def test_find_refs_with_dangling_modes() -> None: - lib = Library() - lib["target"] = Pattern() - - mid = Pattern() - mid.ref("target", offset=(2, 0)) - lib["mid"] = mid - - top = Pattern() - top.ref("mid", offset=(5, 0)) - top.ref("missing", offset=(9, 0)) - lib["top"] = top - - assert lib.find_refs_local("missing", dangling="ignore") == {} - assert lib.find_refs_global("missing", dangling="ignore") == {} - - local_missing = lib.find_refs_local("missing", dangling="include") - assert set(local_missing) == {"top"} - 
assert_allclose(local_missing["top"][0], [[9, 0, 0, 0, 1]]) - - global_missing = lib.find_refs_global("missing", dangling="include") - assert_allclose(global_missing[("top", "missing")], [[9, 0, 0, 0, 1]]) - - with pytest.raises(LibraryError, match="missing"): - lib.find_refs_local("missing") - with pytest.raises(LibraryError, match="missing"): - lib.find_refs_global("missing") - - global_target = lib.find_refs_global("target") - assert_allclose(global_target[("top", "mid", "target")], [[7, 0, 0, 0, 1]]) - - -def test_preflight_prune_empty_preserves_dangling_policy(caplog: pytest.LogCaptureFixture) -> None: - def make_lib() -> Library: - lib = Library() - lib["empty"] = Pattern() - lib["top"] = Pattern() - lib["top"].ref("missing") - return lib - - caplog.set_level("WARNING") - warned = preflight(make_lib(), allow_dangling_refs=None, prune_empty_patterns=True) - assert "empty" not in warned - assert any("Dangling refs found" in record.message for record in caplog.records) - - allowed = preflight(make_lib(), allow_dangling_refs=True, prune_empty_patterns=True) - assert "empty" not in allowed - - with pytest.raises(LibraryError, match="Dangling refs found"): - preflight(make_lib(), allow_dangling_refs=False, prune_empty_patterns=True) - - -def test_library_flatten() -> None: - lib = Library() - child = Pattern() - child.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]]) - lib["child"] = child - - parent = Pattern() - parent.ref("child", offset=(10, 10)) - lib["parent"] = parent - - flat_lib = lib.flatten("parent") - flat_parent = flat_lib["parent"] - - assert not flat_parent.has_refs() - assert len(flat_parent.shapes[(1, 0)]) == 1 - # Transformations are baked into vertices for Polygon - assert_vertices = cast("Polygon", flat_parent.shapes[(1, 0)][0]).vertices - assert tuple(assert_vertices[0]) == (10.0, 10.0) - - -def test_library_flatten_preserves_ports_only_child() -> None: - lib = Library() - child = Pattern(ports={"P1": Port((1, 2), 0)}) - lib["child"] = child - 
- parent = Pattern() - parent.ref("child", offset=(10, 10)) - lib["parent"] = parent - - flat_parent = lib.flatten("parent", flatten_ports=True)["parent"] - - assert set(flat_parent.ports) == {"P1"} - assert cast("Port", flat_parent.ports["P1"]).rotation == 0 - assert tuple(flat_parent.ports["P1"].offset) == (11.0, 12.0) - - -def test_library_flatten_repeated_ref_with_ports_raises() -> None: - lib = Library() - child = Pattern(ports={"P1": Port((1, 2), 0)}) - child.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]]) - lib["child"] = child - - parent = Pattern() - parent.ref("child", repetition=Grid(a_vector=(10, 0), a_count=2)) - lib["parent"] = parent - - with pytest.raises(PatternError, match='Cannot flatten ports from repeated ref'): - lib.flatten("parent", flatten_ports=True) - - -def test_library_flatten_dangling_ok_nested_preserves_dangling_refs() -> None: - lib = Library() - child = Pattern() - child.ref("missing") - lib["child"] = child - - parent = Pattern() - parent.ref("child") - lib["parent"] = parent - - flat = lib.flatten("parent", dangling_ok=True) - - assert set(flat["child"].refs) == {"missing"} - assert flat["child"].has_refs() - assert set(flat["parent"].refs) == {"missing"} - assert flat["parent"].has_refs() - - -def test_lazy_library() -> None: - lib = LazyLibrary() - called = 0 - - def make_pat() -> Pattern: - nonlocal called - called += 1 - return Pattern() - - lib["lazy"] = make_pat - assert called == 0 - - pat = lib["lazy"] - assert called == 1 - assert isinstance(pat, Pattern) - - # Second access should be cached - pat2 = lib["lazy"] - assert called == 1 - assert pat is pat2 - - -def test_library_rename() -> None: - lib = Library() - lib["old"] = Pattern() - lib["parent"] = Pattern() - lib["parent"].ref("old") - - lib.rename("old", "new", move_references=True) - - assert "old" not in lib - assert "new" in lib - assert "new" in lib["parent"].refs - assert "old" not in lib["parent"].refs - - -@pytest.mark.parametrize("library_cls", (Library, 
LazyLibrary)) -def test_library_rename_self_is_noop(library_cls: type[Library] | type[LazyLibrary]) -> None: - lib = library_cls() - lib["top"] = Pattern() - lib["parent"] = Pattern() - lib["parent"].ref("top") - - lib.rename("top", "top", move_references=True) - - assert set(lib.keys()) == {"top", "parent"} - assert "top" in lib["parent"].refs - assert len(lib["parent"].refs["top"]) == 1 - - -@pytest.mark.parametrize("library_cls", (Library, LazyLibrary)) -def test_library_rename_top_self_is_noop(library_cls: type[Library] | type[LazyLibrary]) -> None: - lib = library_cls() - lib["top"] = Pattern() - - lib.rename_top("top") - - assert list(lib.keys()) == ["top"] - - -@pytest.mark.parametrize("library_cls", (Library, LazyLibrary)) -def test_library_rename_missing_raises_library_error(library_cls: type[Library] | type[LazyLibrary]) -> None: - lib = library_cls() - lib["top"] = Pattern() - - with pytest.raises(LibraryError, match="does not exist"): - lib.rename("missing", "new") - - -@pytest.mark.parametrize("library_cls", (Library, LazyLibrary)) -def test_library_move_references_same_target_is_noop(library_cls: type[Library] | type[LazyLibrary]) -> None: - lib = library_cls() - lib["top"] = Pattern() - lib["parent"] = Pattern() - lib["parent"].ref("top") - - lib.move_references("top", "top") - - assert "top" in lib["parent"].refs - assert len(lib["parent"].refs["top"]) == 1 - - -def test_library_dfs_can_replace_existing_patterns() -> None: - lib = Library() - child = Pattern() - lib["child"] = child - top = Pattern() - top.ref("child") - lib["top"] = top - - replacement_top = Pattern(ports={"T": Port((1, 2), 0)}) - replacement_child = Pattern(ports={"C": Port((3, 4), 0)}) - - def visit_after(pattern: Pattern, hierarchy: tuple[str | None, ...], **kwargs) -> Pattern: # noqa: ARG001 - if hierarchy[-1] == "child": - return replacement_child - if hierarchy[-1] == "top": - return replacement_top - return pattern - - lib.dfs(lib["top"], visit_after=visit_after, 
hierarchy=("top",), transform=True) - - assert lib["top"] is replacement_top - assert lib["child"] is replacement_child - - -def test_lazy_library_dfs_can_replace_existing_patterns() -> None: - lib = LazyLibrary() - lib["child"] = lambda: Pattern() - lib["top"] = lambda: Pattern(refs={"child": []}) - - top = lib["top"] - top.ref("child") - - replacement_top = Pattern(ports={"T": Port((1, 2), 0)}) - replacement_child = Pattern(ports={"C": Port((3, 4), 0)}) - - def visit_after(pattern: Pattern, hierarchy: tuple[str | None, ...], **kwargs) -> Pattern: # noqa: ARG001 - if hierarchy[-1] == "child": - return replacement_child - if hierarchy[-1] == "top": - return replacement_top - return pattern - - lib.dfs(top, visit_after=visit_after, hierarchy=("top",), transform=True) - - assert lib["top"] is replacement_top - assert lib["child"] is replacement_child - - -def test_library_add_no_duplicates_respects_mutate_other_false() -> None: - src_pat = Pattern(ports={"A": Port((0, 0), 0)}) - lib = Library({"a": Pattern()}) - - lib.add({"b": src_pat}, mutate_other=False) - - assert lib["b"] is not src_pat - lib["b"].ports["A"].offset[0] = 123 - assert tuple(src_pat.ports["A"].offset) == (0.0, 0.0) - - -def test_library_add_returns_only_renamed_entries() -> None: - lib = Library({"a": Pattern(), "_shape": Pattern()}) - - assert lib.add({"b": Pattern(), "c": Pattern()}, mutate_other=False) == {} - - rename_map = lib.add({"_shape": Pattern(), "keep": Pattern()}, mutate_other=False) - - assert set(rename_map) == {"_shape"} - assert rename_map["_shape"] != "_shape" - assert "keep" not in rename_map - - -def test_library_subtree() -> None: - lib = Library() - lib["a"] = Pattern() - lib["b"] = Pattern() - lib["c"] = Pattern() - lib["a"].ref("b") - - sub = lib.subtree("a") - assert "a" in sub - assert "b" in sub - assert "c" not in sub - - -def test_library_child_order_cycle_raises_library_error() -> None: - lib = Library() - lib["a"] = Pattern() - lib["a"].ref("b") - lib["b"] = Pattern() 
- lib["b"].ref("a") - - with pytest.raises(LibraryError, match="Cycle found while building child order"): - lib.child_order() - - -def test_library_find_refs_global_cycle_raises_library_error() -> None: - lib = Library() - lib["a"] = Pattern() - lib["a"].ref("a") - - with pytest.raises(LibraryError, match="Cycle found while building child order"): - lib.find_refs_global("a") - - -def test_library_get_name() -> None: - lib = Library() - lib["cell"] = Pattern() - - name1 = lib.get_name("cell") - assert name1 != "cell" - assert name1.startswith("cell") - - name2 = lib.get_name("other") - assert name2 == "other" - - -def test_library_dedup_shapes_does_not_merge_custom_capped_paths() -> None: - lib = Library() - pat = Pattern() - pat.shapes[(1, 0)] += [ - Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(1, 2)), - Path(vertices=[[20, 0], [30, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(3, 4)), - ] - lib["top"] = pat - - lib.dedup(norm_value=1, threshold=2) - - assert not lib["top"].refs - assert len(lib["top"].shapes[(1, 0)]) == 2 - - -def test_library_dedup_text_preserves_scale_and_mirror_flag() -> None: - lib = Library() - pat = Pattern() - pat.shapes[(1, 0)] += [ - Text("A", 10, "dummy.ttf", offset=(0, 0)), - Text("A", 10, "dummy.ttf", offset=(100, 0)), - ] - lib["top"] = pat - - lib.dedup(exclude_types=(), norm_value=5, threshold=2) - - target_name = next(iter(lib["top"].refs)) - refs = lib["top"].refs[target_name] - assert [ref.mirrored for ref in refs] == [False, False] - assert [ref.scale for ref in refs] == [2.0, 2.0] - assert cast("Text", lib[target_name].shapes[(1, 0)][0]).height == 5 - - flat = lib.flatten("top")["top"] - assert [cast("Text", shape).height for shape in flat.shapes[(1, 0)]] == [10, 10] - - -def test_library_dedup_handles_arc_and_ellipse_labels() -> None: - lib = Library() - pat = Pattern() - pat.shapes[(1, 0)] += [ - Arc(radii=(10, 20), angles=(0, 1), width=2, offset=(0, 0)), - Arc(radii=(10, 
20), angles=(0, 1), width=2, offset=(50, 0)), - ] - pat.shapes[(2, 0)] += [ - Ellipse(radii=(10, 20), offset=(0, 0)), - Ellipse(radii=(10, 20), offset=(50, 0)), - ] - lib["top"] = pat - - lib.dedup(exclude_types=(), norm_value=1, threshold=2) - - assert len(lib["top"].refs) == 2 - assert lib["top"].shapes[(1, 0)] == [] - assert lib["top"].shapes[(2, 0)] == [] - - flat = lib.flatten("top")["top"] - assert sum(isinstance(shape, Arc) for shape in flat.shapes[(1, 0)]) == 2 - assert sum(isinstance(shape, Ellipse) for shape in flat.shapes[(2, 0)]) == 2 - - -def test_library_dedup_handles_multiple_duplicate_groups() -> None: - from ..shapes import Circle - - lib = Library() - pat = Pattern() - pat.shapes[(1, 0)] += [Circle(radius=1, offset=(0, 0)), Circle(radius=1, offset=(10, 0))] - pat.shapes[(2, 0)] += [Path(vertices=[[0, 0], [5, 0]], width=2), Path(vertices=[[10, 0], [15, 0]], width=2)] - lib["top"] = pat - - lib.dedup(exclude_types=(), norm_value=1, threshold=2) - - assert len(lib["top"].refs) == 2 - assert all(len(refs) == 2 for refs in lib["top"].refs.values()) - assert len(lib["top"].shapes[(1, 0)]) == 0 - assert len(lib["top"].shapes[(2, 0)]) == 0 - - -def test_library_dedup_uses_stable_target_names_per_label() -> None: - from ..shapes import Circle - - lib = Library() - - p1 = Pattern() - p1.shapes[(1, 0)] += [Circle(radius=1, offset=(0, 0)), Circle(radius=1, offset=(10, 0))] - lib["p1"] = p1 - - p2 = Pattern() - p2.shapes[(2, 0)] += [Path(vertices=[[0, 0], [5, 0]], width=2), Path(vertices=[[10, 0], [15, 0]], width=2)] - lib["p2"] = p2 - - lib.dedup(exclude_types=(), norm_value=1, threshold=2) - - circle_target = next(iter(lib["p1"].refs)) - path_target = next(iter(lib["p2"].refs)) - - assert circle_target != path_target - assert all(isinstance(shape, Circle) for shapes in lib[circle_target].shapes.values() for shape in shapes) - assert all(isinstance(shape, Path) for shapes in lib[path_target].shapes.values() for shape in shapes) diff --git 
a/masque/test/test_oasis.py b/masque/test/test_oasis.py deleted file mode 100644 index f549db7..0000000 --- a/masque/test/test_oasis.py +++ /dev/null @@ -1,60 +0,0 @@ -import io -from pathlib import Path -import pytest -from numpy.testing import assert_equal - -from ..error import PatternError -from ..pattern import Pattern -from ..library import Library -from ..shapes import Path as MPath - - -def test_oasis_roundtrip(tmp_path: Path) -> None: - # Skip if fatamorgana is not installed - pytest.importorskip("fatamorgana") - from ..file import oasis - - lib = Library() - pat1 = Pattern() - pat1.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 10], [0, 10]]) - lib["cell1"] = pat1 - - oas_file = tmp_path / "test.oas" - # OASIS needs units_per_micron - oasis.writefile(lib, oas_file, units_per_micron=1000) - - read_lib, info = oasis.readfile(oas_file) - assert "cell1" in read_lib - - # Check bounds - assert_equal(read_lib["cell1"].get_bounds(), [[0, 0], [10, 10]]) - - -def test_oasis_properties_to_annotations_merges_repeated_keys() -> None: - pytest.importorskip("fatamorgana") - import fatamorgana.records as fatrec - from ..file.oasis import properties_to_annotations - - annotations = properties_to_annotations( - [ - fatrec.Property("k", [1], is_standard=False), - fatrec.Property("k", [2, 3], is_standard=False), - ], - {}, - {}, - ) - - assert annotations == {"k": [1, 2, 3]} - - -def test_oasis_write_rejects_circle_path_caps() -> None: - pytest.importorskip("fatamorgana") - from ..file import oasis - - lib = Library() - pat = Pattern() - pat.path((1, 0), vertices=[[0, 0], [10, 0]], width=2, cap=MPath.Cap.Circle) - lib["cell1"] = pat - - with pytest.raises(PatternError, match="does not support path cap"): - oasis.write(lib, io.BytesIO(), units_per_micron=1000) diff --git a/masque/test/test_pack2d.py b/masque/test/test_pack2d.py deleted file mode 100644 index 914c23e..0000000 --- a/masque/test/test_pack2d.py +++ /dev/null @@ -1,96 +0,0 @@ -from ..utils.pack2d import 
maxrects_bssf, guillotine_bssf_sas, pack_patterns -from ..library import Library -from ..pattern import Pattern - - -def test_maxrects_bssf_simple() -> None: - # Pack two 10x10 squares into one 20x10 container - rects = [[10, 10], [10, 10]] - containers = [[0, 0, 20, 10]] - - locs, rejects = maxrects_bssf(rects, containers) - - assert not rejects - # They should be at (0,0) and (10,0) - assert {tuple(loc) for loc in locs} == {(0.0, 0.0), (10.0, 0.0)} - - -def test_maxrects_bssf_reject() -> None: - # Try to pack a too-large rectangle - rects = [[10, 10], [30, 30]] - containers = [[0, 0, 20, 20]] - - locs, rejects = maxrects_bssf(rects, containers, allow_rejects=True) - assert 1 in rejects # Second rect rejected - assert 0 not in rejects - - -def test_maxrects_bssf_exact_fill_rejects_remaining() -> None: - rects = [[20, 20], [1, 1]] - containers = [[0, 0, 20, 20]] - - locs, rejects = maxrects_bssf(rects, containers, presort=False, allow_rejects=True) - - assert tuple(locs[0]) == (0.0, 0.0) - assert rejects == {1} - - -def test_maxrects_bssf_presort_reject_mapping() -> None: - rects = [[10, 12], [19, 14], [13, 11]] - containers = [[0, 0, 20, 20]] - - _locs, rejects = maxrects_bssf(rects, containers, presort=True, allow_rejects=True) - - assert rejects == {0, 2} - - -def test_guillotine_bssf_sas_presort_reject_mapping() -> None: - rects = [[2, 1], [17, 15], [16, 11]] - containers = [[0, 0, 20, 20]] - - _locs, rejects = guillotine_bssf_sas(rects, containers, presort=True, allow_rejects=True) - - assert rejects == {2} - - -def test_pack_patterns() -> None: - lib = Library() - p1 = Pattern() - p1.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 10], [0, 10]]) - lib["p1"] = p1 - - p2 = Pattern() - p2.polygon((1, 0), vertices=[[0, 0], [5, 0], [5, 5], [0, 5]]) - lib["p2"] = p2 - - # Containers: one 20x20 - containers = [[0, 0, 20, 20]] - # 2um spacing - pat, rejects = pack_patterns(lib, ["p1", "p2"], containers, spacing=(2, 2)) - - assert not rejects - assert len(pat.refs) == 
2 - assert "p1" in pat.refs - assert "p2" in pat.refs - - # No overlap assertion is needed: with 2um spacing, p1 effectively - # occupies 12x12 and p2 occupies 7x7, so both fit in the 20x20 container. - - -def test_pack_patterns_reject_names_match_original_patterns() -> None: - lib = Library() - for name, (lx, ly) in { - "p0": (10, 12), - "p1": (19, 14), - "p2": (13, 11), - }.items(): - pat = Pattern() - pat.rect((1, 0), xmin=0, xmax=lx, ymin=0, ymax=ly) - lib[name] = pat - - pat, rejects = pack_patterns(lib, ["p0", "p1", "p2"], [[0, 0, 20, 20]], spacing=(0, 0)) - - assert set(rejects) == {"p0", "p2"} - assert set(pat.refs) == {"p1"} diff --git a/masque/test/test_path.py b/masque/test/test_path.py deleted file mode 100644 index 1cdd872..0000000 --- a/masque/test/test_path.py +++ /dev/null @@ -1,111 +0,0 @@ -from numpy.testing import assert_equal, assert_allclose - -from ..shapes import Path - - -def test_path_init() -> None: - p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.Flush) - assert_equal(p.vertices, [[0, 0], [10, 0]]) - assert p.width == 2 - assert p.cap == Path.Cap.Flush - - -def test_path_to_polygons_flush() -> None: - p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.Flush) - polys = p.to_polygons() - assert len(polys) == 1 - # Rectangle from (0, -1) to (10, 1) - bounds = polys[0].get_bounds_single() - assert_equal(bounds, [[0, -1], [10, 1]]) - - -def test_path_to_polygons_square() -> None: - p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.Square) - polys = p.to_polygons() - assert len(polys) == 1 - # Square cap adds width/2 = 1 to each end - # Rectangle from (-1, -1) to (11, 1) - bounds = polys[0].get_bounds_single() - assert_equal(bounds, [[-1, -1], [11, 1]]) - - -def test_path_to_polygons_circle() -> None: - p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.Circle) - polys = p.to_polygons(num_vertices=32) - # Path.to_polygons for Circle cap returns 1 polygon for the path + polygons for the caps -
assert len(polys) >= 3 - - # Combined bounds should be from (-1, -1) to (11, 1); - # Path.get_bounds_single() computes this directly, without polygonizing - bounds = p.get_bounds_single() - assert_equal(bounds, [[-1, -1], [11, 1]]) - - -def test_path_custom_cap() -> None: - p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(5, 10)) - polys = p.to_polygons() - assert len(polys) == 1 - # Extends 5 units at start, 10 at end - # Starts at -5, ends at 20 - bounds = polys[0].get_bounds_single() - assert_equal(bounds, [[-5, -1], [20, 1]]) - - -def test_path_bend() -> None: - # L-shaped path - p = Path(vertices=[[0, 0], [10, 0], [10, 10]], width=2) - polys = p.to_polygons() - assert len(polys) == 1 - bounds = polys[0].get_bounds_single() - # Segments: (0,0)-(10,0) and (10,0)-(10,10), width 2 (half-width 1) - # Corners of segment 1: (0,1), (10,1), (10,-1), (0,-1) - # Corners of segment 2: (9,0), (9,10), (11,10), (11,0) - # Flush cap start at (0,0) with width 2 means y from -1 to 1. - # Vertical segment end at (10,10) with width 2 means x from 9 to 11.
- # So bounds should be x: [0, 11], y: [-1, 10] - assert_equal(bounds, [[0, -1], [11, 10]]) - - -def test_path_mirror() -> None: - p = Path(vertices=[[10, 5], [20, 10]], width=2) - p.mirror(0) # Mirror across x axis (y -> -y) - assert_equal(p.vertices, [[10, -5], [20, -10]]) - - -def test_path_scale() -> None: - p = Path(vertices=[[0, 0], [10, 0]], width=2) - p.scale_by(2) - assert_equal(p.vertices, [[0, 0], [20, 0]]) - assert p.width == 4 - - -def test_path_scale_custom_cap_extensions() -> None: - p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(1, 2)) - p.scale_by(3) - - assert_equal(p.vertices, [[0, 0], [30, 0]]) - assert p.width == 6 - assert p.cap_extensions is not None - assert_allclose(p.cap_extensions, [3, 6]) - assert_equal(p.to_polygons()[0].get_bounds_single(), [[-3, -3], [36, 3]]) - - -def test_path_normalized_form_preserves_width_and_custom_cap_extensions() -> None: - p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(1, 2)) - - intrinsic, _extrinsic, ctor = p.normalized_form(5) - q = ctor() - - assert intrinsic[-1] == (0.2, 0.4) - assert q.width == 2 - assert q.cap_extensions is not None - assert_allclose(q.cap_extensions, [1, 2]) - - -def test_path_normalized_form_distinguishes_custom_caps() -> None: - p1 = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(1, 2)) - p2 = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(3, 4)) - - assert p1.normalized_form(1)[0] != p2.normalized_form(1)[0] diff --git a/masque/test/test_pather.py b/masque/test/test_pather.py deleted file mode 100644 index 47cae29..0000000 --- a/masque/test/test_pather.py +++ /dev/null @@ -1,108 +0,0 @@ -import pytest -from numpy.testing import assert_equal, assert_allclose -from numpy import pi - -from ..builder import Pather -from ..builder.tools import PathTool -from ..library import Library -from ..ports import Port - - -@pytest.fixture 
-def pather_setup() -> tuple[Pather, PathTool, Library]: - lib = Library() - # Simple PathTool: 2um width on layer (1,0) - tool = PathTool(layer=(1, 0), width=2, ptype="wire") - p = Pather(lib, tools=tool) - # Add an initial port facing North (pi/2) - # Port rotation points INTO device. So "North" rotation means device is North of port. - # Pathing "forward" moves South. - p.ports["start"] = Port((0, 0), pi / 2, ptype="wire") - return p, tool, lib - - -def test_pather_straight(pather_setup: tuple[Pather, PathTool, Library]) -> None: - p, tool, lib = pather_setup - # Route 10um "forward" - p.straight("start", 10) - - # port rot pi/2 (North). Travel +pi relative to port -> South. - assert_allclose(p.ports["start"].offset, [0, -10], atol=1e-10) - assert p.ports["start"].rotation is not None - assert_allclose(p.ports["start"].rotation, pi / 2, atol=1e-10) - - -def test_pather_bend(pather_setup: tuple[Pather, PathTool, Library]) -> None: - p, tool, lib = pather_setup - # Start (0,0) rot pi/2 (North). - # Path 10um "forward" (South), then turn Clockwise (ccw=False). - # Facing South, turn Right -> West. - p.cw("start", 10) - - # PathTool.planL(ccw=False, length=10) returns out_port at (10, -1) relative to (0,0) rot 0. - # Transformed by port rot pi/2 (North) + pi (to move "forward" away from device): - # Transformation rot = pi/2 + pi = 3pi/2. - # (10, -1) rotated 3pi/2: (x,y) -> (y, -x) -> (-1, -10). - - assert_allclose(p.ports["start"].offset, [-1, -10], atol=1e-10) - # Naively, North (pi/2) + CW 90 deg would give West (pi), but port rotation - # points INTO the device (opposite the travel direction), so the resulting port rotation is 0 (East).
- assert p.ports["start"].rotation is not None - assert_allclose(p.ports["start"].rotation, 0, atol=1e-10) - - -def test_pather_path_to(pather_setup: tuple[Pather, PathTool, Library]) -> None: - p, tool, lib = pather_setup - # start at (0,0) rot pi/2 (North) - # path "forward" (South) to y=-50 - p.straight("start", y=-50) - assert_equal(p.ports["start"].offset, [0, -50]) - - -def test_pather_mpath(pather_setup: tuple[Pather, PathTool, Library]) -> None: - p, tool, lib = pather_setup - p.ports["A"] = Port((0, 0), pi / 2, ptype="wire") - p.ports["B"] = Port((10, 0), pi / 2, ptype="wire") - - # Path both "forward" (South) to y=-20 - p.straight(["A", "B"], ymin=-20) - assert_equal(p.ports["A"].offset, [0, -20]) - assert_equal(p.ports["B"].offset, [10, -20]) - - -def test_pather_at_chaining(pather_setup: tuple[Pather, PathTool, Library]) -> None: - p, tool, lib = pather_setup - # Fluent API test - p.at("start").straight(10).ccw(10) - # 10um South -> (0, -10) rot pi/2 - # then 10um South and turn CCW (Facing South, CCW is East) - # PathTool.planL(ccw=True, length=10) -> out_port=(10, 1) rot -pi/2 relative to rot 0 - # Transform (10, 1) by 3pi/2: (x,y) -> (y, -x) -> (1, -10) - # (0, -10) + (1, -10) = (1, -20) - assert_allclose(p.ports["start"].offset, [1, -20], atol=1e-10) - # Naively, pi/2 (North) + CCW 90 deg would give 0 (East), but port rotation - # points INTO the device (opposite the travel direction), so the resulting port rotation is pi (West). - assert p.ports["start"].rotation is not None - assert_allclose(p.ports["start"].rotation, pi, atol=1e-10) - - -def test_pather_dead_ports() -> None: - lib = Library() - tool = PathTool(layer=(1, 0), width=1) - p = Pather(lib, ports={"in": Port((0, 0), 0)}, tools=tool) - p.set_dead() - - # Path with negative length (impossible for PathTool, would normally raise BuildError) - p.straight("in", -10) - - # Port 'in' should be updated by dummy extension despite tool failure - # port_rot=0, forward is -x. path(-10) means moving -10 in -x direction -> +10 in x.
- assert_allclose(p.ports["in"].offset, [10, 0], atol=1e-10) - - # Downstream path should work correctly using the dummy port location - p.straight("in", 20) - # 10 + (-20) = -10 - assert_allclose(p.ports["in"].offset, [-10, 0], atol=1e-10) - - # Verify no geometry - assert not p.pattern.has_shapes() diff --git a/masque/test/test_pather_api.py b/masque/test/test_pather_api.py deleted file mode 100644 index d01ed44..0000000 --- a/masque/test/test_pather_api.py +++ /dev/null @@ -1,936 +0,0 @@ -from typing import Any - -import pytest -import numpy -from numpy import pi -from masque import Pather, Library, Pattern, Port -from masque.builder.tools import PathTool, Tool -from masque.error import BuildError, PortError, PatternError - -def test_pather_trace_basic() -> None: - lib = Library() - tool = PathTool(layer='M1', width=1000) - p = Pather(lib, tools=tool, auto_render=False) - - # Port rotation 0 points in +x (INTO device). - # To extend it, we move in -x direction. - p.pattern.ports['A'] = Port((0, 0), rotation=0) - - # Trace single port - p.at('A').trace(None, 5000) - assert numpy.allclose(p.pattern.ports['A'].offset, (-5000, 0)) - - # Trace with bend - p.at('A').trace(True, 5000) # CCW bend - # Port was at (-5000, 0) rot 0. - # New wire starts at (-5000, 0) rot 0. 
- # Output port of wire before rotation: (5000, 500) rot -pi/2 - # Rotate by pi (since dev port rot is 0 and tool port rot is 0): - # (-5000, -500) rot pi - pi/2 = pi/2 - # Add to start: (-10000, -500) rot pi/2 - assert numpy.allclose(p.pattern.ports['A'].offset, (-10000, -500)) - assert p.pattern.ports['A'].rotation is not None - assert numpy.isclose(p.pattern.ports['A'].rotation, pi/2) - -def test_pather_trace_to() -> None: - lib = Library() - tool = PathTool(layer='M1', width=1000) - p = Pather(lib, tools=tool, auto_render=False) - - p.pattern.ports['A'] = Port((0, 0), rotation=0) - - # Trace to x=-10000 - p.at('A').trace_to(None, x=-10000) - assert numpy.allclose(p.pattern.ports['A'].offset, (-10000, 0)) - - # Trace to position=-20000 - p.at('A').trace_to(None, p=-20000) - assert numpy.allclose(p.pattern.ports['A'].offset, (-20000, 0)) - -def test_pather_bundle_trace() -> None: - lib = Library() - tool = PathTool(layer='M1', width=1000) - p = Pather(lib, tools=tool, auto_render=False) - - p.pattern.ports['A'] = Port((0, 0), rotation=0) - p.pattern.ports['B'] = Port((0, 2000), rotation=0) - - # Straight bundle - all should align to same x - p.at(['A', 'B']).straight(xmin=-10000) - assert numpy.isclose(p.pattern.ports['A'].offset[0], -10000) - assert numpy.isclose(p.pattern.ports['B'].offset[0], -10000) - - # Bundle with bend - p.at(['A', 'B']).ccw(xmin=-20000, spacing=2000) - # Traveling in -x direction. CCW turn turns towards -y. - # A is at y=0, B is at y=2000. - # Rotation center is at y = -R. - # A is closer to center than B. So A is inner, B is outer. - # xmin is coordinate of innermost bend (A). 
- assert numpy.isclose(p.pattern.ports['A'].offset[0], -20000) - # B's bend is further out (more negative x) - assert numpy.isclose(p.pattern.ports['B'].offset[0], -22000) - -def test_pather_each_bound() -> None: - lib = Library() - tool = PathTool(layer='M1', width=1000) - p = Pather(lib, tools=tool, auto_render=False) - - p.pattern.ports['A'] = Port((0, 0), rotation=0) - p.pattern.ports['B'] = Port((-1000, 2000), rotation=0) - - # Each should move by 5000 (towards -x) - p.at(['A', 'B']).trace(None, each=5000) - assert numpy.allclose(p.pattern.ports['A'].offset, (-5000, 0)) - assert numpy.allclose(p.pattern.ports['B'].offset, (-6000, 2000)) - -def test_selection_management() -> None: - lib = Library() - p = Pather(lib) - p.pattern.ports['A'] = Port((0, 0), rotation=0) - p.pattern.ports['B'] = Port((0, 0), rotation=0) - - pp = p.at('A') - assert pp.ports == ['A'] - - pp.select('B') - assert pp.ports == ['A', 'B'] - - pp.deselect('A') - assert pp.ports == ['B'] - - pp.select(['A']) - assert pp.ports == ['B', 'A'] - - pp.drop() - assert 'A' not in p.pattern.ports - assert 'B' not in p.pattern.ports - assert pp.ports == [] - -def test_mark_fork() -> None: - lib = Library() - p = Pather(lib) - p.pattern.ports['A'] = Port((100, 200), rotation=1) - - pp = p.at('A') - pp.mark('B') - assert 'B' in p.pattern.ports - assert numpy.allclose(p.pattern.ports['B'].offset, (100, 200)) - assert p.pattern.ports['B'].rotation == 1 - assert pp.ports == ['A'] # mark keeps current selection - - pp.fork('C') - assert 'C' in p.pattern.ports - assert pp.ports == ['C'] # fork switches to new name - - -def test_mark_fork_reject_overwrite_and_duplicate_targets() -> None: - lib = Library() - - p_mark = Pather(lib, pattern=Pattern(ports={ - 'A': Port((0, 0), rotation=0), - 'C': Port((2, 0), rotation=0), - })) - with pytest.raises(PortError, match='overwrite existing ports'): - p_mark.at('A').mark('C') - assert numpy.allclose(p_mark.pattern.ports['C'].offset, (2, 0)) - - p_fork = Pather(lib, 
pattern=Pattern(ports={
-        'A': Port((0, 0), rotation=0),
-        'B': Port((1, 0), rotation=0),
-    }))
-    pp = p_fork.at(['A', 'B'])
-    with pytest.raises(PortError, match='targets would collide'):
-        pp.fork({'A': 'X', 'B': 'X'})
-    assert set(p_fork.pattern.ports) == {'A', 'B'}
-    assert pp.ports == ['A', 'B']
-
-
-def test_mark_fork_dead_overwrite_and_duplicate_targets() -> None:
-    lib = Library()
-    p = Pather(lib, pattern=Pattern(ports={
-        'A': Port((0, 0), rotation=0),
-        'B': Port((1, 0), rotation=0),
-        'C': Port((2, 0), rotation=0),
-    }))
-    p.set_dead()
-
-    p.at('A').mark('C')
-    assert numpy.allclose(p.pattern.ports['C'].offset, (0, 0))
-
-    pp = p.at(['A', 'B'])
-    pp.fork({'A': 'X', 'B': 'X'})
-    assert numpy.allclose(p.pattern.ports['X'].offset, (1, 0))
-    assert pp.ports == ['X']
-
-
-def test_mark_fork_reject_missing_sources() -> None:
-    lib = Library()
-    p = Pather(lib, pattern=Pattern(ports={
-        'A': Port((0, 0), rotation=0),
-        'B': Port((1, 0), rotation=0),
-    }))
-
-    with pytest.raises(PortError, match='selected ports'):
-        p.at(['A', 'B']).mark({'Z': 'C'})
-
-    with pytest.raises(PortError, match='selected ports'):
-        p.at(['A', 'B']).fork({'Z': 'C'})
-
-def test_rename() -> None:
-    lib = Library()
-    p = Pather(lib)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0)
-
-    p.at('A').rename('B')
-    assert 'A' not in p.pattern.ports
-    assert 'B' in p.pattern.ports
-
-    p.pattern.ports['C'] = Port((0, 0), rotation=0)
-    pp = p.at(['B', 'C'])
-    pp.rename({'B': 'D', 'C': 'E'})
-    assert 'B' not in p.pattern.ports
-    assert 'C' not in p.pattern.ports
-    assert 'D' in p.pattern.ports
-    assert 'E' in p.pattern.ports
-    assert set(pp.ports) == {'D', 'E'}
-
-def test_renderpather_uturn_fallback() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000)
-    rp = Pather(lib, tools=tool, auto_render=False)
-    rp.pattern.ports['A'] = Port((0, 0), rotation=0)
-
-    # PathTool doesn't implement planU, so it should fall back to two planL calls
-    rp.at('A').uturn(offset=10000, length=5000)
-
-    # Two steps should be added
-    assert len(rp.paths['A']) == 2
-    assert rp.paths['A'][0].opcode == 'L'
-    assert rp.paths['A'][1].opcode == 'L'
-
-    rp.render()
-    assert rp.pattern.ports['A'].rotation is not None
-    assert numpy.isclose(rp.pattern.ports['A'].rotation, pi)
-
-def test_autotool_uturn() -> None:
-    from masque.builder.tools import AutoTool
-    lib = Library()
-
-    # Setup AutoTool with a simple straight and a bend
-    def make_straight(length: float) -> Pattern:
-        pat = Pattern()
-        pat.rect(layer='M1', xmin=0, xmax=length, yctr=0, ly=1000)
-        pat.ports['in'] = Port((0, 0), 0)
-        pat.ports['out'] = Port((length, 0), pi)
-        return pat
-
-    bend_pat = Pattern()
-    bend_pat.polygon(layer='M1', vertices=[(0, -500), (0, 500), (1000, -500)])
-    bend_pat.ports['in'] = Port((0, 0), 0)
-    bend_pat.ports['out'] = Port((500, -500), pi/2)
-    lib['bend'] = bend_pat
-
-    tool = AutoTool(
-        straights=[AutoTool.Straight(ptype='wire', fn=make_straight, in_port_name='in', out_port_name='out')],
-        bends=[AutoTool.Bend(abstract=lib.abstract('bend'), in_port_name='in', out_port_name='out', clockwise=True)],
-        sbends=[],
-        transitions={},
-        default_out_ptype='wire'
-    )
-
-    p = Pather(lib, tools=tool, auto_render=False)
-    p.pattern.ports['A'] = Port((0, 0), 0)
-
-    # CW U-turn (jog < 0)
-    # R = 500. jog = -2000. length = 1000.
-    # p0 = planL(length=1000) -> out at (1000, -500) rot pi/2
-    # R2 = 500.
-    # l2_length = abs(-2000) - abs(-500) - 500 = 1000.
-    p.at('A').uturn(offset=-2000, length=1000)
-
-    # Final port should be at (-1000, 2000) rot pi
-    # Start: (0,0) rot 0. Wire direction is rot + pi = pi (West, -x).
-    # Tool planU returns (length, jog) = (1000, -2000) relative to (0,0) rot 0.
-    # Rotation of pi transforms (1000, -2000) to (-1000, 2000).
-    # Final rotation: 0 + pi = pi.
-    assert numpy.allclose(p.pattern.ports['A'].offset, (-1000, 2000))
-    assert p.pattern.ports['A'].rotation is not None
-    assert numpy.isclose(p.pattern.ports['A'].rotation, pi)
-
-def test_pather_trace_into() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000)
-    p = Pather(lib, tools=tool, auto_render=False)
-
-    # 1. Straight connector
-    p.pattern.ports['A'] = Port((0, 0), rotation=0)
-    p.pattern.ports['B'] = Port((-10000, 0), rotation=pi)
-    p.at('A').trace_into('B', plug_destination=False)
-    assert 'B' in p.pattern.ports
-    assert 'A' in p.pattern.ports
-    assert numpy.allclose(p.pattern.ports['A'].offset, (-10000, 0))
-
-    # 2. Single bend
-    p.pattern.ports['C'] = Port((0, 0), rotation=0)
-    p.pattern.ports['D'] = Port((-5000, 5000), rotation=pi/2)
-    p.at('C').trace_into('D', plug_destination=False)
-    assert 'D' in p.pattern.ports
-    assert 'C' in p.pattern.ports
-    assert numpy.allclose(p.pattern.ports['C'].offset, (-5000, 5000))
-
-    # 3. Jog (S-bend)
-    p.pattern.ports['E'] = Port((0, 0), rotation=0)
-    p.pattern.ports['F'] = Port((-10000, 2000), rotation=pi)
-    p.at('E').trace_into('F', plug_destination=False)
-    assert 'F' in p.pattern.ports
-    assert 'E' in p.pattern.ports
-    assert numpy.allclose(p.pattern.ports['E'].offset, (-10000, 2000))
-
-    # 4. U-bend (0 deg angle)
-    p.pattern.ports['G'] = Port((0, 0), rotation=0)
-    p.pattern.ports['H'] = Port((-10000, 2000), rotation=0)
-    p.at('G').trace_into('H', plug_destination=False)
-    assert 'H' in p.pattern.ports
-    assert 'G' in p.pattern.ports
-    # A U-bend with length=-travel=10000 and jog=-2000 from (0,0) rot 0
-    # ends up at (-10000, 2000) rot pi.
-    assert numpy.allclose(p.pattern.ports['G'].offset, (-10000, 2000))
-    assert p.pattern.ports['G'].rotation is not None
-    assert numpy.isclose(p.pattern.ports['G'].rotation, pi)
-
-    # 5. Vertical straight connector
-    p.pattern.ports['I'] = Port((0, 0), rotation=pi / 2)
-    p.pattern.ports['J'] = Port((0, -10000), rotation=3 * pi / 2)
-    p.at('I').trace_into('J', plug_destination=False)
-    assert 'J' in p.pattern.ports
-    assert 'I' in p.pattern.ports
-    assert numpy.allclose(p.pattern.ports['I'].offset, (0, -10000))
-    assert p.pattern.ports['I'].rotation is not None
-    assert numpy.isclose(p.pattern.ports['I'].rotation, pi / 2)
-
-
-def test_pather_trace_into_dead_updates_ports_without_geometry() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000, ptype='wire')
-    p = Pather(lib, tools=tool, auto_render=False)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-    p.pattern.ports['B'] = Port((-10000, 0), rotation=pi, ptype='wire')
-    p.set_dead()
-
-    p.trace_into('A', 'B', plug_destination=False)
-
-    assert set(p.pattern.ports) == {'A', 'B'}
-    assert numpy.allclose(p.pattern.ports['A'].offset, (-10000, 0))
-    assert p.pattern.ports['A'].rotation is not None
-    assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
-    assert len(p.paths['A']) == 0
-    assert not p.pattern.has_shapes()
-    assert not p.pattern.has_refs()
-
-
-def test_pather_dead_fallback_preserves_out_ptype() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000, ptype='wire')
-    p = Pather(lib, tools=tool, auto_render=False)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-    p.set_dead()
-
-    p.straight('A', -1000, out_ptype='other')
-
-    assert numpy.allclose(p.pattern.ports['A'].offset, (1000, 0))
-    assert p.pattern.ports['A'].ptype == 'other'
-    assert len(p.paths['A']) == 0
-
-
-def test_pather_dead_place_overwrites_colliding_ports_last_wins() -> None:
-    lib = Library()
-    p = Pather(lib, pattern=Pattern(ports={
-        'A': Port((5, 5), rotation=0),
-        'keep': Port((9, 9), rotation=0),
-    }))
-    p.set_dead()
-
-    other = Pattern()
-    other.ports['X'] = Port((1, 0), rotation=0)
-    other.ports['Y'] = Port((2, 0), rotation=pi / 2)
-
-    p.place(other, port_map={'X': 'A', 'Y': 'A'})
-
-    assert set(p.pattern.ports) == {'A', 'keep'}
-    assert numpy.allclose(p.pattern.ports['A'].offset, (2, 0))
-    assert p.pattern.ports['A'].rotation is not None
-    assert numpy.isclose(p.pattern.ports['A'].rotation, pi / 2)
-
-
-def test_pather_dead_plug_overwrites_colliding_outputs_last_wins() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000, ptype='wire')
-    p = Pather(lib, tools=tool, pattern=Pattern(ports={
-        'A': Port((0, 0), rotation=0, ptype='wire'),
-        'B': Port((99, 99), rotation=0, ptype='wire'),
-    }))
-    p.set_dead()
-
-    other = Pattern()
-    other.ports['in'] = Port((0, 0), rotation=pi, ptype='wire')
-    other.ports['X'] = Port((10, 0), rotation=0, ptype='wire')
-    other.ports['Y'] = Port((20, 0), rotation=0, ptype='wire')
-
-    p.plug(other, map_in={'A': 'in'}, map_out={'X': 'B', 'Y': 'B'})
-
-    assert 'A' not in p.pattern.ports
-    assert 'B' in p.pattern.ports
-    assert numpy.allclose(p.pattern.ports['B'].offset, (20, 0))
-    assert p.pattern.ports['B'].rotation is not None
-    assert numpy.isclose(p.pattern.ports['B'].rotation, 0)
-
-
-def test_pather_dead_rename_overwrites_colliding_ports_last_wins() -> None:
-    p = Pather(Library(), pattern=Pattern(ports={
-        'A': Port((0, 0), rotation=0),
-        'B': Port((1, 0), rotation=0),
-        'C': Port((2, 0), rotation=0),
-    }))
-    p.set_dead()
-
-    p.rename_ports({'A': 'C', 'B': 'C'})
-
-    assert set(p.pattern.ports) == {'C'}
-    assert numpy.allclose(p.pattern.ports['C'].offset, (1, 0))
-
-
-def test_pather_jog_failed_fallback_is_atomic() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=2, ptype='wire')
-    p = Pather(lib, tools=tool)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    with pytest.raises(BuildError, match='shorter than required bend'):
-        p.jog('A', 1.5, length=1.5)
-
-    assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
-    assert p.pattern.ports['A'].rotation == 0
-    assert len(p.paths['A']) == 0
-
-
-def test_pather_jog_accepts_sub_width_offset_when_length_is_sufficient() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=2, ptype='wire')
-    p = Pather(lib, tools=tool)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    p.jog('A', 1.5, length=5)
-
-    assert numpy.allclose(p.pattern.ports['A'].offset, (-5, -1.5))
-    assert p.pattern.ports['A'].rotation == 0
-    assert len(p.paths['A']) == 0
-
-
-def test_pather_jog_length_solved_from_single_position_bound() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1, ptype='wire')
-    p = Pather(lib, tools=tool)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    p.jog('A', 2, x=-6)
-    assert numpy.allclose(p.pattern.ports['A'].offset, (-6, -2))
-    assert p.pattern.ports['A'].rotation is not None
-    assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
-
-    q = Pather(Library(), tools=tool)
-    q.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-    q.jog('A', 2, p=-6)
-    assert numpy.allclose(q.pattern.ports['A'].offset, (-6, -2))
-
-
-def test_pather_jog_requires_length_or_one_position_bound() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1, ptype='wire')
-    p = Pather(lib, tools=tool)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    with pytest.raises(BuildError, match='requires either length'):
-        p.jog('A', 2)
-
-    with pytest.raises(BuildError, match='exactly one positional bound'):
-        p.jog('A', 2, x=-6, p=-6)
-
-
-def test_pather_trace_to_rejects_conflicting_position_bounds() -> None:
-    tool = PathTool(layer='M1', width=1, ptype='wire')
-
-    for kwargs in ({'x': -5, 'y': 2}, {'y': 2, 'x': -5}, {'p': -7, 'x': -5}):
-        p = Pather(Library(), tools=tool)
-        p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-        with pytest.raises(BuildError, match='exactly one positional bound'):
-            p.trace_to('A', None, **kwargs)
-
-    p = Pather(Library(), tools=tool)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-    with pytest.raises(BuildError, match='length cannot be combined'):
-        p.trace_to('A', None, x=-5, length=3)
-
-
-def test_pather_trace_rejects_length_with_bundle_bound() -> None:
-    p = Pather(Library(), tools=PathTool(layer='M1', width=1, ptype='wire'))
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    with pytest.raises(BuildError, match='length cannot be combined'):
-        p.trace('A', None, length=5, xmin=-100)
-
-
-@pytest.mark.parametrize('kwargs', ({'xmin': -10, 'xmax': -20}, {'xmax': -20, 'xmin': -10}))
-def test_pather_trace_rejects_multiple_bundle_bounds(kwargs: dict[str, int]) -> None:
-    p = Pather(Library(), tools=PathTool(layer='M1', width=1, ptype='wire'))
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-    p.pattern.ports['B'] = Port((0, 5), rotation=0, ptype='wire')
-
-    with pytest.raises(BuildError, match='exactly one bundle bound'):
-        p.trace(['A', 'B'], None, **kwargs)
-
-
-def test_pather_jog_rejects_length_with_position_bound() -> None:
-    p = Pather(Library(), tools=PathTool(layer='M1', width=1, ptype='wire'))
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    with pytest.raises(BuildError, match='length cannot be combined'):
-        p.jog('A', 2, length=5, x=-999)
-
-
-@pytest.mark.parametrize('kwargs', ({'x': -999}, {'xmin': -10}))
-def test_pather_uturn_rejects_routing_bounds(kwargs: dict[str, int]) -> None:
-    p = Pather(Library(), tools=PathTool(layer='M1', width=1, ptype='wire'))
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    with pytest.raises(BuildError, match='Unsupported routing bounds for uturn'):
-        p.uturn('A', 4, **kwargs)
-
-
-def test_pather_uturn_none_length_defaults_to_zero() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1, ptype='wire')
-    p = Pather(lib, tools=tool)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    p.uturn('A', 4)
-
-    assert numpy.allclose(p.pattern.ports['A'].offset, (0, -4))
-    assert p.pattern.ports['A'].rotation is not None
-    assert numpy.isclose(p.pattern.ports['A'].rotation, pi)
-
-
-def test_pather_trace_into_failure_rolls_back_ports_and_paths() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1, ptype='wire')
-    p = Pather(lib, tools=tool)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-    p.pattern.ports['B'] = Port((-5, 5), rotation=pi / 2, ptype='wire')
-
-    with pytest.raises(BuildError, match='does not match path ptype'):
-        p.trace_into('A', 'B', plug_destination=False, out_ptype='other')
-
-    assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
-    assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
-    assert numpy.allclose(p.pattern.ports['B'].offset, (-5, 5))
-    assert numpy.isclose(p.pattern.ports['B'].rotation, pi / 2)
-    assert len(p.paths['A']) == 0
-
-
-def test_pather_trace_into_rename_failure_rolls_back_ports_and_paths() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1, ptype='wire')
-    p = Pather(lib, tools=tool)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-    p.pattern.ports['B'] = Port((-10, 0), rotation=pi, ptype='wire')
-    p.pattern.ports['other'] = Port((3, 4), rotation=0, ptype='wire')
-
-    with pytest.raises(PortError, match='overwritten'):
-        p.trace_into('A', 'B', plug_destination=False, thru='other')
-
-    assert set(p.pattern.ports) == {'A', 'B', 'other'}
-    assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
-    assert numpy.allclose(p.pattern.ports['B'].offset, (-10, 0))
-    assert numpy.allclose(p.pattern.ports['other'].offset, (3, 4))
-    assert len(p.paths['A']) == 0
-
-
-@pytest.mark.parametrize(
-    ('dst', 'kwargs', 'match'),
-    (
-        (Port((-5, 5), rotation=pi / 2, ptype='wire'), {'x': -99}, r'trace_to\(\) arguments: x'),
-        (Port((-10, 2), rotation=pi, ptype='wire'), {'length': 1}, r'jog\(\) arguments: length'),
-        (Port((-10, 2), rotation=0, ptype='wire'), {'length': 1}, r'uturn\(\) arguments: length'),
-    ),
-)
-def test_pather_trace_into_rejects_reserved_route_kwargs(
-        dst: Port,
-        kwargs: dict[str, Any],
-        match: str,
-    ) -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1, ptype='wire')
-    p = Pather(lib, tools=tool)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-    p.pattern.ports['B'] = dst
-
-    with pytest.raises(BuildError, match=match):
-        p.trace_into('A', 'B', plug_destination=False, **kwargs)
-
-    assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
-    assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
-    assert numpy.allclose(p.pattern.ports['B'].offset, dst.offset)
-    assert dst.rotation is not None
-    assert p.pattern.ports['B'].rotation is not None
-    assert numpy.isclose(p.pattern.ports['B'].rotation, dst.rotation)
-    assert len(p.paths['A']) == 0
-
-
-def test_pather_two_l_fallback_validation_rejects_out_ptype_sensitive_jog() -> None:
-    class OutPtypeSensitiveTool(Tool):
-        def planL(self, ccw, length, *, in_ptype=None, out_ptype=None, **kwargs):
-            radius = 1 if out_ptype is None else 2
-            if ccw is None:
-                rotation = pi
-                jog = 0
-            elif bool(ccw):
-                rotation = -pi / 2
-                jog = radius
-            else:
-                rotation = pi / 2
-                jog = -radius
-            ptype = out_ptype or in_ptype or 'wire'
-            return Port((length, jog), rotation=rotation, ptype=ptype), {'ccw': ccw, 'length': length}
-
-    p = Pather(Library(), tools=OutPtypeSensitiveTool())
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    with pytest.raises(BuildError, match='fallback via two planL'):
-        p.jog('A', 5, length=10, out_ptype='wide')
-
-    assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
-    assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
-    assert len(p.paths['A']) == 0
-
-
-def test_pather_two_l_fallback_validation_rejects_out_ptype_sensitive_uturn() -> None:
-    class OutPtypeSensitiveTool(Tool):
-        def planL(self, ccw, length, *, in_ptype=None, out_ptype=None, **kwargs):
-            radius = 1 if out_ptype is None else 2
-            if ccw is None:
-                rotation = pi
-                jog = 0
-            elif bool(ccw):
-                rotation = -pi / 2
-                jog = radius
-            else:
-                rotation = pi / 2
-                jog = -radius
-            ptype = out_ptype or in_ptype or 'wire'
-            return Port((length, jog), rotation=rotation, ptype=ptype), {'ccw': ccw, 'length': length}
-
-    p = Pather(Library(), tools=OutPtypeSensitiveTool())
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    with pytest.raises(BuildError, match='fallback via two planL'):
-        p.uturn('A', 5, length=10, out_ptype='wide')
-
-    assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
-    assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
-    assert len(p.paths['A']) == 0
-
-
-def test_tool_planL_fallback_accepts_custom_port_names() -> None:
-    class DummyTool(Tool):
-        def traceL(self, ccw, length, *, in_ptype=None, out_ptype=None, port_names=('A', 'B'), **kwargs) -> Library:
-            lib = Library()
-            pat = Pattern()
-            pat.ports[port_names[0]] = Port((0, 0), 0, ptype='wire')
-            pat.ports[port_names[1]] = Port((length, 0), pi, ptype='wire')
-            lib['top'] = pat
-            return lib
-
-    out_port, _ = DummyTool().planL(None, 5, port_names=('X', 'Y'))
-    assert numpy.allclose(out_port.offset, (5, 0))
-    assert numpy.isclose(out_port.rotation, pi)
-
-
-def test_tool_planS_fallback_accepts_custom_port_names() -> None:
-    class DummyTool(Tool):
-        def traceS(self, length, jog, *, in_ptype=None, out_ptype=None, port_names=('A', 'B'), **kwargs) -> Library:
-            lib = Library()
-            pat = Pattern()
-            pat.ports[port_names[0]] = Port((0, 0), 0, ptype='wire')
-            pat.ports[port_names[1]] = Port((length, jog), pi, ptype='wire')
-            lib['top'] = pat
-            return lib
-
-    out_port, _ = DummyTool().planS(5, 2, port_names=('X', 'Y'))
-    assert numpy.allclose(out_port.offset, (5, 2))
-    assert numpy.isclose(out_port.rotation, pi)
-
-
-def test_pather_uturn_failed_fallback_is_atomic() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=2, ptype='wire')
-    p = Pather(lib, tools=tool)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    with pytest.raises(BuildError, match='shorter than required bend'):
-        p.uturn('A', 1.5, length=0)
-
-    assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
-    assert p.pattern.ports['A'].rotation == 0
-    assert len(p.paths['A']) == 0
-
-
-def test_pather_render_auto_renames_single_use_tool_children() -> None:
-    class FullTreeTool(Tool):
-        def planL(self, ccw, length, *, in_ptype=None, out_ptype=None, **kwargs):  # noqa: ANN001,ANN202
-            ptype = out_ptype or in_ptype or 'wire'
-            return Port((length, 0), rotation=pi, ptype=ptype), {'length': length}
-
-        def render(self, batch, *, port_names=('A', 'B'), **kwargs) -> Library:  # noqa: ANN001,ANN202
-            tree = Library()
-            top = Pattern(ports={
-                port_names[0]: Port((0, 0), 0, ptype='wire'),
-                port_names[1]: Port((1, 0), pi, ptype='wire'),
-            })
-            child = Pattern(annotations={'batch': [len(batch)]})
-            top.ref('_seg')
-            tree['_top'] = top
-            tree['_seg'] = child
-            return tree
-
-    lib = Library()
-    p = Pather(lib, tools=FullTreeTool(), auto_render=False)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    p.straight('A', 10)
-    p.render()
-    p.straight('A', 10)
-    p.render()
-
-    assert len(lib) == 2
-    assert set(lib.keys()) == set(p.pattern.refs.keys())
-    assert len(set(p.pattern.refs.keys())) == 2
-    assert all(name.startswith('_seg') for name in lib)
-    assert p.pattern.referenced_patterns() <= set(lib.keys())
-
-
-def test_tool_render_fallback_preserves_segment_subtrees() -> None:
-    class TraceTreeTool(Tool):
-        def traceL(self, ccw, length, *, in_ptype=None, out_ptype=None, port_names=('A', 'B'), **kwargs) -> Library:  # noqa: ANN001
-            tree = Library()
-            top = Pattern(ports={
-                port_names[0]: Port((0, 0), 0, ptype='wire'),
-                port_names[1]: Port((length, 0), pi, ptype='wire'),
-            })
-            child = Pattern(annotations={'length': [length]})
-            top.ref('_seg')
-            tree['_top'] = top
-            tree['_seg'] = child
-            return tree
-
-    lib = Library()
-    p = Pather(lib, tools=TraceTreeTool(), auto_render=False)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    p.straight('A', 10)
-    p.render()
-
-    assert '_seg' in lib
-    assert '_seg' in p.pattern.refs
-    assert p.pattern.referenced_patterns() <= set(lib.keys())
-
-
-def test_pather_render_rejects_missing_single_use_tool_refs() -> None:
-    class MissingSingleUseTool(Tool):
-        def planL(self, ccw, length, *, in_ptype=None, out_ptype=None, **kwargs):  # noqa: ANN001,ANN202
-            ptype = out_ptype or in_ptype or 'wire'
-            return Port((length, 0), rotation=pi, ptype=ptype), {'length': length}
-
-        def render(self, batch, *, port_names=('A', 'B'), **kwargs) -> Library:  # noqa: ANN001,ANN202
-            tree = Library()
-            top = Pattern(ports={
-                port_names[0]: Port((0, 0), 0, ptype='wire'),
-                port_names[1]: Port((1, 0), pi, ptype='wire'),
-            })
-            top.ref('_seg')
-            tree['_top'] = top
-            return tree
-
-    lib = Library()
-    lib['_seg'] = Pattern(annotations={'stale': [1]})
-    p = Pather(lib, tools=MissingSingleUseTool(), auto_render=False)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-    p.straight('A', 10)
-
-    with pytest.raises(BuildError, match='missing single-use refs'):
-        p.render()
-
-    assert list(lib.keys()) == ['_seg']
-    assert not p.pattern.refs
-
-
-def test_pather_render_allows_missing_non_single_use_tool_refs() -> None:
-    class SharedRefTool(Tool):
-        def planL(self, ccw, length, *, in_ptype=None, out_ptype=None, **kwargs):  # noqa: ANN001,ANN202
-            ptype = out_ptype or in_ptype or 'wire'
-            return Port((length, 0), rotation=pi, ptype=ptype), {'length': length}
-
-        def render(self, batch, *, port_names=('A', 'B'), **kwargs) -> Library:  # noqa: ANN001,ANN202
-            tree = Library()
-            top = Pattern(ports={
-                port_names[0]: Port((0, 0), 0, ptype='wire'),
-                port_names[1]: Port((1, 0), pi, ptype='wire'),
-            })
-            top.ref('shared')
-            tree['_top'] = top
-            return tree
-
-    lib = Library()
-    lib['shared'] = Pattern(annotations={'shared': [1]})
-    p = Pather(lib, tools=SharedRefTool(), auto_render=False)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
-
-    p.straight('A', 10)
-    p.render()
-
-    assert 'shared' in p.pattern.refs
-    assert p.pattern.referenced_patterns() <= set(lib.keys())
-
-
-def test_renderpather_rename_to_none_keeps_pending_geometry_without_port() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000)
-    rp = Pather(lib, tools=tool, auto_render=False)
-    rp.pattern.ports['A'] = Port((0, 0), rotation=0)
-
-    rp.at('A').straight(5000)
-    rp.rename_ports({'A': None})
-
-    assert 'A' not in rp.pattern.ports
-    assert len(rp.paths['A']) == 1
-
-    rp.render()
-    assert rp.pattern.has_shapes()
-    assert 'A' not in rp.pattern.ports
-
-
-def test_pather_place_treeview_resolves_once() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000)
-    p = Pather(lib, tools=tool)
-
-    tree = {'child': Pattern(ports={'B': Port((1, 0), pi)})}
-
-    p.place(tree)
-
-    assert len(lib) == 1
-    assert 'child' in lib
-    assert 'child' in p.pattern.refs
-    assert 'B' in p.pattern.ports
-
-
-def test_pather_plug_treeview_resolves_once() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000)
-    p = Pather(lib, tools=tool)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0)
-
-    tree = {'child': Pattern(ports={'B': Port((0, 0), pi)})}
-
-    p.plug(tree, {'A': 'B'})
-
-    assert len(lib) == 1
-    assert 'child' in lib
-    assert 'child' in p.pattern.refs
-    assert 'A' not in p.pattern.ports
-
-
-def test_pather_failed_plug_does_not_add_break_marker() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000)
-    p = Pather(lib, tools=tool, auto_render=False)
-    p.pattern.annotations = {'k': [1]}
-    p.pattern.ports['A'] = Port((0, 0), rotation=0)
-
-    p.at('A').trace(None, 5000)
-    assert [step.opcode for step in p.paths['A']] == ['L']
-
-    other = Pattern(
-        annotations={'k': [2]},
-        ports={'X': Port((0, 0), pi), 'Y': Port((5, 0), 0)},
-    )
-
-    with pytest.raises(PatternError, match='Annotation keys overlap'):
-        p.plug(other, {'A': 'X'}, map_out={'Y': 'Z'}, append=True)
-
-    assert [step.opcode for step in p.paths['A']] == ['L']
-    assert set(p.pattern.ports) == {'A'}
-
-
-def test_pather_place_reused_deleted_name_keeps_break_marker() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000)
-    p = Pather(lib, tools=tool, auto_render=False)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0)
-
-    p.at('A').straight(5000)
-    p.rename_ports({'A': None})
-
-    other = Pattern(ports={'X': Port((-5000, 0), rotation=0)})
-    p.place(other, port_map={'X': 'A'}, append=True)
-    p.at('A').straight(2000)
-
-    assert [step.opcode for step in p.paths['A']] == ['L', 'P', 'L']
-
-    p.render()
-    assert p.pattern.has_shapes()
-    assert 'A' in p.pattern.ports
-    assert numpy.allclose(p.pattern.ports['A'].offset, (-7000, 0))
-
-
-def test_pather_plug_reused_deleted_name_keeps_break_marker() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000)
-    p = Pather(lib, tools=tool, auto_render=False)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0)
-    p.pattern.ports['B'] = Port((0, 0), rotation=0)
-
-    p.at('A').straight(5000)
-    p.rename_ports({'A': None})
-
-    other = Pattern(
-        ports={
-            'X': Port((0, 0), rotation=pi),
-            'Y': Port((-5000, 0), rotation=0),
-        },
-    )
-    p.plug(other, {'B': 'X'}, map_out={'Y': 'A'}, append=True)
-    p.at('A').straight(2000)
-
-    assert [step.opcode for step in p.paths['A']] == ['L', 'P', 'L']
-
-    p.render()
-    assert p.pattern.has_shapes()
-    assert 'A' in p.pattern.ports
-    assert 'B' not in p.pattern.ports
-    assert numpy.allclose(p.pattern.ports['A'].offset, (-7000, 0))
-
-
-def test_pather_failed_plugged_does_not_add_break_marker() -> None:
-    lib = Library()
-    tool = PathTool(layer='M1', width=1000)
-    p = Pather(lib, tools=tool, auto_render=False)
-    p.pattern.ports['A'] = Port((0, 0), rotation=0)
-
-    p.at('A').straight(5000)
-    assert [step.opcode for step in p.paths['A']] == ['L']
-
-    with pytest.raises(PortError, match='Connection destination ports were not found'):
-        p.plugged({'A': 'missing'})
-
-    assert [step.opcode for step in p.paths['A']] == ['L']
-    assert set(p.paths) == {'A'}
diff --git a/masque/test/test_pattern.py b/masque/test/test_pattern.py
deleted file mode 100644
index 26b1255..0000000
--- a/masque/test/test_pattern.py
+++ /dev/null
@@ -1,310 +0,0 @@
-import pytest
-import copy
-from typing import cast
-from numpy.testing import assert_equal, assert_allclose
-from numpy import pi
-
-from ..error import PatternError
-from ..abstract import Abstract
-from ..pattern import Pattern
-from ..shapes import Polygon
-from ..ref import Ref
-from ..ports import Port, PortError
-from ..label import Label
-from ..repetition import Grid
-
-
-def test_pattern_init() -> None:
-    pat = Pattern()
-    assert pat.is_empty()
-    assert not pat.has_shapes()
-    assert not pat.has_refs()
-    assert not pat.has_labels()
-    assert not pat.has_ports()
-
-
-def test_pattern_with_elements() -> None:
-    poly = Polygon.square(10)
-    label = Label("test", offset=(5, 5))
-    ref = Ref(offset=(100, 100))
-    port = Port((0, 0), 0)
-
-    pat = Pattern(shapes={(1, 0): [poly]}, labels={(1, 2): [label]}, refs={"sub": [ref]}, ports={"P1": port})
-
-    assert pat.has_shapes()
-    assert pat.has_labels()
-    assert pat.has_refs()
-    assert pat.has_ports()
-    assert not pat.is_empty()
-    assert pat.shapes[(1, 0)] == [poly]
-    assert pat.labels[(1, 2)] == [label]
-    assert pat.refs["sub"] == [ref]
-    assert pat.ports["P1"] == port
-
-
-def test_pattern_append() -> None:
-    pat1 = Pattern()
-    pat1.polygon((1, 0), vertices=[[0, 0], [1, 0], [1, 1]])
-
-    pat2 = Pattern()
-    pat2.polygon((2, 0), vertices=[[10, 10], [11, 10], [11, 11]])
-
-    pat1.append(pat2)
-    assert len(pat1.shapes[(1, 0)]) == 1
-    assert len(pat1.shapes[(2, 0)]) == 1
-
-
-def test_pattern_translate() -> None:
-    pat = Pattern()
-    pat.polygon((1, 0), vertices=[[0, 0], [1, 0], [1, 1]])
-    pat.ports["P1"] = Port((5, 5), 0)
-
-    pat.translate_elements((10, 20))
-
-    # Polygon.translate adds to vertices, and offset is always (0,0)
-    assert_equal(cast("Polygon", pat.shapes[(1, 0)][0]).vertices[0], [10, 20])
-    assert_equal(pat.ports["P1"].offset, [15, 25])
-
-
-def test_pattern_scale() -> None:
-    pat = Pattern()
-    # Polygon.rect sets an offset in its constructor which is immediately translated into vertices
-    pat.rect((1, 0), xmin=0, xmax=1, ymin=0, ymax=1)
-    pat.scale_by(2)
-
-    # Vertices should be scaled
-    assert_equal(cast("Polygon", pat.shapes[(1, 0)][0]).vertices, [[0, 0], [0, 2], [2, 2], [2, 0]])
-
-
-def test_pattern_rotate() -> None:
-    pat = Pattern()
-    pat.polygon((1, 0), vertices=[[10, 0], [11, 0], [10, 1]])
-    # Rotate 90 degrees CCW around (0,0)
-    pat.rotate_around((0, 0), pi / 2)
-
-    # [10, 0] rotated 90 deg around (0,0) is [0, 10]
-    assert_allclose(cast("Polygon", pat.shapes[(1, 0)][0]).vertices[0], [0, 10], atol=1e-10)
-
-
-def test_pattern_mirror() -> None:
-    pat = Pattern()
-    pat.polygon((1, 0), vertices=[[10, 5], [11, 5], [10, 6]])
-    # Mirror across X axis (y -> -y)
-    pat.mirror(0)
-
-    assert_equal(cast("Polygon", pat.shapes[(1, 0)][0]).vertices[0], [10, -5])
-
-
-def test_pattern_get_bounds() -> None:
-    pat = Pattern()
-    pat.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 10]])
-    pat.polygon((1, 0), vertices=[[-5, -5], [5, -5], [5, 5]])
-
-    bounds = pat.get_bounds()
-    assert_equal(bounds, [[-5, -5], [10, 10]])
-
-
-def test_pattern_flatten_preserves_ports_only_child() -> None:
-    child = Pattern(ports={"P1": Port((1, 2), 0)})
-
-    parent = Pattern()
-    parent.ref("child", offset=(10, 10))
-
-    parent.flatten({"child": child}, flatten_ports=True)
-
-    assert set(parent.ports) == {"P1"}
-    assert parent.ports["P1"].rotation == 0
-    assert tuple(parent.ports["P1"].offset) == (11.0, 12.0)
-
-
-def test_pattern_flatten_repeated_ref_with_ports_raises() -> None:
-    child = Pattern(ports={"P1": Port((1, 2), 0)})
-    child.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]])
-
-    parent = Pattern()
-    parent.ref("child", repetition=Grid(a_vector=(10, 0), a_count=2))
-
-    with pytest.raises(PatternError, match='Cannot flatten ports from repeated ref'):
-        parent.flatten({"child": child}, flatten_ports=True)
-
-
-def test_pattern_place_requires_abstract_for_reference() -> None:
-    parent = Pattern()
-    child = Pattern()
-
-    with pytest.raises(PatternError, match='Must provide an `Abstract`'):
-        parent.place(child)
-
-    assert not parent.ports
-
-
-def test_pattern_place_append_requires_pattern_atomically() -> None:
-    parent = Pattern()
-    child = Abstract("child", {"A": Port((1, 2), 0)})
-
-    with pytest.raises(PatternError, match='Must provide a full `Pattern`'):
-        parent.place(child, append=True)
-
-    assert not parent.ports
-
-
-def test_pattern_place_append_annotation_conflict_is_atomic() -> None:
-    parent = Pattern(annotations={"k": [1]})
-    child = Pattern(annotations={"k": [2]}, ports={"A": Port((1, 2), 0)})
-
-    with pytest.raises(PatternError, match="Annotation keys overlap"):
-        parent.place(child, append=True)
-
-    assert not parent.ports
-    assert parent.annotations == {"k": [1]}
-
-
-def test_pattern_place_skip_geometry_overwrites_colliding_ports_last_wins() -> None:
-    parent = Pattern(ports={
-        "A": Port((5, 5), 0),
-        "keep": Port((9, 9), 0),
-    })
-    child = Pattern(ports={
-        "X": Port((1, 0), 0),
-        "Y": Port((2, 0), pi / 2),
-    })
-
-    parent.place(child, port_map={"X": "A", "Y": "A"}, skip_geometry=True, append=True)
-
-    assert set(parent.ports) == {"A", "keep"}
-    assert_allclose(parent.ports["A"].offset, (2, 0))
-    assert parent.ports["A"].rotation is not None
-    assert_allclose(parent.ports["A"].rotation, pi / 2)
-
-
-def test_pattern_interface() -> None:
-    source = Pattern()
-    source.ports["A"] = Port((10, 20), 0, ptype="test")
-
-    iface = Pattern.interface(source, in_prefix="in_", out_prefix="out_")
-
-    assert "in_A" in iface.ports
-    assert "out_A" in iface.ports
-    assert iface.ports["in_A"].rotation is not None
-    assert_allclose(iface.ports["in_A"].rotation, pi, atol=1e-10)
-    assert iface.ports["out_A"].rotation is not None
-    assert_allclose(iface.ports["out_A"].rotation, 0, atol=1e-10)
-    assert iface.ports["in_A"].ptype == "test"
-    assert iface.ports["out_A"].ptype == "test"
-
-
-def test_pattern_interface_duplicate_port_map_targets_raise() -> None:
-    source = Pattern()
-    source.ports["A"] = Port((10, 20), 0)
-    source.ports["B"] = Port((30, 40), pi)
-
-    with pytest.raises(PortError, match='Duplicate targets in `port_map`'):
-        Pattern.interface(source, port_map={"A": "X", "B": "X"})
-
-
-def test_pattern_interface_empty_port_map_copies_no_ports() -> None:
-    source = Pattern()
-    source.ports["A"] = Port((10, 20), 0)
-    source.ports["B"] = Port((30, 40), pi)
-
-    assert not Pattern.interface(source, port_map={}).ports
-    assert not Pattern.interface(source, port_map=[]).ports
-
-
-def test_pattern_plug_requires_abstract_for_reference_atomically() -> None:
-    parent = Pattern(ports={"X": Port((0, 0), 0)})
-    child = Pattern(ports={"A": Port((0, 0), pi)})
-
-    with pytest.raises(PatternError, match='Must provide an `Abstract`'):
-        parent.plug(child, {"X": "A"})
-
-    assert set(parent.ports) == {"X"}
-
-
-def test_pattern_plug_append_annotation_conflict_is_atomic() -> None:
-    parent = Pattern(
-        annotations={"k": [1]},
-        ports={"X": Port((0, 0), 0), "Q": Port((9, 9), 0)},
-    )
-    child = Pattern(
-        annotations={"k": [2]},
-        ports={"A": Port((0, 0), pi), "B": Port((5, 0), 0)},
-    )
-
-    with pytest.raises(PatternError, match="Annotation keys overlap"):
-        parent.plug(child, {"X": "A"}, map_out={"B": "Y"}, append=True)
-
-    assert set(parent.ports) == {"X", "Q"}
-    assert_allclose(parent.ports["X"].offset, (0, 0))
-    assert_allclose(parent.ports["Q"].offset, (9, 9))
-    assert parent.annotations == {"k": [1]}
-
-
-def test_pattern_plug_skip_geometry_overwrites_colliding_ports_last_wins() -> None:
-    parent = Pattern(ports={
-        "A": Port((0, 0), 0, ptype="wire"),
-        "B": Port((99, 99), 0, ptype="wire"),
-    })
-    child = Pattern(ports={
-        "in": Port((0, 0), pi, ptype="wire"),
-        "X": Port((10, 0), 0, ptype="wire"),
-        "Y": Port((20, 0), 0, ptype="wire"),
-    })
-
-    parent.plug(child, {"A": "in"}, map_out={"X": "B", "Y": "B"}, skip_geometry=True, append=True)
-
-    assert "A" not in parent.ports
-    assert "B" in parent.ports
-    assert_allclose(parent.ports["B"].offset, (20, 0))
-    assert parent.ports["B"].rotation is not None
-    assert_allclose(parent.ports["B"].rotation, 0)
-
-
-def test_pattern_append_port_conflict_is_atomic() -> None:
-    pat1 = Pattern()
-    pat1.ports["A"] = Port((0, 0), 0)
-
-    pat2 = Pattern()
-    pat2.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]])
-    pat2.ports["A"] = Port((1, 0), 0)
-
-    with pytest.raises(PatternError, match="Port names overlap"):
-        pat1.append(pat2)
-
-    assert not pat1.shapes
-    assert set(pat1.ports) == {"A"}
-
-
-def test_pattern_append_annotation_conflict_is_atomic() -> None:
-    pat1 = Pattern(annotations={"k": [1]})
-    pat2 = Pattern(annotations={"k": [2]})
-    pat2.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]])
-
-    with pytest.raises(PatternError, match="Annotation keys overlap"):
-        pat1.append(pat2)
-
-    assert not pat1.shapes
-    assert pat1.annotations == {"k": [1]}
-
-
-def test_pattern_deepcopy_does_not_share_shape_repetitions() -> None:
-    pat = Pattern()
-    pat.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]], repetition=Grid(a_vector=(10, 0), a_count=2))
-
-    pat2 = copy.deepcopy(pat)
-    pat2.scale_by(2)
-
-    assert_allclose(cast("Polygon", pat.shapes[(1, 0)][0]).repetition.a_vector, [10, 0])
-    assert_allclose(cast("Polygon", pat2.shapes[(1, 0)][0]).repetition.a_vector, [20, 0])
-
-
-def test_pattern_flatten_does_not_mutate_child_repetitions() -> None:
-    child = Pattern()
-    child.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]], repetition=Grid(a_vector=(10, 0), a_count=2))
-
-    parent = Pattern()
-    parent.ref("child", scale=2)
-
-    parent.flatten({"child": child})
-
-    assert_allclose(cast("Polygon", child.shapes[(1, 0)][0]).repetition.a_vector, [10, 0])
diff --git a/masque/test/test_polygon.py b/masque/test/test_polygon.py
deleted file mode 100644
index 5d98ad9..0000000
--- a/masque/test/test_polygon.py
+++ /dev/null
@@ -1,125 +0,0 @@
-import pytest
-import numpy
-from numpy.testing import assert_equal
-
-
-from ..shapes import Polygon
-from ..utils import R90
-from ..error import PatternError
-
-
-@pytest.fixture
-def polygon() -> Polygon:
-    return Polygon([[0, 0], [1, 0], [1, 1], [0, 1]])
-
-
-def test_vertices(polygon: Polygon) -> None:
-    assert_equal(polygon.vertices, [[0, 0], [1, 0], [1, 1], [0, 1]])
-
-
-def test_xs(polygon: Polygon) -> None:
-    assert_equal(polygon.xs, [0, 1, 1, 0])
-
-
-def test_ys(polygon: Polygon) -> None:
-    assert_equal(polygon.ys, [0, 0, 1, 1])
-
-
-def test_offset(polygon: Polygon) -> None:
-    assert_equal(polygon.offset, [0, 0])
-
-
-def test_square() -> None:
-    square = Polygon.square(1)
-    assert_equal(square.vertices, [[-0.5, -0.5], [-0.5, 0.5], [0.5, 0.5], [0.5, -0.5]])
-
-
-def test_rectangle() -> None:
-    rectangle = Polygon.rectangle(1, 2)
-    assert_equal(rectangle.vertices, [[-0.5, -1], [-0.5, 1], [0.5, 1], [0.5, -1]])
-
-
-def test_rect() -> None:
-    rect1 = Polygon.rect(xmin=0, xmax=1, ymin=-1, ymax=1)
-    assert_equal(rect1.vertices, [[0, -1], [0, 1], [1, 1], [1, -1]])
-
-    rect2 = Polygon.rect(xmin=0, lx=1, ymin=-1, ly=2)
-    assert_equal(rect2.vertices, [[0, -1], [0, 1], [1, 1], [1, -1]])
-
-    rect3 = Polygon.rect(xctr=0, lx=1, yctr=-2, ly=2)
-    assert_equal(rect3.vertices, [[-0.5, -3], [-0.5, -1], [0.5, -1], [0.5, -3]])
-
-    rect4 = Polygon.rect(xctr=0, xmax=1, yctr=-2, ymax=0)
-    assert_equal(rect4.vertices, [[-1, -4], [-1, 0], [1, 0], [1, -4]])
-
-    with pytest.raises(PatternError):
-        Polygon.rect(xctr=0, yctr=-2, ymax=0)
-    with pytest.raises(PatternError):
-        Polygon.rect(xmin=0, yctr=-2, ymax=0)
-    with pytest.raises(PatternError):
-        Polygon.rect(xmax=0, yctr=-2, ymax=0)
-    with pytest.raises(PatternError):
-        Polygon.rect(lx=0, yctr=-2, ymax=0)
-    with pytest.raises(PatternError):
-        Polygon.rect(yctr=0, xctr=-2, xmax=0)
-    with pytest.raises(PatternError):
-        Polygon.rect(ymin=0, xctr=-2, xmax=0)
-    with pytest.raises(PatternError):
-        Polygon.rect(ymax=0, xctr=-2, xmax=0)
-    with pytest.raises(PatternError):
Polygon.rect(ly=0, xctr=-2, xmax=0) - - -def test_octagon() -> None: - octagon = Polygon.octagon(side_length=1) # regular=True - assert_equal(octagon.vertices.shape, (8, 2)) - diff = octagon.vertices - numpy.roll(octagon.vertices, -1, axis=0) - side_len = numpy.sqrt((diff * diff).sum(axis=1)) - assert numpy.allclose(side_len, 1) - - -def test_to_polygons(polygon: Polygon) -> None: - assert polygon.to_polygons() == [polygon] - - -def test_get_bounds_single(polygon: Polygon) -> None: - assert_equal(polygon.get_bounds_single(), [[0, 0], [1, 1]]) - - -def test_rotate(polygon: Polygon) -> None: - rotated_polygon = polygon.rotate(R90) - assert_equal(rotated_polygon.vertices, [[0, 0], [0, 1], [-1, 1], [-1, 0]]) - - -def test_mirror(polygon: Polygon) -> None: - mirrored_by_y = polygon.deepcopy().mirror(1) - assert_equal(mirrored_by_y.vertices, [[0, 0], [-1, 0], [-1, 1], [0, 1]]) - print(polygon.vertices) - mirrored_by_x = polygon.deepcopy().mirror(0) - assert_equal(mirrored_by_x.vertices, [[0, 0], [1, 0], [1, -1], [0, -1]]) - - -def test_scale_by(polygon: Polygon) -> None: - scaled_polygon = polygon.scale_by(2) - assert_equal(scaled_polygon.vertices, [[0, 0], [2, 0], [2, 2], [0, 2]]) - - -def test_clean_vertices(polygon: Polygon) -> None: - polygon = Polygon([[0, 0], [1, 1], [2, 2], [2, 2], [2, -4], [2, 0], [0, 0]]).clean_vertices() - assert_equal(polygon.vertices, [[0, 0], [2, 2], [2, 0]]) - - -def test_remove_duplicate_vertices() -> None: - polygon = Polygon([[0, 0], [1, 1], [2, 2], [2, 2], [2, 0], [0, 0]]).remove_duplicate_vertices() - assert_equal(polygon.vertices, [[0, 0], [1, 1], [2, 2], [2, 0]]) - - -def test_remove_colinear_vertices() -> None: - polygon = Polygon([[0, 0], [1, 1], [2, 2], [2, 2], [2, 0], [0, 0]]).remove_colinear_vertices() - assert_equal(polygon.vertices, [[0, 0], [2, 2], [2, 0]]) - - -def test_vertices_dtype() -> None: - polygon = Polygon(numpy.array([[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]], dtype=numpy.int32)) - polygon.scale_by(0.5) - 
assert_equal(polygon.vertices, [[0, 0], [0.5, 0], [0.5, 0.5], [0, 0.5], [0, 0]]) diff --git a/masque/test/test_ports.py b/masque/test/test_ports.py deleted file mode 100644 index 6d24879..0000000 --- a/masque/test/test_ports.py +++ /dev/null @@ -1,293 +0,0 @@ -import pytest -from numpy.testing import assert_equal, assert_allclose -from numpy import pi - -from ..ports import Port, PortList -from ..error import PortError -from ..pattern import Pattern - - -def test_port_init() -> None: - p = Port(offset=(10, 20), rotation=pi / 2, ptype="test") - assert_equal(p.offset, [10, 20]) - assert p.rotation == pi / 2 - assert p.ptype == "test" - - -def test_port_transform() -> None: - p = Port(offset=(10, 0), rotation=0) - p.rotate_around((0, 0), pi / 2) - assert_allclose(p.offset, [0, 10], atol=1e-10) - assert p.rotation is not None - assert_allclose(p.rotation, pi / 2, atol=1e-10) - - p.mirror(0) # Mirror across x axis (axis 0): in-place relative to offset - assert_allclose(p.offset, [0, 10], atol=1e-10) - # rotation was pi/2 (90 deg), mirror across x (0 deg) -> -pi/2 == 3pi/2 - assert p.rotation is not None - assert_allclose(p.rotation, 3 * pi / 2, atol=1e-10) - - -def test_port_flip_across() -> None: - p = Port(offset=(10, 0), rotation=0) - p.flip_across(axis=1) # Mirror across x=0: flips x-offset - assert_equal(p.offset, [-10, 0]) - # rotation was 0, mirrored(1) -> pi - assert p.rotation is not None - assert_allclose(p.rotation, pi, atol=1e-10) - - -def test_port_measure_travel() -> None: - p1 = Port((0, 0), 0) - p2 = Port((10, 5), pi) # Facing each other - - (travel, jog), rotation = p1.measure_travel(p2) - assert travel == 10 - assert jog == 5 - assert rotation == pi - - -def test_port_list_measure_travel() -> None: - class MyPorts(PortList): - def __init__(self) -> None: - self._ports = { - "A": Port((0, 0), 0), - "B": Port((10, 5), pi), - } - - @property - def ports(self) -> dict[str, Port]: - return self._ports - - @ports.setter - def ports(self, val: dict[str, 
Port]) -> None: - self._ports = val - - pl = MyPorts() - (travel, jog), rotation = pl.measure_travel("A", "B") - assert travel == 10 - assert jog == 5 - assert rotation == pi - - -def test_port_describe_any_rotation() -> None: - p = Port((0, 0), None) - assert p.describe() == "pos=(0, 0), rot=any" - - -def test_port_list_rename() -> None: - class MyPorts(PortList): - def __init__(self) -> None: - self._ports = {"A": Port((0, 0), 0)} - - @property - def ports(self) -> dict[str, Port]: - return self._ports - - @ports.setter - def ports(self, val: dict[str, Port]) -> None: - self._ports = val - - pl = MyPorts() - pl.rename_ports({"A": "B"}) - assert "A" not in pl.ports - assert "B" in pl.ports - - -def test_port_list_rename_missing_port_raises() -> None: - class MyPorts(PortList): - def __init__(self) -> None: - self._ports = {"A": Port((0, 0), 0)} - - @property - def ports(self) -> dict[str, Port]: - return self._ports - - @ports.setter - def ports(self, val: dict[str, Port]) -> None: - self._ports = val - - pl = MyPorts() - with pytest.raises(PortError, match="Ports to rename were not found"): - pl.rename_ports({"missing": "B"}) - assert set(pl.ports) == {"A"} - - -def test_port_list_rename_colliding_targets_raises() -> None: - class MyPorts(PortList): - def __init__(self) -> None: - self._ports = {"A": Port((0, 0), 0), "B": Port((1, 0), 0)} - - @property - def ports(self) -> dict[str, Port]: - return self._ports - - @ports.setter - def ports(self, val: dict[str, Port]) -> None: - self._ports = val - - pl = MyPorts() - with pytest.raises(PortError, match="Renamed ports would collide"): - pl.rename_ports({"A": "C", "B": "C"}) - assert set(pl.ports) == {"A", "B"} - - -def test_port_list_add_port_pair_requires_distinct_names() -> None: - class MyPorts(PortList): - def __init__(self) -> None: - self._ports: dict[str, Port] = {} - - @property - def ports(self) -> dict[str, Port]: - return self._ports - - @ports.setter - def ports(self, val: dict[str, Port]) -> None: - 
self._ports = val - - pl = MyPorts() - with pytest.raises(PortError, match="Port names must be distinct"): - pl.add_port_pair(names=("A", "A")) - assert not pl.ports - - -def test_port_list_plugged() -> None: - class MyPorts(PortList): - def __init__(self) -> None: - self._ports = {"A": Port((10, 10), 0), "B": Port((10, 10), pi)} - - @property - def ports(self) -> dict[str, Port]: - return self._ports - - @ports.setter - def ports(self, val: dict[str, Port]) -> None: - self._ports = val - - pl = MyPorts() - pl.plugged({"A": "B"}) - assert not pl.ports # Both should be removed - - -def test_port_list_plugged_empty_raises() -> None: - class MyPorts(PortList): - def __init__(self) -> None: - self._ports = {"A": Port((10, 10), 0), "B": Port((10, 10), pi)} - - @property - def ports(self) -> dict[str, Port]: - return self._ports - - @ports.setter - def ports(self, val: dict[str, Port]) -> None: - self._ports = val - - pl = MyPorts() - with pytest.raises(PortError, match="Must provide at least one port connection"): - pl.plugged({}) - assert set(pl.ports) == {"A", "B"} - - -def test_port_list_plugged_missing_port_raises() -> None: - class MyPorts(PortList): - def __init__(self) -> None: - self._ports = {"A": Port((10, 10), 0), "B": Port((10, 10), pi)} - - @property - def ports(self) -> dict[str, Port]: - return self._ports - - @ports.setter - def ports(self, val: dict[str, Port]) -> None: - self._ports = val - - pl = MyPorts() - with pytest.raises(PortError, match="Connection source ports were not found"): - pl.plugged({"missing": "B"}) - assert set(pl.ports) == {"A", "B"} - - -def test_port_list_plugged_reused_port_raises_atomically() -> None: - class MyPorts(PortList): - def __init__(self) -> None: - self._ports = {"A": Port((0, 0), None), "B": Port((0, 0), None), "C": Port((0, 0), None)} - - @property - def ports(self) -> dict[str, Port]: - return self._ports - - @ports.setter - def ports(self, val: dict[str, Port]) -> None: - self._ports = val - - for connections in 
({"A": "A"}, {"A": "B", "C": "B"}): - pl = MyPorts() - with pytest.raises(PortError, match="Each port may appear in at most one connection"): - pl.plugged(connections) - assert set(pl.ports) == {"A", "B", "C"} - - pl = MyPorts() - with pytest.raises(PortError, match="Connection destination ports were not found"): - pl.plugged({"A": "missing"}) - assert set(pl.ports) == {"A", "B", "C"} - - -def test_port_list_plugged_mismatch() -> None: - class MyPorts(PortList): - def __init__(self) -> None: - self._ports = { - "A": Port((10, 10), 0), - "B": Port((11, 10), pi), # Offset mismatch - } - - @property - def ports(self) -> dict[str, Port]: - return self._ports - - @ports.setter - def ports(self, val: dict[str, Port]) -> None: - self._ports = val - - pl = MyPorts() - with pytest.raises(PortError): - pl.plugged({"A": "B"}) - - -def test_port_list_check_ports_duplicate_map_in_values_raise() -> None: - class MyPorts(PortList): - def __init__(self) -> None: - self._ports = {"A": Port((0, 0), 0), "B": Port((0, 0), 0)} - - @property - def ports(self) -> dict[str, Port]: - return self._ports - - @ports.setter - def ports(self, val: dict[str, Port]) -> None: - self._ports = val - - pl = MyPorts() - with pytest.raises(PortError, match="Duplicate values in `map_in`"): - pl.check_ports({"X", "Y"}, map_in={"A": "X", "B": "X"}) - assert set(pl.ports) == {"A", "B"} - - -def test_pattern_plug_rejects_map_out_on_connected_ports_atomically() -> None: - host = Pattern(ports={"A": Port((0, 0), 0)}) - other = Pattern(ports={"X": Port((0, 0), pi), "Y": Port((5, 0), 0)}) - - with pytest.raises(PortError, match="`map_out` keys conflict with connected ports"): - host.plug(other, {"A": "X"}, map_out={"X": "renamed", "Y": "out"}, append=True) - - assert set(host.ports) == {"A"} - - -def test_find_transform_requires_connection_map() -> None: - host = Pattern(ports={"A": Port((0, 0), 0)}) - other = Pattern(ports={"X": Port((0, 0), pi)}) - - with pytest.raises(PortError, match="at least one port 
connection"): - host.find_transform(other, {}) - - with pytest.raises(PortError, match="at least one port connection"): - Pattern.find_port_transform({}, {}, {}) diff --git a/masque/test/test_ports2data.py b/masque/test/test_ports2data.py deleted file mode 100644 index 3f642ab..0000000 --- a/masque/test/test_ports2data.py +++ /dev/null @@ -1,132 +0,0 @@ -import numpy -import pytest -from numpy.testing import assert_allclose - -from ..utils.ports2data import ports_to_data, data_to_ports -from ..pattern import Pattern -from ..ports import Port -from ..library import Library -from ..error import PortError -from ..repetition import Grid - - -def test_ports2data_roundtrip() -> None: - pat = Pattern() - pat.ports["P1"] = Port((10, 20), numpy.pi / 2, ptype="test") - - layer = (10, 0) - ports_to_data(pat, layer) - - assert len(pat.labels[layer]) == 1 - assert pat.labels[layer][0].string == "P1:test 90" - assert tuple(pat.labels[layer][0].offset) == (10.0, 20.0) - - # New pattern, read ports back - pat2 = Pattern() - pat2.labels[layer] = pat.labels[layer] - data_to_ports([layer], {}, pat2) - - assert "P1" in pat2.ports - assert_allclose(pat2.ports["P1"].offset, [10, 20], atol=1e-10) - assert pat2.ports["P1"].rotation is not None - assert_allclose(pat2.ports["P1"].rotation, numpy.pi / 2, atol=1e-10) - assert pat2.ports["P1"].ptype == "test" - - -def test_data_to_ports_hierarchical() -> None: - lib = Library() - - # Child has port data in labels - child = Pattern() - layer = (10, 0) - child.label(layer=layer, string="A:type1 0", offset=(5, 0)) - lib["child"] = child - - # Parent references child - parent = Pattern() - parent.ref("child", offset=(100, 100), rotation=numpy.pi / 2) - - # Read ports hierarchically (max_depth > 0) - data_to_ports([layer], lib, parent, max_depth=1) - - # child port A (5,0) rot 0 - # transformed by parent ref: rot pi/2, trans (100, 100) - # (5,0) rot pi/2 -> (0, 5) - # (0, 5) + (100, 100) = (100, 105) - # rot 0 + pi/2 = pi/2 - assert "A" in 
parent.ports - assert_allclose(parent.ports["A"].offset, [100, 105], atol=1e-10) - assert parent.ports["A"].rotation is not None - assert_allclose(parent.ports["A"].rotation, numpy.pi / 2, atol=1e-10) - - -def test_data_to_ports_hierarchical_scaled_ref() -> None: - lib = Library() - - child = Pattern() - layer = (10, 0) - child.label(layer=layer, string="A:type1 0", offset=(5, 0)) - lib["child"] = child - - parent = Pattern() - parent.ref("child", offset=(100, 100), rotation=numpy.pi / 2, scale=2) - - data_to_ports([layer], lib, parent, max_depth=1) - - assert "A" in parent.ports - assert_allclose(parent.ports["A"].offset, [100, 110], atol=1e-10) - assert parent.ports["A"].rotation is not None - assert_allclose(parent.ports["A"].rotation, numpy.pi / 2, atol=1e-10) - - -def test_data_to_ports_hierarchical_repeated_ref_warns_and_keeps_best_effort( - caplog: pytest.LogCaptureFixture, - ) -> None: - lib = Library() - - child = Pattern() - layer = (10, 0) - child.label(layer=layer, string="A:type1 0", offset=(5, 0)) - lib["child"] = child - - parent = Pattern() - parent.ref("child", repetition=Grid(a_vector=(100, 0), a_count=3)) - - caplog.set_level("WARNING") - data_to_ports([layer], lib, parent, max_depth=1) - - assert "A" in parent.ports - assert_allclose(parent.ports["A"].offset, [5, 0], atol=1e-10) - assert any("importing only the base instance ports" in record.message for record in caplog.records) - - -def test_data_to_ports_hierarchical_collision_is_atomic() -> None: - lib = Library() - - child = Pattern() - layer = (10, 0) - child.label(layer=layer, string="A:type1 0", offset=(5, 0)) - lib["child"] = child - - parent = Pattern() - parent.ref("child", offset=(0, 0)) - parent.ref("child", offset=(10, 0)) - - with pytest.raises(PortError, match="Device ports conflict with existing ports"): - data_to_ports([layer], lib, parent, max_depth=1) - - assert not parent.ports - - -def test_data_to_ports_flat_bad_angle_warns_and_skips( - caplog: pytest.LogCaptureFixture, - ) 
-> None: - layer = (10, 0) - pat = Pattern() - pat.label(layer=layer, string="A:type1 nope", offset=(5, 0)) - - caplog.set_level("WARNING") - data_to_ports([layer], {}, pat) - - assert not pat.ports - assert any('bad angle' in record.message for record in caplog.records) diff --git a/masque/test/test_raw_constructors.py b/masque/test/test_raw_constructors.py deleted file mode 100644 index 2f86ba0..0000000 --- a/masque/test/test_raw_constructors.py +++ /dev/null @@ -1,97 +0,0 @@ -import numpy -from numpy import pi -from numpy.testing import assert_allclose - -from ..shapes import Arc, Circle, Ellipse, Path, Text - - -def test_circle_raw_constructor_matches_public() -> None: - raw = Circle._from_raw( - radius=5.0, - offset=numpy.array([1.0, 2.0]), - annotations={'1': ['circle']}, - ) - public = Circle( - radius=5.0, - offset=(1.0, 2.0), - annotations={'1': ['circle']}, - ) - assert raw == public - - -def test_ellipse_raw_constructor_matches_public() -> None: - raw = Ellipse._from_raw( - radii=numpy.array([3.0, 5.0]), - offset=numpy.array([1.0, 2.0]), - rotation=5 * pi / 2, - annotations={'2': ['ellipse']}, - ) - public = Ellipse( - radii=(3.0, 5.0), - offset=(1.0, 2.0), - rotation=5 * pi / 2, - annotations={'2': ['ellipse']}, - ) - assert raw == public - - -def test_arc_raw_constructor_matches_public() -> None: - raw = Arc._from_raw( - radii=numpy.array([10.0, 6.0]), - angles=numpy.array([0.0, pi / 2]), - width=2.0, - offset=numpy.array([1.0, 2.0]), - rotation=5 * pi / 2, - annotations={'3': ['arc']}, - ) - public = Arc( - radii=(10.0, 6.0), - angles=(0.0, pi / 2), - width=2.0, - offset=(1.0, 2.0), - rotation=5 * pi / 2, - annotations={'3': ['arc']}, - ) - assert raw == public - - -def test_path_raw_constructor_matches_public() -> None: - raw = Path._from_raw( - vertices=numpy.array([[0.0, 0.0], [10.0, 0.0], [10.0, 5.0]]), - width=2.0, - cap=Path.Cap.SquareCustom, - cap_extensions=numpy.array([1.0, 3.0]), - annotations={'4': ['path']}, - ) - public = Path( - 
vertices=((0.0, 0.0), (10.0, 0.0), (10.0, 5.0)), - width=2.0, - cap=Path.Cap.SquareCustom, - cap_extensions=(1.0, 3.0), - annotations={'4': ['path']}, - ) - assert raw == public - assert raw.cap_extensions is not None - assert_allclose(raw.cap_extensions, [1.0, 3.0]) - - -def test_text_raw_constructor_matches_public() -> None: - raw = Text._from_raw( - string='RAW', - height=12.0, - font_path='font.otf', - offset=numpy.array([1.0, 2.0]), - rotation=5 * pi / 2, - mirrored=True, - annotations={'5': ['text']}, - ) - public = Text( - string='RAW', - height=12.0, - font_path='font.otf', - offset=(1.0, 2.0), - rotation=5 * pi / 2, - mirrored=True, - annotations={'5': ['text']}, - ) - assert raw == public diff --git a/masque/test/test_rect_collection.py b/masque/test/test_rect_collection.py deleted file mode 100644 index 449f4fa..0000000 --- a/masque/test/test_rect_collection.py +++ /dev/null @@ -1,70 +0,0 @@ -import copy - -import numpy -import pytest -from numpy.testing import assert_allclose, assert_equal - -from ..error import PatternError -from ..shapes import Polygon, RectCollection - - -def test_rect_collection_init_and_to_polygons() -> None: - rects = RectCollection([[10, 10, 12, 12], [0, 0, 5, 5]]) - assert_equal(rects.rects, [[0, 0, 5, 5], [10, 10, 12, 12]]) - - polys = rects.to_polygons() - assert len(polys) == 2 - assert all(isinstance(poly, Polygon) for poly in polys) - assert_equal(polys[0].vertices, [[0, 0], [0, 5], [5, 5], [5, 0]]) - - -def test_rect_collection_rejects_invalid_rects() -> None: - with pytest.raises(PatternError): - RectCollection([[0, 0, 1]]) - with pytest.raises(PatternError): - RectCollection([[5, 0, 1, 2]]) - with pytest.raises(PatternError): - RectCollection([[0, 5, 1, 2]]) - - -def test_rect_collection_raw_constructor_matches_public() -> None: - raw = RectCollection._from_raw( - rects=numpy.array([[10.0, 10.0, 12.0, 12.0], [0.0, 0.0, 5.0, 5.0]]), - annotations={'1': ['rects']}, - ) - public = RectCollection( - [[0, 0, 5, 5], [10, 10, 
12, 12]], - annotations={'1': ['rects']}, - ) - assert raw == public - assert_equal(raw.get_bounds_single(), [[0, 0], [12, 12]]) - - -def test_rect_collection_manhattan_transforms() -> None: - rects = RectCollection([[0, 0, 2, 4], [10, 20, 12, 22]]) - - mirrored = copy.deepcopy(rects).mirror(1) - assert_equal(mirrored.rects, [[-2, 0, 0, 4], [-12, 20, -10, 22]]) - - scaled = copy.deepcopy(rects).scale_by(-2) - assert_equal(scaled.rects, [[-4, -8, 0, 0], [-24, -44, -20, -40]]) - - rotated = copy.deepcopy(rects).rotate(numpy.pi / 2) - assert_equal(rotated.rects, [[-4, 0, 0, 2], [-22, 10, -20, 12]]) - - -def test_rect_collection_non_manhattan_rotation_raises() -> None: - rects = RectCollection([[0, 0, 2, 4]]) - with pytest.raises(PatternError, match='Manhattan rotations'): - rects.rotate(numpy.pi / 4) - - -def test_rect_collection_normalized_form_rebuild_is_independent() -> None: - rects = RectCollection([[0, 0, 2, 4], [10, 20, 12, 22]]) - _intrinsic, extrinsic, rebuild = rects.normalized_form(2) - - clone = rebuild() - clone.rects[:] = [[1, 1, 2, 2], [3, 3, 4, 4]] - - assert_allclose(extrinsic[0], [6, 11.5]) - assert_equal(rects.rects, [[0, 0, 2, 4], [10, 20, 12, 22]]) diff --git a/masque/test/test_ref.py b/masque/test/test_ref.py deleted file mode 100644 index de330fa..0000000 --- a/masque/test/test_ref.py +++ /dev/null @@ -1,111 +0,0 @@ -from typing import cast, TYPE_CHECKING -import pytest -from numpy.testing import assert_equal, assert_allclose -from numpy import pi - -from ..error import MasqueError -from ..pattern import Pattern -from ..ref import Ref -from ..repetition import Grid - -if TYPE_CHECKING: - from ..shapes import Polygon - - -def test_ref_init() -> None: - ref = Ref(offset=(10, 20), rotation=pi / 4, mirrored=True, scale=2.0) - assert_equal(ref.offset, [10, 20]) - assert ref.rotation == pi / 4 - assert ref.mirrored is True - assert ref.scale == 2.0 - - -def test_ref_as_pattern() -> None: - sub_pat = Pattern() - sub_pat.polygon((1, 0), vertices=[[0, 
0], [1, 0], [0, 1]]) - - ref = Ref(offset=(10, 10), rotation=pi / 2, scale=2.0) - transformed_pat = ref.as_pattern(sub_pat) - - # Check transformed shape - shape = cast("Polygon", transformed_pat.shapes[(1, 0)][0]) - # ref.as_pattern deepcopies sub_pat then applies transformations: - # 1. pattern.scale_by(2) -> vertices [[0,0], [2,0], [0,2]] - # 2. pattern.rotate_around((0,0), pi/2) -> vertices [[0,0], [0,2], [-2,0]] - # 3. pattern.translate_elements((10,10)) -> vertices [[10,10], [10,12], [8,10]] - - assert_allclose(shape.vertices, [[10, 10], [10, 12], [8, 10]], atol=1e-10) - - -def test_ref_with_repetition() -> None: - sub_pat = Pattern() - sub_pat.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]]) - - rep = Grid(a_vector=(10, 0), b_vector=(0, 10), a_count=2, b_count=2) - ref = Ref(repetition=rep) - - repeated_pat = ref.as_pattern(sub_pat) - # Should have 4 shapes - assert len(repeated_pat.shapes[(1, 0)]) == 4 - - first_verts = sorted([tuple(cast("Polygon", s).vertices[0]) for s in repeated_pat.shapes[(1, 0)]]) - assert first_verts == [(0.0, 0.0), (0.0, 10.0), (10.0, 0.0), (10.0, 10.0)] - - -def test_ref_get_bounds() -> None: - sub_pat = Pattern() - sub_pat.polygon((1, 0), vertices=[[0, 0], [5, 0], [0, 5]]) - - ref = Ref(offset=(10, 10), scale=2.0) - bounds = ref.get_bounds_single(sub_pat) - # sub_pat bounds [[0,0], [5,5]] - # scaled [[0,0], [10,10]] - # translated [[10,10], [20,20]] - assert_equal(bounds, [[10, 10], [20, 20]]) - - -def test_ref_get_bounds_single_ignores_repetition_for_non_manhattan_rotation() -> None: - sub_pat = Pattern() - sub_pat.rect((1, 0), xmin=0, xmax=1, ymin=0, ymax=2) - - rep = Grid(a_vector=(5, 0), b_vector=(0, 7), a_count=3, b_count=2) - ref = Ref(offset=(10, 20), rotation=pi / 4, repetition=rep) - - bounds = ref.get_bounds_single(sub_pat) - repeated_bounds = ref.get_bounds(sub_pat) - - assert bounds is not None - assert repeated_bounds is not None - assert repeated_bounds[1, 0] > bounds[1, 0] - assert repeated_bounds[1, 1] > 
bounds[1, 1] - - -def test_ref_copy() -> None: - ref1 = Ref(offset=(1, 2), rotation=0.5, annotations={"a": [1]}) - ref2 = ref1.copy() - assert ref1 == ref2 - assert ref1 is not ref2 - - ref2.offset[0] = 100 - assert ref1.offset[0] == 1 - - -def test_ref_rejects_nonpositive_scale() -> None: - with pytest.raises(MasqueError, match='Scale must be positive'): - Ref(scale=0) - - with pytest.raises(MasqueError, match='Scale must be positive'): - Ref(scale=-1) - - -def test_ref_scale_by_rejects_nonpositive_scale() -> None: - ref = Ref(scale=2.0) - - with pytest.raises(MasqueError, match='Scale must be positive'): - ref.scale_by(-1) - - -def test_ref_eq_unrelated_objects_is_false() -> None: - ref = Ref() - assert not (ref == None) - assert not (ref == object()) diff --git a/masque/test/test_renderpather.py b/masque/test/test_renderpather.py deleted file mode 100644 index b518a1f..0000000 --- a/masque/test/test_renderpather.py +++ /dev/null @@ -1,199 +0,0 @@ -import pytest -from typing import cast, TYPE_CHECKING -from numpy.testing import assert_allclose -from numpy import pi - -from ..builder import Pather -from ..builder.tools import PathTool -from ..library import Library -from ..ports import Port - -if TYPE_CHECKING: - from ..shapes import Path - - -@pytest.fixture -def rpather_setup() -> tuple[Pather, PathTool, Library]: - lib = Library() - tool = PathTool(layer=(1, 0), width=2, ptype="wire") - rp = Pather(lib, tools=tool, auto_render=False) - rp.ports["start"] = Port((0, 0), pi / 2, ptype="wire") - return rp, tool, lib - - -def test_renderpather_basic(rpather_setup: tuple[Pather, PathTool, Library]) -> None: - rp, tool, lib = rpather_setup - # Plan two segments - rp.at("start").straight(10).straight(10) - - # Before rendering, no shapes in pattern - assert not rp.pattern.has_shapes() - assert len(rp.paths["start"]) == 2 - - # Render - rp.render() - assert rp.pattern.has_shapes() - assert len(rp.pattern.shapes[(1, 0)]) == 1 - - # Path vertices should be (0,0), (0,-10), 
(0,-20) - # transformed by start port (rot pi/2 -> 270 deg transform) - # wait, PathTool.render for opcode L uses rotation_matrix_2d(port_rot + pi) - # start_port rot pi/2. pi/2 + pi = 3pi/2. - # (10, 0) rotated 3pi/2 -> (0, -10) - # So vertices: (0,0), (0,-10), (0,-20) - path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0]) - assert len(path_shape.vertices) == 3 - assert_allclose(path_shape.vertices, [[0, 0], [0, -10], [0, -20]], atol=1e-10) - - -def test_renderpather_bend(rpather_setup: tuple[Pather, PathTool, Library]) -> None: - rp, tool, lib = rpather_setup - # Plan straight then bend - rp.at("start").straight(10).cw(10) - - rp.render() - path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0]) - # Path vertices: - # 1. Start (0,0) - # 2. Straight end: (0, -10) - # 3. Bend end: (-1, -20) - # PathTool.planL(ccw=False, length=10) returns data=[10, -1] - # start_port for 2nd segment is at (0, -10) with rotation pi/2 - # dxy = rot(pi/2 + pi) @ (10, 0) = (0, -10). So vertex at (0, -20). - # and final end_port.offset is (-1, -20). - assert len(path_shape.vertices) == 4 - assert_allclose(path_shape.vertices, [[0, 0], [0, -10], [0, -20], [-1, -20]], atol=1e-10) - - -def test_renderpather_jog_uses_native_pathtool_planS(rpather_setup: tuple[Pather, PathTool, Library]) -> None: - rp, tool, lib = rpather_setup - rp.at("start").jog(4, length=10) - - assert len(rp.paths["start"]) == 1 - assert rp.paths["start"][0].opcode == "S" - - rp.render() - path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0]) - # Native PathTool S-bends place the jog width/2 before the route end. 
- assert_allclose(path_shape.vertices, [[0, 0], [0, -9], [4, -9], [4, -10]], atol=1e-10) - assert_allclose(rp.ports["start"].offset, [4, -10], atol=1e-10) - - -def test_renderpather_mirror_preserves_planned_bend_geometry(rpather_setup: tuple[Pather, PathTool, Library]) -> None: - rp, tool, lib = rpather_setup - rp.at("start").straight(10).cw(10) - - rp.mirror(0) - rp.render() - - path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0]) - assert_allclose(path_shape.vertices, [[0, 0], [0, 10], [0, 20], [-1, 20]], atol=1e-10) - - -def test_renderpather_retool(rpather_setup: tuple[Pather, PathTool, Library]) -> None: - rp, tool1, lib = rpather_setup - tool2 = PathTool(layer=(2, 0), width=4, ptype="wire") - - rp.at("start").straight(10) - rp.retool(tool2, keys=["start"]) - rp.at("start").straight(10) - - rp.render() - # Different tools should cause different batches/shapes - assert len(rp.pattern.shapes[(1, 0)]) == 1 - assert len(rp.pattern.shapes[(2, 0)]) == 1 - - -def test_portpather_translate_only_affects_future_steps(rpather_setup: tuple[Pather, PathTool, Library]) -> None: - rp, tool, lib = rpather_setup - pp = rp.at("start") - pp.straight(10) - pp.translate((5, 0)) - pp.straight(10) - - rp.render() - - shapes = rp.pattern.shapes[(1, 0)] - assert len(shapes) == 2 - assert_allclose(cast("Path", shapes[0]).vertices, [[0, 0], [0, -10]], atol=1e-10) - assert_allclose(cast("Path", shapes[1]).vertices, [[5, -10], [5, -20]], atol=1e-10) - assert_allclose(rp.ports["start"].offset, [5, -20], atol=1e-10) - - -def test_renderpather_dead_ports() -> None: - lib = Library() - tool = PathTool(layer=(1, 0), width=1) - rp = Pather(lib, ports={"in": Port((0, 0), 0)}, tools=tool, auto_render=False) - rp.set_dead() - - # Impossible path - rp.straight("in", -10) - - # port_rot=0, forward is -x. path(-10) means moving -10 in -x direction -> +10 in x. 
- assert_allclose(rp.ports["in"].offset, [10, 0], atol=1e-10) - - # Verify no render steps were added - assert len(rp.paths["in"]) == 0 - - # Verify no geometry - rp.render() - assert not rp.pattern.has_shapes() - - -def test_renderpather_rename_port(rpather_setup: tuple[Pather, PathTool, Library]) -> None: - rp, tool, lib = rpather_setup - rp.at("start").straight(10) - # Rename port while path is planned - rp.rename_ports({"start": "new_start"}) - # Continue path on new name - rp.at("new_start").straight(10) - - assert "start" not in rp.paths - assert len(rp.paths["new_start"]) == 2 - - rp.render() - assert rp.pattern.has_shapes() - assert len(rp.pattern.shapes[(1, 0)]) == 1 - # Total length 20. start_port rot pi/2 -> 270 deg transform. - # Vertices (0,0), (0,-10), (0,-20) - path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0]) - assert_allclose(path_shape.vertices, [[0, 0], [0, -10], [0, -20]], atol=1e-10) - assert "new_start" in rp.ports - assert_allclose(rp.ports["new_start"].offset, [0, -20], atol=1e-10) - - -def test_renderpather_drop_keeps_pending_geometry_without_port(rpather_setup: tuple[Pather, PathTool, Library]) -> None: - rp, tool, lib = rpather_setup - rp.at("start").straight(10).drop() - - assert "start" not in rp.ports - assert len(rp.paths["start"]) == 1 - - rp.render() - assert rp.pattern.has_shapes() - assert "start" not in rp.ports - path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0]) - assert_allclose(path_shape.vertices, [[0, 0], [0, -10]], atol=1e-10) - - -def test_pathtool_traceL_bend_geometry_matches_ports() -> None: - tool = PathTool(layer=(1, 0), width=2, ptype="wire") - - tree = tool.traceL(True, 10) - pat = tree.top_pattern() - path_shape = cast("Path", pat.shapes[(1, 0)][0]) - - assert_allclose(path_shape.vertices, [[0, 0], [10, 0], [10, 1]], atol=1e-10) - assert_allclose(pat.ports["B"].offset, [10, 1], atol=1e-10) - - -def test_pathtool_traceS_geometry_matches_ports() -> None: - tool = PathTool(layer=(1, 0), width=2, 
ptype="wire") - - tree = tool.traceS(10, 4) - pat = tree.top_pattern() - path_shape = cast("Path", pat.shapes[(1, 0)][0]) - - assert_allclose(path_shape.vertices, [[0, 0], [9, 0], [9, 4], [10, 4]], atol=1e-10) - assert_allclose(pat.ports["B"].offset, [10, 4], atol=1e-10) - assert_allclose(pat.ports["B"].rotation, pi, atol=1e-10) diff --git a/masque/test/test_repetition.py b/masque/test/test_repetition.py deleted file mode 100644 index 0d0be41..0000000 --- a/masque/test/test_repetition.py +++ /dev/null @@ -1,91 +0,0 @@ -import pytest -from numpy.testing import assert_equal, assert_allclose -from numpy import pi - -from ..repetition import Grid, Arbitrary -from ..error import PatternError - - -def test_grid_displacements() -> None: - # 2x2 grid - grid = Grid(a_vector=(10, 0), b_vector=(0, 5), a_count=2, b_count=2) - disps = sorted([tuple(d) for d in grid.displacements]) - assert disps == [(0.0, 0.0), (0.0, 5.0), (10.0, 0.0), (10.0, 5.0)] - - -def test_grid_1d() -> None: - grid = Grid(a_vector=(10, 0), a_count=3) - disps = sorted([tuple(d) for d in grid.displacements]) - assert disps == [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)] - - -def test_grid_rotate() -> None: - grid = Grid(a_vector=(10, 0), a_count=2) - grid.rotate(pi / 2) - assert_allclose(grid.a_vector, [0, 10], atol=1e-10) - - -def test_grid_get_bounds() -> None: - grid = Grid(a_vector=(10, 0), b_vector=(0, 5), a_count=2, b_count=2) - bounds = grid.get_bounds() - assert_equal(bounds, [[0, 0], [10, 5]]) - - -def test_arbitrary_displacements() -> None: - pts = [[0, 0], [10, 20], [-5, 30]] - arb = Arbitrary(pts) - # They should be sorted by displacements.setter - disps = arb.displacements - assert len(disps) == 3 - assert any((disps == [0, 0]).all(axis=1)) - assert any((disps == [10, 20]).all(axis=1)) - assert any((disps == [-5, 30]).all(axis=1)) - - -def test_arbitrary_transform() -> None: - arb = Arbitrary([[10, 0]]) - arb.rotate(pi / 2) - assert_allclose(arb.displacements, [[0, 10]], atol=1e-10) - - arb.mirror(0) 
# mirror(axis=0) in repetition.py flips the y coordinate: - # self.displacements[:, 1 - axis] *= -1 - # so axis=0 negates the y values - assert_allclose(arb.displacements, [[0, -10]], atol=1e-10) - - -def test_arbitrary_empty_repetition_is_allowed() -> None: - arb = Arbitrary([]) - assert arb.displacements.shape == (0, 2) - assert arb.get_bounds() is None - - -def test_arbitrary_rejects_non_nx2_displacements() -> None: - for displacements in ([[1], [2]], [[1, 2, 3]], [1, 2, 3]): - with pytest.raises(PatternError, match='displacements must be convertible to an Nx2 ndarray'): - Arbitrary(displacements) - - -def test_grid_count_setters_reject_nonpositive_values() -> None: - for attr, value, message in ( - ('a_count', 0, 'a_count'), - ('a_count', -1, 'a_count'), - ('b_count', 0, 'b_count'), - ('b_count', -1, 'b_count'), - ): - grid = Grid(a_vector=(10, 0), b_vector=(0, 5), a_count=2, b_count=2) - with pytest.raises(PatternError, match=message): - setattr(grid, attr, value) - - -def test_repetition_less_equal_includes_equality() -> None: - grid_a = Grid(a_vector=(10, 0), a_count=2) - grid_b = Grid(a_vector=(10, 0), a_count=2) - assert grid_a == grid_b - assert grid_a <= grid_b - assert grid_a >= grid_b - - arb_a = Arbitrary([[0, 0], [1, 0]]) - arb_b = Arbitrary([[0, 0], [1, 0]]) - assert arb_a == arb_b - assert arb_a <= arb_b - assert arb_a >= arb_b diff --git a/masque/test/test_rotation_consistency.py b/masque/test/test_rotation_consistency.py deleted file mode 100644 index f574f52..0000000 --- a/masque/test/test_rotation_consistency.py +++ /dev/null @@ -1,133 +0,0 @@ - -from typing import cast -import numpy as np -from numpy.testing import assert_allclose -from ..pattern import Pattern -from ..ref import Ref -from ..label import Label -from ..repetition import Grid - -def test_ref_rotate_intrinsic() -> None: - # Intrinsic rotate() should NOT affect repetition - rep = Grid(a_vector=(10, 0), a_count=2) - ref = Ref(repetition=rep) - - ref.rotate(np.pi/2) - - 
assert_allclose(ref.rotation, np.pi/2, atol=1e-10) - # Grid vector should still be (10, 0) - assert ref.repetition is not None - assert_allclose(cast('Grid', ref.repetition).a_vector, [10, 0], atol=1e-10) - -def test_ref_rotate_around_extrinsic() -> None: - # Extrinsic rotate_around() SHOULD affect repetition - rep = Grid(a_vector=(10, 0), a_count=2) - ref = Ref(repetition=rep) - - ref.rotate_around((0, 0), np.pi/2) - - assert_allclose(ref.rotation, np.pi/2, atol=1e-10) - # Grid vector should be rotated to (0, 10) - assert ref.repetition is not None - assert_allclose(cast('Grid', ref.repetition).a_vector, [0, 10], atol=1e-10) - -def test_pattern_rotate_around_extrinsic() -> None: - # Pattern.rotate_around() SHOULD affect repetition of its elements - rep = Grid(a_vector=(10, 0), a_count=2) - ref = Ref(repetition=rep) - - pat = Pattern() - pat.refs['cell'].append(ref) - - pat.rotate_around((0, 0), np.pi/2) - - # Check the ref inside the pattern - ref_in_pat = pat.refs['cell'][0] - assert_allclose(ref_in_pat.rotation, np.pi/2, atol=1e-10) - # Grid vector should be rotated to (0, 10) - assert ref_in_pat.repetition is not None - assert_allclose(cast('Grid', ref_in_pat.repetition).a_vector, [0, 10], atol=1e-10) - -def test_label_rotate_around_extrinsic() -> None: - # Extrinsic rotate_around() SHOULD affect repetition of labels - rep = Grid(a_vector=(10, 0), a_count=2) - lbl = Label("test", repetition=rep, offset=(5, 0)) - - lbl.rotate_around((0, 0), np.pi/2) - - # Label offset should be (0, 5) - assert_allclose(lbl.offset, [0, 5], atol=1e-10) - # Grid vector should be rotated to (0, 10) - assert lbl.repetition is not None - assert_allclose(cast('Grid', lbl.repetition).a_vector, [0, 10], atol=1e-10) - -def test_pattern_rotate_elements_intrinsic() -> None: - # rotate_elements() should NOT affect repetition - rep = Grid(a_vector=(10, 0), a_count=2) - ref = Ref(repetition=rep) - - pat = Pattern() - pat.refs['cell'].append(ref) - - pat.rotate_elements(np.pi/2) - - ref_in_pat 
= pat.refs['cell'][0] - assert_allclose(ref_in_pat.rotation, np.pi/2, atol=1e-10) - # Grid vector should still be (10, 0) - assert ref_in_pat.repetition is not None - assert_allclose(cast('Grid', ref_in_pat.repetition).a_vector, [10, 0], atol=1e-10) - -def test_pattern_rotate_element_centers_extrinsic() -> None: - # rotate_element_centers() SHOULD affect repetition and offset - rep = Grid(a_vector=(10, 0), a_count=2) - ref = Ref(repetition=rep, offset=(5, 0)) - - pat = Pattern() - pat.refs['cell'].append(ref) - - pat.rotate_element_centers(np.pi/2) - - ref_in_pat = pat.refs['cell'][0] - # Offset should be (0, 5) - assert_allclose(ref_in_pat.offset, [0, 5], atol=1e-10) - # Grid vector should be rotated to (0, 10) - assert ref_in_pat.repetition is not None - assert_allclose(cast('Grid', ref_in_pat.repetition).a_vector, [0, 10], atol=1e-10) - # Ref rotation should NOT be changed - assert_allclose(ref_in_pat.rotation, 0, atol=1e-10) - -def test_pattern_mirror_elements_intrinsic() -> None: - # mirror_elements() should NOT affect repetition or offset - rep = Grid(a_vector=(10, 5), a_count=2) - ref = Ref(repetition=rep, offset=(5, 2)) - - pat = Pattern() - pat.refs['cell'].append(ref) - - pat.mirror_elements(axis=0) # Mirror across x (flip y) - - ref_in_pat = pat.refs['cell'][0] - assert ref_in_pat.mirrored is True - # Repetition and offset should be unchanged - assert ref_in_pat.repetition is not None - assert_allclose(cast('Grid', ref_in_pat.repetition).a_vector, [10, 5], atol=1e-10) - assert_allclose(ref_in_pat.offset, [5, 2], atol=1e-10) - -def test_pattern_mirror_element_centers_extrinsic() -> None: - # mirror_element_centers() SHOULD affect repetition and offset - rep = Grid(a_vector=(10, 5), a_count=2) - ref = Ref(repetition=rep, offset=(5, 2)) - - pat = Pattern() - pat.refs['cell'].append(ref) - - pat.mirror_element_centers(axis=0) # Mirror across x (flip y) - - ref_in_pat = pat.refs['cell'][0] - # Offset should be (5, -2) - assert_allclose(ref_in_pat.offset, [5, 
-2], atol=1e-10) - # Grid vector should be (10, -5) - assert ref_in_pat.repetition is not None - assert_allclose(cast('Grid', ref_in_pat.repetition).a_vector, [10, -5], atol=1e-10) - # Ref mirrored state should NOT be changed - assert ref_in_pat.mirrored is False diff --git a/masque/test/test_shape_advanced.py b/masque/test/test_shape_advanced.py deleted file mode 100644 index 689df2a..0000000 --- a/masque/test/test_shape_advanced.py +++ /dev/null @@ -1,244 +0,0 @@ -from pathlib import Path -import pytest -import numpy -from numpy.testing import assert_equal, assert_allclose -from numpy import pi - -from ..shapes import Arc, Ellipse, Circle, Polygon, Path as MPath, Text, PolyCollection -from ..error import PatternError - - -# 1. Text shape tests -def test_text_to_polygons() -> None: - pytest.importorskip("freetype") - font_path = "/usr/share/fonts/truetype/dejavu/DejaVuMathTeXGyre.ttf" - if not Path(font_path).exists(): - pytest.skip("Font file not found") - - t = Text("Hi", height=10, font_path=font_path) - polys = t.to_polygons() - assert len(polys) > 0 - assert all(isinstance(p, Polygon) for p in polys) - - # Check that it advances - # Character 'H' and 'i' should have different vertices - # Each character is a set of polygons. We check the mean x of vertices for each character. 
- char_x_means = [p.vertices[:, 0].mean() for p in polys] - assert len(set(char_x_means)) >= 2 - - -def test_text_bounds_and_normalized_form() -> None: - pytest.importorskip("freetype") - font_path = "/usr/share/fonts/truetype/dejavu/DejaVuMathTeXGyre.ttf" - if not Path(font_path).exists(): - pytest.skip("Font file not found") - - text = Text("Hi", height=10, font_path=font_path) - _intrinsic, extrinsic, ctor = text.normalized_form(5) - normalized = ctor() - - assert extrinsic[1] == 2 - assert normalized.height == 5 - - bounds = text.get_bounds_single() - assert bounds is not None - assert numpy.isfinite(bounds).all() - assert numpy.all(bounds[1] > bounds[0]) - - -def test_text_mirroring_affects_comparison() -> None: - text = Text("A", height=10, font_path="dummy.ttf") - mirrored = Text("A", height=10, font_path="dummy.ttf", mirrored=True) - - assert text != mirrored - assert (text < mirrored) != (mirrored < text) - - -# 2. Manhattanization tests -def test_manhattanize() -> None: - pytest.importorskip("float_raster") - pytest.importorskip("skimage.measure") - # Diamond shape - poly = Polygon([[0, 5], [5, 10], [10, 5], [5, 0]]) - grid = numpy.arange(0, 11, 1) - - manhattan_polys = poly.manhattanize(grid, grid) - assert len(manhattan_polys) >= 1 - for mp in manhattan_polys: - # Check that all edges are axis-aligned - dv = numpy.diff(mp.vertices, axis=0) - # For each segment, either dx or dy must be zero - assert numpy.all((dv[:, 0] == 0) | (dv[:, 1] == 0)) - - -# 3. Comparison and Sorting tests -def test_shape_comparisons() -> None: - c1 = Circle(radius=10) - c2 = Circle(radius=20) - assert c1 < c2 - assert not (c2 < c1) - - p1 = Polygon([[0, 0], [10, 0], [10, 10]]) - p2 = Polygon([[0, 0], [10, 0], [10, 11]]) # Different vertex - assert p1 < p2 - - # Different types - assert c1 < p1 or p1 < c1 - assert (c1 < p1) != (p1 < c1) - - -# 4. 
Arc/Path Edge Cases -def test_arc_edge_cases() -> None: - # Wrapped arc (> 360 deg) - a = Arc(radii=(10, 10), angles=(0, 3 * pi), width=2) - a.to_polygons(num_vertices=64) - # Should basically be a ring - bounds = a.get_bounds_single() - assert_allclose(bounds, [[-11, -11], [11, 11]], atol=1e-10) - - -def test_rotated_ellipse_bounds_match_polygonized_geometry() -> None: - ellipse = Ellipse(radii=(10, 20), rotation=pi / 4, offset=(100, 200)) - bounds = ellipse.get_bounds_single() - poly_bounds = ellipse.to_polygons(num_vertices=8192)[0].get_bounds_single() - assert_allclose(bounds, poly_bounds, atol=1e-3) - - -def test_rotated_arc_bounds_match_polygonized_geometry() -> None: - arc = Arc(radii=(10, 20), angles=(0, pi), width=2, rotation=pi / 4, offset=(100, 200)) - bounds = arc.get_bounds_single() - poly_bounds = arc.to_polygons(num_vertices=8192)[0].get_bounds_single() - assert_allclose(bounds, poly_bounds, atol=1e-3) - - -def test_curve_polygonizers_clamp_large_max_arclen() -> None: - for shape in ( - Circle(radius=10), - Ellipse(radii=(10, 20)), - Arc(radii=(10, 20), angles=(0, 1), width=2), - ): - polys = shape.to_polygons(num_vertices=None, max_arclen=1e9) - assert len(polys) == 1 - assert len(polys[0].vertices) >= 3 - - -def test_arc_polygonization_rejects_nan_implied_arclen() -> None: - arc = Arc(radii=(10, 20), angles=(0, numpy.nan), width=2) - with pytest.raises(PatternError, match='valid max_arclen'): - arc.to_polygons(num_vertices=24) - - -def test_ellipse_integer_radii_scale_cleanly() -> None: - ellipse = Ellipse(radii=(10, 20)) - ellipse.scale_by(0.5) - assert_allclose(ellipse.radii, [5, 10]) - - -def test_arc_rejects_zero_radii_up_front() -> None: - with pytest.raises(PatternError, match='Radii must be positive'): - Arc(radii=(0, 5), angles=(0, 1), width=1) - with pytest.raises(PatternError, match='Radii must be positive'): - Arc(radii=(5, 0), angles=(0, 1), width=1) - with pytest.raises(PatternError, match='Radii must be positive'): - Arc(radii=(0, 0), 
angles=(0, 1), width=1) - - -def test_path_edge_cases() -> None: - # Zero-length segments - p = MPath(vertices=[[0, 0], [0, 0], [10, 0]], width=2) - polys = p.to_polygons() - assert len(polys) == 1 - assert_equal(polys[0].get_bounds_single(), [[0, -1], [10, 1]]) - - -# 5. PolyCollection with holes -def test_poly_collection_holes() -> None: - # Outer square plus an inner square. - # PolyCollection has no explicit hole support; a masque Polygon is a single - # implicitly-closed boundary, so holes are normally represented with multiple - # polygons or winding rules (pyclipper handles holes in connectivity.py). - # Here we only verify that a PolyCollection holding two boundaries - # round-trips cleanly through to_polygons(). - - # Build a PolyCollection with two polygons - verts = [ - [0, 0], - [10, 0], - [10, 10], - [0, 10], # Poly 1 - [2, 2], - [2, 8], - [8, 8], - [8, 2], # Poly 2 - ] - offsets = [0, 4] - pc = PolyCollection(verts, offsets) - polys = pc.to_polygons() - assert len(polys) == 2 - assert_equal(polys[0].vertices, [[0, 0], [10, 0], [10, 10], [0, 10]]) - assert_equal(polys[1].vertices, [[2, 2], [2, 8], [8, 8], [8, 2]]) - - -def test_poly_collection_constituent_empty() -> None: - # One real triangle, one "empty" polygon (zero vertices), one real square. - # PolyCollection does not check the vertex count of each constituent, - # but Polygon itself rejects an empty vertex list, so the empty slice - # should surface as an error when converting to polygons. - verts = [ - [0, 0], - [1, 0], - [0, 1], # Tri - # Empty space - [10, 10], - [11, 10], - [11, 11], - [10, 11], # Square - ] - offsets = [0, 3, 3] # slices: [0:3] (triangle), [3:3] (empty), [3:7] (square) - # A repeated offset is accepted at construction; it yields a zero-length slice. 
- # vertex_slices uses zip(offsets, chain(offsets[1:], [len(verts)])), - # so offsets = [0, 3, 3] produces slices [0:3], [3:3], and [3:7]. - offsets = [0, 3, 3] - pc = PolyCollection(verts, offsets) - # to_polygons() calls Polygon(vertices=vv) for each slice. - # The empty slice [3:3] yields Polygon(vertices=[]), which the - # vertices setter rejects, so the conversion raises PatternError. - # (PolyCollection itself does not validate per-slice vertex counts.) - with pytest.raises(PatternError): - pc.to_polygons() - - -def test_poly_collection_valid() -> None: - verts = [[0, 0], [1, 0], [0, 1], [10, 10], [11, 10], [11, 11], [10, 11]] - offsets = [0, 3] - pc = PolyCollection(verts, offsets) - assert len(pc.to_polygons()) == 2 - shapes = [Circle(radius=20), Circle(radius=10), Polygon([[0, 0], [10, 0], [10, 10]]), Ellipse(radii=(5, 5))] - sorted_shapes = sorted(shapes) - assert len(sorted_shapes) == 4 - # Sorting mixed shape types should not crash and should be stable - assert sorted(sorted_shapes) == sorted_shapes - - -def test_poly_collection_normalized_form_reconstruction_is_independent() -> None: - pc = PolyCollection([[0, 0], [1, 0], [0, 1]], [0]) - _intrinsic, _extrinsic, rebuild = pc.normalized_form(1) - - clone = rebuild() - clone.vertex_offsets[:] = [5] - - assert_equal(pc.vertex_offsets, [0]) - assert_equal(clone.vertex_offsets, [5]) - - -def test_poly_collection_normalized_form_rebuilds_independent_clones() -> None: - pc = PolyCollection([[0, 0], [1, 0], [0, 1]], [0]) - _intrinsic, _extrinsic, rebuild = pc.normalized_form(1) - - first = rebuild() - second = rebuild() - first.vertex_offsets[:] = [7] - - assert_equal(first.vertex_offsets, [7]) - assert_equal(second.vertex_offsets, [0]) - assert_equal(pc.vertex_offsets, [0]) diff --git a/masque/test/test_shapes.py b/masque/test/test_shapes.py deleted file mode 100644 index b19d6bc..0000000 --- a/masque/test/test_shapes.py +++ /dev/null @@ -1,142 +0,0 @@ -import numpy -from numpy.testing import assert_equal, assert_allclose -from numpy import pi - -from ..shapes import Arc, Ellipse, Circle, Polygon, PolyCollection 
- - -def test_poly_collection_init() -> None: - # Two squares: [[0,0], [1,0], [1,1], [0,1]] and [[10,10], [11,10], [11,11], [10,11]] - verts = [[0, 0], [1, 0], [1, 1], [0, 1], [10, 10], [11, 10], [11, 11], [10, 11]] - offsets = [0, 4] - pc = PolyCollection(vertex_lists=verts, vertex_offsets=offsets) - assert len(list(pc.polygon_vertices)) == 2 - assert_equal(pc.get_bounds_single(), [[0, 0], [11, 11]]) - - -def test_poly_collection_to_polygons() -> None: - verts = [[0, 0], [1, 0], [1, 1], [0, 1], [10, 10], [11, 10], [11, 11], [10, 11]] - offsets = [0, 4] - pc = PolyCollection(vertex_lists=verts, vertex_offsets=offsets) - polys = pc.to_polygons() - assert len(polys) == 2 - assert_equal(polys[0].vertices, [[0, 0], [1, 0], [1, 1], [0, 1]]) - assert_equal(polys[1].vertices, [[10, 10], [11, 10], [11, 11], [10, 11]]) - - -def test_circle_init() -> None: - c = Circle(radius=10, offset=(5, 5)) - assert c.radius == 10 - assert_equal(c.offset, [5, 5]) - - -def test_circle_to_polygons() -> None: - c = Circle(radius=10) - polys = c.to_polygons(num_vertices=32) - assert len(polys) == 1 - assert isinstance(polys[0], Polygon) - # A circle with 32 vertices should have vertices distributed around (0,0) - bounds = polys[0].get_bounds_single() - assert_allclose(bounds, [[-10, -10], [10, 10]], atol=1e-10) - - -def test_ellipse_init() -> None: - e = Ellipse(radii=(10, 5), offset=(1, 2), rotation=pi / 4) - assert_equal(e.radii, [10, 5]) - assert_equal(e.offset, [1, 2]) - assert e.rotation == pi / 4 - - -def test_ellipse_to_polygons() -> None: - e = Ellipse(radii=(10, 5)) - polys = e.to_polygons(num_vertices=64) - assert len(polys) == 1 - bounds = polys[0].get_bounds_single() - assert_allclose(bounds, [[-10, -5], [10, 5]], atol=1e-10) - - -def test_arc_init() -> None: - a = Arc(radii=(10, 10), angles=(0, pi / 2), width=2, offset=(0, 0)) - assert_equal(a.radii, [10, 10]) - assert_equal(a.angles, [0, pi / 2]) - assert a.width == 2 - - -def test_arc_to_polygons() -> None: - # Quarter circle 
arc - a = Arc(radii=(10, 10), angles=(0, pi / 2), width=2) - polys = a.to_polygons(num_vertices=32) - assert len(polys) == 1 - # Outer radius 11, inner radius 9 - # Quarter circle from 0 to 90 deg - bounds = polys[0].get_bounds_single() - # Min x should be 0 (inner edge start/stop or center if width is large) - # But wait, the arc is centered at 0,0. - # Outer edge goes from (11, 0) to (0, 11) - # Inner edge goes from (9, 0) to (0, 9) - # So x ranges from 0 to 11, y ranges from 0 to 11. - assert_allclose(bounds, [[0, 0], [11, 11]], atol=1e-10) - - -def test_shape_mirror() -> None: - e = Ellipse(radii=(10, 5), offset=(10, 20), rotation=pi / 4) - e.mirror(0) # Mirror across x axis (axis 0): in-place relative to offset - assert_equal(e.offset, [10, 20]) - # rotation was pi/4, mirrored(0) -> -pi/4 == 3pi/4 (mod pi) - assert_allclose(e.rotation, 3 * pi / 4, atol=1e-10) - - a = Arc(radii=(10, 10), angles=(0, pi / 4), width=2, offset=(10, 20)) - a.mirror(0) - assert_equal(a.offset, [10, 20]) - # For Arc, mirror(0) negates rotation and angles - assert_allclose(a.angles, [0, -pi / 4], atol=1e-10) - - -def test_shape_flip_across() -> None: - e = Ellipse(radii=(10, 5), offset=(10, 20), rotation=pi / 4) - e.flip_across(axis=0) # Mirror across y=0: flips y-offset - assert_equal(e.offset, [10, -20]) - # rotation also flips: -pi/4 == 3pi/4 (mod pi) - assert_allclose(e.rotation, 3 * pi / 4, atol=1e-10) - # Mirror across specific y - e = Ellipse(radii=(10, 5), offset=(10, 20)) - e.flip_across(y=10) # Mirror across y=10 - # y=20 mirrored across y=10 -> y=0 - assert_equal(e.offset, [10, 0]) - - -def test_shape_scale() -> None: - e = Ellipse(radii=(10, 5)) - e.scale_by(2) - assert_equal(e.radii, [20, 10]) - - a = Arc(radii=(10, 5), angles=(0, pi), width=2) - a.scale_by(0.5) - assert_equal(a.radii, [5, 2.5]) - assert a.width == 1 - - -def test_shape_arclen() -> None: - # Test that max_arclen correctly limits segment lengths - - # Ellipse - e = Ellipse(radii=(10, 5)) - # Approximate 
perimeter is ~48.4 - # With max_arclen=5, should have > 10 segments - polys = e.to_polygons(max_arclen=5) - v = polys[0].vertices - dist = numpy.sqrt(numpy.sum(numpy.diff(v, axis=0, append=v[:1]) ** 2, axis=1)) - assert numpy.all(dist <= 5.000001) - assert len(v) > 10 - - # Arc - a = Arc(radii=(10, 10), angles=(0, pi / 2), width=2) - # Outer perimeter is 11 * pi/2 ~ 17.27 - # Inner perimeter is 9 * pi/2 ~ 14.14 - # With max_arclen=2, should have > 8 segments on outer edge - polys = a.to_polygons(max_arclen=2) - v = polys[0].vertices - # Arc polygons are closed, but contain both inner and outer edges and caps - # Let's just check that all segment lengths are within limit - dist = numpy.sqrt(numpy.sum(numpy.diff(v, axis=0, append=v[:1]) ** 2, axis=1)) - assert numpy.all(dist <= 2.000001) diff --git a/masque/test/test_svg.py b/masque/test/test_svg.py deleted file mode 100644 index c0dcd97..0000000 --- a/masque/test/test_svg.py +++ /dev/null @@ -1,100 +0,0 @@ -from pathlib import Path -import xml.etree.ElementTree as ET - -import numpy -import pytest -from numpy.testing import assert_allclose - -pytest.importorskip("svgwrite") - -from ..library import Library -from ..pattern import Pattern -from ..file import svg - - -SVG_NS = "{http://www.w3.org/2000/svg}" -XLINK_HREF = "{http://www.w3.org/1999/xlink}href" - - -def _child_transform(svg_path: Path) -> tuple[float, ...]: - root = ET.fromstring(svg_path.read_text()) - for use in root.iter(f"{SVG_NS}use"): - if use.attrib.get(XLINK_HREF) == "#child": - raw = use.attrib["transform"] - assert raw.startswith("matrix(") and raw.endswith(")") - return tuple(float(value) for value in raw[7:-1].split()) - raise AssertionError("No child reference found in SVG output") - - -def test_svg_ref_rotation_uses_correct_affine_transform(tmp_path: Path) -> None: - lib = Library() - child = Pattern() - child.polygon("1", vertices=[[0, 0], [1, 0], [0, 1]]) - lib["child"] = child - - top = Pattern() - top.ref("child", offset=(3, 4), 
rotation=numpy.pi / 2, scale=2) - lib["top"] = top - - svg_path = tmp_path / "rotation.svg" - svg.writefile(lib, "top", str(svg_path)) - - assert_allclose(_child_transform(svg_path), (0, 2, -2, 0, 3, 4), atol=1e-10) - - -def test_svg_ref_mirroring_changes_affine_transform(tmp_path: Path) -> None: - base = Library() - child = Pattern() - child.polygon("1", vertices=[[0, 0], [1, 0], [0, 1]]) - base["child"] = child - - top_plain = Pattern() - top_plain.ref("child", offset=(3, 4), rotation=numpy.pi / 2, scale=2, mirrored=False) - base["plain"] = top_plain - - plain_path = tmp_path / "plain.svg" - svg.writefile(base, "plain", str(plain_path)) - plain_transform = _child_transform(plain_path) - - mirrored = Library() - mirrored["child"] = child.deepcopy() - top_mirrored = Pattern() - top_mirrored.ref("child", offset=(3, 4), rotation=numpy.pi / 2, scale=2, mirrored=True) - mirrored["mirrored"] = top_mirrored - - mirrored_path = tmp_path / "mirrored.svg" - svg.writefile(mirrored, "mirrored", str(mirrored_path)) - mirrored_transform = _child_transform(mirrored_path) - - assert_allclose(plain_transform, (0, 2, -2, 0, 3, 4), atol=1e-10) - assert_allclose(mirrored_transform, (0, 2, 2, 0, 3, 4), atol=1e-10) - - -def test_svg_uses_unique_ids_for_colliding_mangled_names(tmp_path: Path) -> None: - lib = Library() - first = Pattern() - first.polygon("1", vertices=[[0, 0], [1, 0], [0, 1]]) - lib["a b"] = first - - second = Pattern() - second.polygon("1", vertices=[[0, 0], [2, 0], [0, 2]]) - lib["a-b"] = second - - top = Pattern() - top.ref("a b") - top.ref("a-b", offset=(5, 0)) - lib["top"] = top - - svg_path = tmp_path / "colliding_ids.svg" - svg.writefile(lib, "top", str(svg_path)) - - root = ET.fromstring(svg_path.read_text()) - ids = [group.attrib["id"] for group in root.iter(f"{SVG_NS}g")] - top_group = next(group for group in root.iter(f"{SVG_NS}g") if group.attrib["id"] == "top") - hrefs = [use.attrib[XLINK_HREF] for use in top_group.iter(f"{SVG_NS}use")] - - assert 
len(set(ids)) == len(ids) - assert len(hrefs) == 2 - assert len(set(hrefs)) == 2 - assert all(href.startswith("#") for href in hrefs) - assert all(href[1:] in ids for href in hrefs) diff --git a/masque/test/test_utils.py b/masque/test/test_utils.py deleted file mode 100644 index ddab9cd..0000000 --- a/masque/test/test_utils.py +++ /dev/null @@ -1,192 +0,0 @@ -from pathlib import Path - -import numpy -from numpy.testing import assert_equal, assert_allclose -from numpy import pi -import pytest - -from ..utils import remove_duplicate_vertices, remove_colinear_vertices, poly_contains_points, rotation_matrix_2d, apply_transforms, normalize_mirror, DeferredDict -from ..file.utils import tmpfile -from ..utils.curves import bezier -from ..error import PatternError - - -def test_remove_duplicate_vertices() -> None: - # Closed path (default) - v = [[0, 0], [1, 1], [1, 1], [2, 2], [0, 0]] - v_clean = remove_duplicate_vertices(v, closed_path=True) - # The last [0,0] is a duplicate of the first [0,0] if closed_path=True - assert_equal(v_clean, [[0, 0], [1, 1], [2, 2]]) - - # Open path - v_clean_open = remove_duplicate_vertices(v, closed_path=False) - assert_equal(v_clean_open, [[0, 0], [1, 1], [2, 2], [0, 0]]) - - -def test_remove_colinear_vertices() -> None: - v = [[0, 0], [1, 0], [2, 0], [2, 1], [2, 2], [1, 1], [0, 0]] - v_clean = remove_colinear_vertices(v, closed_path=True) - # [1, 0] is between [0, 0] and [2, 0] - # [2, 1] is between [2, 0] and [2, 2] - # [1, 1] is between [2, 2] and [0, 0] - assert_equal(v_clean, [[0, 0], [2, 0], [2, 2]]) - - -def test_remove_colinear_vertices_exhaustive() -> None: - # U-turn - v = [[0, 0], [10, 0], [0, 0]] - v_clean = remove_colinear_vertices(v, closed_path=False, preserve_uturns=True) - # Open path should keep ends. [10,0] is between [0,0] and [0,0]? - # They are colinear, but it's a 180 degree turn. - # We preserve 180 degree turns if preserve_uturns is True. 
- assert len(v_clean) == 3 - - v_collapsed = remove_colinear_vertices(v, closed_path=False, preserve_uturns=False) - # If not preserving u-turns, it should collapse to just the endpoints - assert len(v_collapsed) == 2 - - # 180 degree U-turn in closed path - v = [[0, 0], [10, 0], [5, 0]] - v_clean = remove_colinear_vertices(v, closed_path=True, preserve_uturns=False) - assert len(v_clean) == 2 - - -def test_poly_contains_points() -> None: - v = [[0, 0], [10, 0], [10, 10], [0, 10]] - pts = [[5, 5], [-1, -1], [10, 10], [11, 5]] - inside = poly_contains_points(v, pts) - assert_equal(inside, [True, False, True, False]) - - -def test_rotation_matrix_2d() -> None: - m = rotation_matrix_2d(pi / 2) - assert_allclose(m, [[0, -1], [1, 0]], atol=1e-10) - - -def test_rotation_matrix_non_manhattan() -> None: - # 45 degrees - m = rotation_matrix_2d(pi / 4) - s = numpy.sqrt(2) / 2 - assert_allclose(m, [[s, -s], [s, s]], atol=1e-10) - - -def test_apply_transforms() -> None: - # cumulative [x_offset, y_offset, rotation (rad), mirror_x (0 or 1)] - t1 = [10, 20, 0, 0] - t2 = [[5, 0, 0, 0], [0, 5, 0, 0]] - combined = apply_transforms(t1, t2) - assert_equal(combined, [[15, 20, 0, 0, 1], [10, 25, 0, 0, 1]]) - - -def test_apply_transforms_advanced() -> None: - # Each transform row is (x, y, rot, mir) - # Outer: mirror x (axis 0), then rotate 90 deg CCW - # apply_transforms logic for mirror uses y *= -1 (which is axis 0 mirror) - outer = [0, 0, pi / 2, 1] - - # Inner: (10, 0, 0, 0) - inner = [10, 0, 0, 0] - - combined = apply_transforms(outer, inner) - # 1. mirror inner y if outer mirrored: (10, 0) -> (10, 0) - # 2. rotate by outer rotation (pi/2): (10, 0) -> (0, 10) - # 3. 
add outer offset (0, 0) -> (0, 10) - assert_allclose(combined[0], [0, 10, pi / 2, 1, 1], atol=1e-10) - - -def test_apply_transforms_empty_inputs() -> None: - empty_outer = apply_transforms(numpy.empty((0, 5)), [[1, 2, 0, 0, 1]]) - assert empty_outer.shape == (0, 5) - - empty_inner = apply_transforms([[1, 2, 0, 0, 1]], numpy.empty((0, 5))) - assert empty_inner.shape == (0, 5) - - both_empty_tensor = apply_transforms(numpy.empty((0, 5)), numpy.empty((0, 5)), tensor=True) - assert both_empty_tensor.shape == (0, 0, 5) - - -def test_normalize_mirror_validates_length() -> None: - with pytest.raises(ValueError, match='2-item sequence'): - normalize_mirror(()) - - with pytest.raises(ValueError, match='2-item sequence'): - normalize_mirror((True,)) - - with pytest.raises(ValueError, match='2-item sequence'): - normalize_mirror((True, False, True)) - - -def test_bezier_validates_weight_length() -> None: - with pytest.raises(PatternError, match='one entry per control point'): - bezier([[0, 0], [1, 1]], [0, 0.5, 1], weights=[1]) - - with pytest.raises(PatternError, match='one entry per control point'): - bezier([[0, 0], [1, 1]], [0, 0.5, 1], weights=[1, 2, 3]) - - -def test_bezier_accepts_exact_weight_count() -> None: - samples = bezier([[0, 0], [1, 1]], [0, 0.5, 1], weights=[1, 2]) - assert_allclose(samples, [[0, 0], [2 / 3, 2 / 3], [1, 1]], atol=1e-10) - - -def test_deferred_dict_accessors_resolve_values_once() -> None: - calls = 0 - - def make_value() -> int: - nonlocal calls - calls += 1 - return 7 - - deferred = DeferredDict[str, int]() - deferred["x"] = make_value - - assert deferred.get("missing", 9) == 9 - assert deferred.get("x") == 7 - assert list(deferred.values()) == [7] - assert list(deferred.items()) == [("x", 7)] - assert calls == 1 - - -def test_deferred_dict_mutating_accessors_preserve_value_semantics() -> None: - calls = 0 - - def make_value() -> int: - nonlocal calls - calls += 1 - return 7 - - deferred = DeferredDict[str, int]() - - assert 
deferred.setdefault("x", 5) == 5 - assert deferred["x"] == 5 - - assert deferred.setdefault("y", make_value) == 7 - assert deferred["y"] == 7 - assert calls == 1 - - copy_deferred = deferred.copy() - assert isinstance(copy_deferred, DeferredDict) - assert copy_deferred["x"] == 5 - assert copy_deferred["y"] == 7 - assert calls == 1 - - assert deferred.pop("x") == 5 - assert deferred.pop("missing", 9) == 9 - assert deferred.popitem() == ("y", 7) - - -def test_tmpfile_cleans_up_on_exception(tmp_path: Path) -> None: - target = tmp_path / "out.txt" - temp_path = None - - try: - with tmpfile(target) as stream: - temp_path = Path(stream.name) - stream.write(b"hello") - raise RuntimeError("boom") - except RuntimeError: - pass - - assert temp_path is not None - assert not target.exists() - assert not temp_path.exists() diff --git a/masque/test/test_visualize.py b/masque/test/test_visualize.py deleted file mode 100644 index 4dab435..0000000 --- a/masque/test/test_visualize.py +++ /dev/null @@ -1,55 +0,0 @@ -import numpy as np -import pytest -from masque.pattern import Pattern -from masque.ports import Port -from masque.repetition import Grid - -try: - import matplotlib - HAS_MATPLOTLIB = True -except ImportError: - HAS_MATPLOTLIB = False - -@pytest.mark.skipif(not HAS_MATPLOTLIB, reason="matplotlib not installed") -def test_visualize_noninteractive(tmp_path) -> None: - """ - Test that visualize() runs and saves a file without error. - This covers the recursive transformation and collection logic. 
- """ - # Create a hierarchy - child = Pattern() - child.polygon('L1', [[0, 0], [1, 0], [1, 1], [0, 1]]) - child.ports['P1'] = Port((0.5, 0.5), 0) - - parent = Pattern() - # Add some refs with various transforms - parent.ref('child', offset=(10, 0), rotation=np.pi/4, mirrored=True, scale=2.0) - - # Add a repetition - rep = Grid(a_vector=(5, 5), a_count=2) - parent.ref('child', offset=(0, 10), repetition=rep) - - library = {'child': child} - - output_file = tmp_path / "test_plot.png" - - # Run visualize with filename to avoid showing window - parent.visualize(library=library, filename=str(output_file), ports=True) - - assert output_file.exists() - assert output_file.stat().st_size > 0 - -@pytest.mark.skipif(not HAS_MATPLOTLIB, reason="matplotlib not installed") -def test_visualize_empty() -> None: - """ Test visualizing an empty pattern. """ - pat = Pattern() - # Should not raise - pat.visualize(overdraw=True) - -@pytest.mark.skipif(not HAS_MATPLOTLIB, reason="matplotlib not installed") -def test_visualize_no_refs() -> None: - """ Test visualizing a pattern with only local shapes (no library needed). 
""" - pat = Pattern() - pat.polygon('L1', [[0, 0], [1, 0], [0, 1]]) - # Should not raise even if library is None - pat.visualize(overdraw=True) diff --git a/masque/traits/__init__.py b/masque/traits/__init__.py index cca38f3..7c7360c 100644 --- a/masque/traits/__init__.py +++ b/masque/traits/__init__.py @@ -26,11 +26,7 @@ from .scalable import ( Scalable as Scalable, ScalableImpl as ScalableImpl, ) -from .mirrorable import ( - Mirrorable as Mirrorable, - Flippable as Flippable, - FlippableImpl as FlippableImpl, - ) +from .mirrorable import Mirrorable as Mirrorable from .copyable import Copyable as Copyable from .annotatable import ( Annotatable as Annotatable, diff --git a/masque/traits/mirrorable.py b/masque/traits/mirrorable.py index 2a3a9fb..6d4ec3c 100644 --- a/masque/traits/mirrorable.py +++ b/masque/traits/mirrorable.py @@ -1,13 +1,6 @@ from typing import Self from abc import ABCMeta, abstractmethod -import numpy -from numpy.typing import NDArray - -from ..error import MasqueError -from .positionable import Positionable -from .repeatable import Repeatable - class Mirrorable(metaclass=ABCMeta): """ @@ -18,17 +11,11 @@ class Mirrorable(metaclass=ABCMeta): @abstractmethod def mirror(self, axis: int = 0) -> Self: """ - Intrinsic transformation: Mirror the entity across an axis through its origin. - This does NOT affect the object's repetition grid. - - This operation is performed relative to the object's internal origin (ignoring - its offset). For objects like `Polygon` and `Path` where the offset is forced - to (0, 0), this is equivalent to mirroring in the container's coordinate system. + Mirror the entity across an axis. Args: - axis: Axis to mirror across: - 0: X-axis (flip y coords), - 1: Y-axis (flip x coords) + axis: Axis to mirror across. 
+ Returns: self """ @@ -36,11 +23,10 @@ class Mirrorable(metaclass=ABCMeta): def mirror2d(self, across_x: bool = False, across_y: bool = False) -> Self: """ - Optionally mirror the entity across both axes through its origin. + Optionally mirror the entity across both axes Args: - across_x: Mirror across the horizontal X-axis (flip Y coordinates). - across_y: Mirror across the vertical Y-axis (flip X coordinates). + axes: (mirror_across_x, mirror_across_y) Returns: self @@ -52,61 +38,30 @@ class Mirrorable(metaclass=ABCMeta): return self -class Flippable(Positionable, metaclass=ABCMeta): - """ - Trait class for entities which can be mirrored relative to an external line. - """ - __slots__ = () - - @staticmethod - def _check_flip_args(axis: int | None = None, *, x: float | None = None, y: float | None = None) -> tuple[int, NDArray[numpy.float64]]: - pivot = numpy.zeros(2) - if axis is not None: - if x is not None or y is not None: - raise MasqueError('Cannot specify both axis and x or y') - return axis, pivot - if x is not None: - if y is not None: - raise MasqueError('Cannot specify both x and y') - return 1, pivot + (x, 0) - if y is not None: - return 0, pivot + (0, y) - raise MasqueError('Must specify one of axis, x, or y') - - @abstractmethod - def flip_across(self, axis: int | None = None, *, x: float | None = None, y: float | None = None) -> Self: - """ - Extrinsic transformation: Mirror the object across a line in the container's - coordinate system. This affects both the object's offset and its repetition grid. - - Unlike `mirror()`, this operation is performed relative to the container's origin - (e.g. the `Pattern` origin, in the case of shapes) and takes the object's offset - into account. - - Args: - axis: Axis to mirror across. 0: x-axis (flip y coord), 1: y-axis (flip x coord). - x: Vertical line x=val to mirror across. - y: Horizontal line y=val to mirror across. 
- - Returns: - self - """ - pass - - -class FlippableImpl(Flippable, Mirrorable, Repeatable, metaclass=ABCMeta): - """ - Implementation of `Flippable` for objects which are `Mirrorable`, `Positionable`, - and `Repeatable`. - """ - __slots__ = () - - def flip_across(self, axis: int | None = None, *, x: float | None = None, y: float | None = None) -> Self: - axis, pivot = self._check_flip_args(axis=axis, x=x, y=y) - self.translate(-pivot) - self.mirror(axis) - if self.repetition is not None: - self.repetition.mirror(axis) - self.offset[1 - axis] *= -1 - self.translate(+pivot) - return self +#class MirrorableImpl(Mirrorable, metaclass=ABCMeta): +# """ +# Simple implementation of `Mirrorable` +# """ +# __slots__ = () +# +# _mirrored: NDArray[numpy.bool] +# """ Whether to mirror the instance across the x and/or y axes. """ +# +# # +# # Properties +# # +# # Mirrored property +# @property +# def mirrored(self) -> NDArray[numpy.bool]: +# """ Whether to mirror across the [x, y] axes, respectively """ +# return self._mirrored +# +# @mirrored.setter +# def mirrored(self, val: Sequence[bool]) -> None: +# if is_scalar(val): +# raise MasqueError('Mirrored must be a 2-element list of booleans') +# self._mirrored = numpy.array(val, dtype=bool) +# +# # +# # Methods +# # diff --git a/masque/traits/repeatable.py b/masque/traits/repeatable.py index dbf4fad..fbd765f 100644 --- a/masque/traits/repeatable.py +++ b/masque/traits/repeatable.py @@ -76,7 +76,7 @@ class RepeatableImpl(Repeatable, Bounded, metaclass=ABCMeta): @repetition.setter def repetition(self, repetition: 'Repetition | None') -> None: - from ..repetition import Repetition #noqa: PLC0415 + from ..repetition import Repetition if repetition is not None and not isinstance(repetition, Repetition): raise MasqueError(f'{repetition} is not a valid Repetition object!') self._repetition = repetition diff --git a/masque/traits/rotatable.py b/masque/traits/rotatable.py index 436d0a2..2fa86c1 100644 --- a/masque/traits/rotatable.py 
+++ b/masque/traits/rotatable.py @@ -1,4 +1,4 @@ -from typing import Self +from typing import Self, cast, Any, TYPE_CHECKING from abc import ABCMeta, abstractmethod import numpy @@ -8,7 +8,8 @@ from numpy.typing import ArrayLike from ..error import MasqueError from ..utils import rotation_matrix_2d -from .positionable import Positionable +if TYPE_CHECKING: + from .positionable import Positionable _empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass @@ -25,8 +26,7 @@ class Rotatable(metaclass=ABCMeta): @abstractmethod def rotate(self, val: float) -> Self: """ - Intrinsic transformation: Rotate the shape around its origin (0, 0), ignoring its offset. - This does NOT affect the object's repetition grid. + Rotate the shape around its origin (0, 0), ignoring its offset. Args: val: Angle to rotate by (counterclockwise, radians) @@ -64,10 +64,6 @@ class RotatableImpl(Rotatable, metaclass=ABCMeta): # Methods # def rotate(self, rotation: float) -> Self: - """ - Intrinsic transformation: Rotate the shape around its origin (0, 0), ignoring its offset. - This does NOT affect the object's repetition grid. - """ self.rotation += rotation return self @@ -85,9 +81,9 @@ class RotatableImpl(Rotatable, metaclass=ABCMeta): return self -class Pivotable(Positionable, metaclass=ABCMeta): +class Pivotable(metaclass=ABCMeta): """ - Trait class for entities which can be rotated around a point. + Trait class for entites which can be rotated around a point. This requires that they are `Positionable` but not necessarily `Rotatable` themselves. """ __slots__ = () @@ -95,11 +91,7 @@ class Pivotable(Positionable, metaclass=ABCMeta): @abstractmethod def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self: """ - Extrinsic transformation: Rotate the object around a point in the container's - coordinate system. This affects both the object's offset and its repetition grid. 
- - For objects that are also `Rotatable`, this also performs an intrinsic - rotation of the object. + Rotate the object around a point. Args: pivot: Point (x, y) to rotate around @@ -111,21 +103,20 @@ class Pivotable(Positionable, metaclass=ABCMeta): pass -class PivotableImpl(Pivotable, Rotatable, metaclass=ABCMeta): +class PivotableImpl(Pivotable, metaclass=ABCMeta): """ Implementation of `Pivotable` for objects which are `Rotatable` - and `Positionable`. """ __slots__ = () + offset: Any # TODO see if we can get around defining `offset` in PivotableImpl + """ `[x_offset, y_offset]` """ + def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self: - from .repeatable import Repeatable #noqa: PLC0415 pivot = numpy.asarray(pivot, dtype=float) - self.translate(-pivot) - self.rotate(rotation) - if isinstance(self, Repeatable) and self.repetition is not None: - self.repetition.rotate(rotation) + cast('Positionable', self).translate(-pivot) + cast('Rotatable', self).rotate(rotation) self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset) - self.translate(+pivot) + cast('Positionable', self).translate(+pivot) return self diff --git a/masque/utils/autoslots.py b/masque/utils/autoslots.py index cef8006..e82d3db 100644 --- a/masque/utils/autoslots.py +++ b/masque/utils/autoslots.py @@ -17,12 +17,11 @@ class AutoSlots(ABCMeta): for base in bases: parents |= set(base.mro()) - slots = list(dctn.get('__slots__', ())) + slots = tuple(dctn.get('__slots__', ())) for parent in parents: if not hasattr(parent, '__annotations__'): continue - slots.extend(parent.__annotations__.keys()) + slots += tuple(parent.__annotations__.keys()) - # Deduplicate (dict to preserve order) - dctn['__slots__'] = tuple(dict.fromkeys(slots)) + dctn['__slots__'] = slots return super().__new__(cls, name, bases, dctn) diff --git a/masque/utils/boolean.py b/masque/utils/boolean.py deleted file mode 100644 index 5181fc5..0000000 --- a/masque/utils/boolean.py +++ /dev/null @@ -1,196 +0,0 @@ 
-from typing import Any, Literal -from collections.abc import Iterable -import logging - -import numpy -from numpy.typing import NDArray - -from ..shapes.polygon import Polygon -from ..error import PatternError - - -logger = logging.getLogger(__name__) - - -def _bridge_holes(outer_path: NDArray[numpy.float64], holes: list[NDArray[numpy.float64]]) -> NDArray[numpy.float64]: - """ - Bridge multiple holes into an outer boundary using zero-width slits. - """ - current_outer = outer_path - - # Sort holes by max X to potentially minimize bridge lengths or complexity - # (though not strictly necessary for correctness) - holes = sorted(holes, key=lambda h: numpy.max(h[:, 0]), reverse=True) - - for hole in holes: - # Find max X vertex of hole - max_idx = numpy.argmax(hole[:, 0]) - m = hole[max_idx] - - # Find intersection of ray (m.x, m.y) + (t, 0) with current_outer edges - best_t = numpy.inf - best_pt = None - best_edge_idx = -1 - - n = len(current_outer) - for i in range(n): - p1 = current_outer[i] - p2 = current_outer[(i + 1) % n] - - # Check if edge (p1, p2) spans m.y - if (p1[1] <= m[1] < p2[1]) or (p2[1] <= m[1] < p1[1]): - # Intersection x: - # x = p1.x + (m.y - p1.y) * (p2.x - p1.x) / (p2.y - p1.y) - t = (p1[0] + (m[1] - p1[1]) * (p2[0] - p1[0]) / (p2[1] - p1[1])) - m[0] - if 0 <= t < best_t: - best_t = t - best_pt = numpy.array([m[0] + t, m[1]]) - best_edge_idx = i - - if best_edge_idx == -1: - # Fallback: find nearest vertex if ray fails (shouldn't happen for valid hole) - dists = numpy.linalg.norm(current_outer - m, axis=1) - best_edge_idx = int(numpy.argmin(dists)) - best_pt = current_outer[best_edge_idx] - # Adjust best_edge_idx to insert AFTER this vertex - # (treating it as a degenerate edge) - - assert best_pt is not None - - # Reorder hole vertices to start at m - hole_reordered = numpy.roll(hole, -max_idx, axis=0) - - # Construct new outer: - # 1. Start of outer up to best_edge_idx - # 2. Intersection point - # 3. 
Hole vertices (starting and ending at m) - # 4. Intersection point (to close slit) - # 5. Rest of outer - - new_outer: list[NDArray[numpy.float64]] = [] - new_outer.extend(current_outer[:best_edge_idx + 1]) - new_outer.append(best_pt) - new_outer.extend(hole_reordered) - new_outer.append(hole_reordered[0]) # close hole loop at m - new_outer.append(best_pt) # back to outer - new_outer.extend(current_outer[best_edge_idx + 1:]) - - current_outer = numpy.array(new_outer) - - return current_outer - -def boolean( - subjects: Iterable[Any], - clips: Iterable[Any] | None = None, - operation: Literal['union', 'intersection', 'difference', 'xor'] = 'union', - scale: float = 1e6, - ) -> list[Polygon]: - """ - Perform a boolean operation on two sets of polygons. - - Args: - subjects: List of subjects (Polygons or vertex arrays). - clips: List of clips (Polygons or vertex arrays). - operation: The boolean operation to perform. - scale: Scaling factor for integer conversion (pyclipper uses integers). - - Returns: - A list of result Polygons. - """ - try: - import pyclipper #noqa: PLC0415 - except ImportError: - raise ImportError( - "Boolean operations require 'pyclipper'. " - "Install it with 'pip install pyclipper' or 'pip install masque[boolean]'." 
- ) from None - - op_map = { - 'union': pyclipper.CT_UNION, - 'intersection': pyclipper.CT_INTERSECTION, - 'difference': pyclipper.CT_DIFFERENCE, - 'xor': pyclipper.CT_XOR, - } - - def to_vertices(objs: Iterable[Any] | Any | None) -> list[NDArray]: - if objs is None: - return [] - if hasattr(objs, 'to_polygons') or isinstance(objs, numpy.ndarray | Polygon): - objs = (objs,) - elif not isinstance(objs, Iterable): - raise PatternError(f"Unsupported type for boolean operation: {type(objs)}") - verts = [] - for obj in objs: - if hasattr(obj, 'to_polygons'): - for p in obj.to_polygons(): - verts.append(p.vertices) - elif isinstance(obj, numpy.ndarray): - verts.append(obj) - elif isinstance(obj, Polygon): - verts.append(obj.vertices) - else: - # Try to iterate if it's an iterable of shapes - try: - for sub in obj: - if hasattr(sub, 'to_polygons'): - for p in sub.to_polygons(): - verts.append(p.vertices) - elif isinstance(sub, Polygon): - verts.append(sub.vertices) - except TypeError: - raise PatternError(f"Unsupported type for boolean operation: {type(obj)}") from None - return verts - - subject_verts = to_vertices(subjects) - clip_verts = to_vertices(clips) - - if not subject_verts: - if operation in ('union', 'xor'): - return [Polygon(vertices) for vertices in clip_verts] - return [] - - if not clip_verts: - if operation == 'intersection': - return [] - return [Polygon(vertices) for vertices in subject_verts] - - pc = pyclipper.Pyclipper() - pc.AddPaths(pyclipper.scale_to_clipper(subject_verts, scale), pyclipper.PT_SUBJECT, True) - if clip_verts: - pc.AddPaths(pyclipper.scale_to_clipper(clip_verts, scale), pyclipper.PT_CLIP, True) - - # Use GetPolyTree to distinguish between outers and holes - polytree = pc.Execute2(op_map[operation.lower()], pyclipper.PFT_NONZERO, pyclipper.PFT_NONZERO) - - result_polygons = [] - - def process_node(node: Any) -> None: - if not node.IsHole: - # This is an outer boundary - outer_path = 
numpy.array(pyclipper.scale_from_clipper(node.Contour, scale)) - - # Find immediate holes - holes = [] - for child in node.Childs: - if child.IsHole: - holes.append(numpy.array(pyclipper.scale_from_clipper(child.Contour, scale))) - - if holes: - combined_vertices = _bridge_holes(outer_path, holes) - result_polygons.append(Polygon(combined_vertices)) - else: - result_polygons.append(Polygon(outer_path)) - - # Recursively process children of holes (which are nested outers) - for child in node.Childs: - if child.IsHole: - for grandchild in child.Childs: - process_node(grandchild) - else: - # Holes are processed as children of outers - pass - - for top_node in polytree.Childs: - process_node(top_node) - - return result_polygons diff --git a/masque/utils/comparisons.py b/masque/utils/comparisons.py index bb2dfee..63981c9 100644 --- a/masque/utils/comparisons.py +++ b/masque/utils/comparisons.py @@ -9,15 +9,7 @@ def annotation2key(aaa: int | float | str) -> tuple[bool, Any]: return (isinstance(aaa, str), aaa) -def _normalized_annotations(annotations: annotations_t) -> annotations_t: - if not annotations: - return None - return annotations - - def annotations_lt(aa: annotations_t, bb: annotations_t) -> bool: - aa = _normalized_annotations(aa) - bb = _normalized_annotations(bb) if aa is None: return bb is not None elif bb is None: # noqa: RET505 @@ -44,8 +36,6 @@ def annotations_lt(aa: annotations_t, bb: annotations_t) -> bool: def annotations_eq(aa: annotations_t, bb: annotations_t) -> bool: - aa = _normalized_annotations(aa) - bb = _normalized_annotations(bb) if aa is None: return bb is None elif bb is None: # noqa: RET505 @@ -57,7 +47,7 @@ def annotations_eq(aa: annotations_t, bb: annotations_t) -> bool: keys_a = tuple(sorted(aa.keys())) keys_b = tuple(sorted(bb.keys())) if keys_a != keys_b: - return False + return keys_a < keys_b for key in keys_a: va = aa[key] diff --git a/masque/utils/curves.py b/masque/utils/curves.py index 0a6c9bd..8b3fcc4 100644 --- 
a/masque/utils/curves.py +++ b/masque/utils/curves.py @@ -2,8 +2,6 @@ import numpy from numpy.typing import ArrayLike, NDArray from numpy import pi -from ..error import PatternError - try: from numpy import trapezoid except ImportError: @@ -33,11 +31,6 @@ def bezier( tt = numpy.asarray(tt) nn = nodes.shape[0] weights = numpy.ones(nn) if weights is None else numpy.asarray(weights) - if weights.ndim != 1 or weights.shape[0] != nn: - raise PatternError( - f'weights must be a 1D array with one entry per control point; ' - f'got shape {weights.shape} for {nn} control points' - ) with numpy.errstate(divide='ignore'): umul = (tt / (1 - tt)).clip(max=1) @@ -76,25 +69,14 @@ def euler_bend( num_points_arc = num_points - 2 * num_points_spiral def gen_spiral(ll_max: float) -> NDArray[numpy.float64]: - if ll_max == 0: - return numpy.zeros((num_points_spiral, 2)) - - resolution = 100000 - qq = numpy.linspace(0, ll_max, resolution) - dx = numpy.cos(qq * qq / 2) - dy = -numpy.sin(qq * qq / 2) - - dq = ll_max / (resolution - 1) - ix = numpy.zeros(resolution) - iy = numpy.zeros(resolution) - ix[1:] = numpy.cumsum((dx[:-1] + dx[1:]) / 2) * dq - iy[1:] = numpy.cumsum((dy[:-1] + dy[1:]) / 2) * dq - - ll_target = numpy.linspace(0, ll_max, num_points_spiral) - x_target = numpy.interp(ll_target, qq, ix) - y_target = numpy.interp(ll_target, qq, iy) - - return numpy.stack((x_target, y_target), axis=1) + xx = [] + yy = [] + for ll in numpy.linspace(0, ll_max, num_points_spiral): + qq = numpy.linspace(0, ll, 1000) # integrate to current arclength + xx.append(trapezoid( numpy.cos(qq * qq / 2), qq)) + yy.append(trapezoid(-numpy.sin(qq * qq / 2), qq)) + xy_part = numpy.stack((xx, yy), axis=1) + return xy_part xy_spiral = gen_spiral(ll_max) xy_parts = [xy_spiral] @@ -117,6 +99,6 @@ def euler_bend( xy = numpy.concatenate(xy_parts) # Remove any 2x-duplicate points - xy = xy[(numpy.roll(xy, 1, axis=0) - xy > 1e-12).any(axis=1)] + xy = xy[(numpy.roll(xy, 1, axis=0) != xy).any(axis=1)] return xy diff 
--git a/masque/utils/deferreddict.py b/masque/utils/deferreddict.py index 70893c0..aff3bcc 100644 --- a/masque/utils/deferreddict.py +++ b/masque/utils/deferreddict.py @@ -1,11 +1,10 @@ from typing import TypeVar, Generic -from collections.abc import Callable, Iterator +from collections.abc import Callable from functools import lru_cache Key = TypeVar('Key') Value = TypeVar('Value') -_MISSING = object() class DeferredDict(dict, Generic[Key, Value]): @@ -26,73 +25,18 @@ class DeferredDict(dict, Generic[Key, Value]): """ def __init__(self, *args, **kwargs) -> None: dict.__init__(self) - if args or kwargs: - self.update(*args, **kwargs) + self.update(*args, **kwargs) def __setitem__(self, key: Key, value: Callable[[], Value]) -> None: - """ - Set a value, which must be a callable that returns the actual value. - The result of the callable is cached after the first access. - """ - if not callable(value): - raise TypeError(f"DeferredDict value must be callable, got {type(value)}") cached_fn = lru_cache(maxsize=1)(value) dict.__setitem__(self, key, cached_fn) def __getitem__(self, key: Key) -> Value: return dict.__getitem__(self, key)() - def get(self, key: Key, default: Value | None = None) -> Value | None: - if key not in self: - return default - return self[key] - - def setdefault(self, key: Key, default: Value | Callable[[], Value] | None = None) -> Value | None: - if key in self: - return self[key] - if callable(default): - self[key] = default - else: - self.set_const(key, default) - return self[key] - - def items(self) -> Iterator[tuple[Key, Value]]: - for key in self.keys(): - yield key, self[key] - - def values(self) -> Iterator[Value]: - for key in self.keys(): - yield self[key] - def update(self, *args, **kwargs) -> None: - """ - Update the DeferredDict. If a value is callable, it is used as a generator. - Otherwise, it is wrapped as a constant. 
- """ for k, v in dict(*args, **kwargs).items(): - if callable(v): - self[k] = v - else: - self.set_const(k, v) - - def pop(self, key: Key, default: Value | object = _MISSING) -> Value: - if key in self: - value = self[key] - dict.pop(self, key) - return value - if default is _MISSING: - raise KeyError(key) - return default # type: ignore[return-value] - - def popitem(self) -> tuple[Key, Value]: - key, value_func = dict.popitem(self) - return key, value_func() - - def copy(self) -> 'DeferredDict[Key, Value]': - new = DeferredDict[Key, Value]() - for key in self.keys(): - dict.__setitem__(new, key, dict.__getitem__(self, key)) - return new + self[k] = v def __repr__(self) -> str: return '' @@ -102,4 +46,4 @@ class DeferredDict(dict, Generic[Key, Value]): Convenience function to avoid having to manually wrap constant values into callables. """ - self[key] = lambda v=value: v + self[key] = lambda: value diff --git a/masque/utils/pack2d.py b/masque/utils/pack2d.py index 248f408..ce6b006 100644 --- a/masque/utils/pack2d.py +++ b/masque/utils/pack2d.py @@ -60,12 +60,6 @@ def maxrects_bssf( degenerate = (min_more & max_less).any(axis=0) regions = regions[~degenerate] - if regions.shape[0] == 0: - if allow_rejects: - rejected_inds.add(rect_ind) - continue - raise MasqueError(f'Failed to find a suitable location for rectangle {rect_ind}') - ''' Place the rect ''' # Best short-side fit (bssf) to pick a region region_sizes = regions[:, 2:] - regions[:, :2] @@ -108,7 +102,7 @@ def maxrects_bssf( if presort: unsort_order = rect_order.argsort() rect_locs = rect_locs[unsort_order] - rejected_inds = {int(rect_order[ii]) for ii in rejected_inds} + rejected_inds = set(unsort_order[list(rejected_inds)]) return rect_locs, rejected_inds @@ -193,7 +187,7 @@ def guillotine_bssf_sas( if presort: unsort_order = rect_order.argsort() rect_locs = rect_locs[unsort_order] - rejected_inds = {int(rect_order[ii]) for ii in rejected_inds} + rejected_inds = set(unsort_order[list(rejected_inds)]) 
return rect_locs, rejected_inds @@ -242,9 +236,7 @@ def pack_patterns( locations, reject_inds = packer(sizes, containers, presort=presort, allow_rejects=allow_rejects) pat = Pattern() - for ii, (pp, oo, loc) in enumerate(zip(patterns, offsets, locations, strict=True)): - if ii in reject_inds: - continue + for pp, oo, loc in zip(patterns, offsets, locations, strict=True): pat.ref(pp, offset=oo + loc) rejects = [patterns[ii] for ii in reject_inds] diff --git a/masque/utils/ports2data.py b/masque/utils/ports2data.py index 44a0ec3..b67fa0a 100644 --- a/masque/utils/ports2data.py +++ b/masque/utils/ports2data.py @@ -57,9 +57,11 @@ def data_to_ports( name: str | None = None, # Note: name optional, but arg order different from read(postprocess=) max_depth: int = 0, skip_subcells: bool = True, - visited: set[int] | None = None, + # TODO missing ok? ) -> Pattern: """ + # TODO fixup documentation in ports2data + # TODO move to utils.file? Examine `pattern` for labels specifying port info, and use that info to fill out its `ports` attribute. @@ -68,30 +70,18 @@ def data_to_ports( Args: layers: Search for labels on all the given layers. - library: Mapping from pattern names to patterns. pattern: Pattern object to scan for labels. - name: Name of the pattern object. - max_depth: Maximum hierarcy depth to search. Default 0. + max_depth: Maximum hierarcy depth to search. Default 999_999. Reduce this to 0 to avoid ever searching subcells. skip_subcells: If port labels are found at a given hierarcy level, do not continue searching at deeper levels. This allows subcells to contain their own port info without interfering with supercells' port data. Default True. - visited: Set of object IDs which have already been processed. Returns: The updated `pattern`. Port labels are not removed. """ - if visited is None: - visited = set() - - # Note: visited uses id(pattern) to detect cycles and avoid redundant processing. 
- # This may not catch identical patterns if they are loaded as separate object instances. - if id(pattern) in visited: - return pattern - visited.add(id(pattern)) - if pattern.ports: logger.warning(f'Pattern {name if name else pattern} already had ports, skipping data_to_ports') return pattern @@ -109,20 +99,18 @@ def data_to_ports( if target is None: continue pp = data_to_ports( - layers = layers, - library = library, - pattern = library[target], - name = target, - max_depth = max_depth - 1, - skip_subcells = skip_subcells, - visited = visited, + layers=layers, + library=library, + pattern=library[target], + name=target, + max_depth=max_depth - 1, + skip_subcells=skip_subcells, ) found_ports |= bool(pp.ports) if not found_ports: return pattern - imported_ports: dict[str, Port] = {} for target, refs in pattern.refs.items(): if target is None: continue @@ -134,14 +122,9 @@ def data_to_ports( if not aa.ports: break - if ref.repetition is not None: - logger.warning(f'Pattern {name if name else pattern} has repeated ref to {target!r}; ' - 'data_to_ports() is importing only the base instance ports') aa.apply_ref_transform(ref) - Pattern(ports={**pattern.ports, **imported_ports}).check_ports(other_names=aa.ports.keys()) - imported_ports.update(aa.ports) - - pattern.ports.update(imported_ports) + pattern.check_ports(other_names=aa.ports.keys()) + pattern.ports.update(aa.ports) return pattern @@ -177,24 +160,13 @@ def data_to_ports_flat( local_ports = {} for label in labels: - if ':' not in label.string: - logger.warning(f'Invalid port label "{label.string}" in pattern "{pstr}" (missing ":")') - continue - - name, property_string = label.string.split(':', 1) - properties = property_string.split() - ptype = properties[0] if len(properties) > 0 else 'unk' - if len(properties) > 1: - try: - angle_deg = float(properties[1]) - except ValueError: - logger.warning(f'Invalid port label "{label.string}" in pattern "{pstr}" (bad angle)') - continue - else: - angle_deg = numpy.inf + 
name, property_string = label.string.split(':') + properties = property_string.split(' ') + ptype = properties[0] + angle_deg = float(properties[1]) if len(ptype) else 0 xy = label.offset - angle = numpy.deg2rad(angle_deg) if numpy.isfinite(angle_deg) else None + angle = numpy.deg2rad(angle_deg) if name in local_ports: logger.warning(f'Duplicate port "{name}" in pattern "{pstr}"') @@ -203,3 +175,4 @@ def data_to_ports_flat( pattern.ports.update(local_ports) return pattern + diff --git a/masque/utils/transform.py b/masque/utils/transform.py index 7b39122..dfb6492 100644 --- a/masque/utils/transform.py +++ b/masque/utils/transform.py @@ -28,9 +28,8 @@ def rotation_matrix_2d(theta: float) -> NDArray[numpy.float64]: arr = numpy.array([[numpy.cos(theta), -numpy.sin(theta)], [numpy.sin(theta), +numpy.cos(theta)]]) - # If this was a manhattan rotation, round to remove some inaccuracies in sin & cos - # cos(4*theta) is 1 for any multiple of pi/2. - if numpy.isclose(numpy.cos(4 * theta), 1, atol=1e-12): + # If this was a manhattan rotation, round to remove some inacuraccies in sin & cos + if numpy.isclose(theta % (pi / 2), 0): arr = numpy.round(arr) arr.flags.writeable = False @@ -50,10 +49,7 @@ def normalize_mirror(mirrored: Sequence[bool]) -> tuple[bool, float]: `angle_to_rotate` in radians """ - if len(mirrored) != 2: - raise ValueError(f'mirrored must be a 2-item sequence, got length {len(mirrored)}') - - mirrored_x, mirrored_y = (bool(value) for value in mirrored) + mirrored_x, mirrored_y = mirrored mirror_x = (mirrored_x != mirrored_y) # XOR angle = numpy.pi if mirrored_y else 0 return mirror_x, angle @@ -90,55 +86,37 @@ def apply_transforms( Apply a set of transforms (`outer`) to a second set (`inner`). This is used to find the "absolute" transform for nested `Ref`s. - The two transforms should be of shape Ox5 and Ix5. - Rows should be of the form `(x_offset, y_offset, rotation_ccw_rad, mirror_across_x, scale)`. 
- The output will be of the form (O*I)x5 (if `tensor=False`) or OxIx5 (`tensor=True`). + The two transforms should be of shape Ox4 and Ix4. + Rows should be of the form `(x_offset, y_offset, rotation_ccw_rad, mirror_across_x)`. + The output will be of the form (O*I)x4 (if `tensor=False`) or OxIx4 (`tensor=True`). Args: - outer: Transforms for the container refs. Shape Ox5. - inner: Transforms for the contained refs. Shape Ix5. - tensor: If `True`, an OxIx5 array is returned, with `result[oo, ii, :]` corresponding + outer: Transforms for the container refs. Shape Ox4. + inner: Transforms for the contained refs. Shape Ix4. + tensor: If `True`, an OxIx4 array is returned, with `result[oo, ii, :]` corresponding to the `oo`th `outer` transform applied to the `ii`th inner transform. - If `False` (default), this is concatenated into `(O*I)x5` to allow simple + If `False` (default), this is concatenated into `(O*I)x4` to allow simple chaining into additional `apply_transforms()` calls. Returns: - OxIx5 or (O*I)x5 array. Final dimension is - `(total_x, total_y, total_rotation_ccw_rad, net_mirrored_x, total_scale)`. + OxIx4 or (O*I)x4 array. Final dimension is + `(total_x, total_y, total_rotation_ccw_rad, net_mirrored_x)`. 
""" outer = numpy.atleast_2d(outer).astype(float, copy=False) inner = numpy.atleast_2d(inner).astype(float, copy=False) - if outer.shape[1] == 4: - outer = numpy.pad(outer, ((0, 0), (0, 1)), constant_values=1.0) - if inner.shape[1] == 4: - inner = numpy.pad(inner, ((0, 0), (0, 1)), constant_values=1.0) - - if outer.shape[0] == 0 or inner.shape[0] == 0: - if tensor: - return numpy.empty((outer.shape[0], inner.shape[0], 5)) - return numpy.empty((0, 5)) - # If mirrored, flip y's xy_mir = numpy.tile(inner[:, :2], (outer.shape[0], 1, 1)) # dims are outer, inner, xyrm xy_mir[outer[:, 3].astype(bool), :, 1] *= -1 - # Apply outer scale to inner offset - xy_mir *= outer[:, None, 4, None] - rot_mats = [rotation_matrix_2d(angle) for angle in outer[:, 2]] xy = numpy.einsum('ort,oit->oir', rot_mats, xy_mir) - tot = numpy.empty((outer.shape[0], inner.shape[0], 5)) + tot = numpy.empty((outer.shape[0], inner.shape[0], 4)) tot[:, :, :2] = outer[:, None, :2] + xy - - # If mirrored, flip inner rotation - mirrored_outer = outer[:, None, 3].astype(bool) - rotations = outer[:, None, 2] + numpy.where(mirrored_outer, -inner[None, :, 2], inner[None, :, 2]) - - tot[:, :, 2] = rotations % (2 * pi) - tot[:, :, 3] = (outer[:, None, 3] + inner[None, :, 3]) % 2 # net mirrored - tot[:, :, 4] = outer[:, None, 4] * inner[None, :, 4] # net scale + tot[:, :, 2:] = outer[:, None, 2:] + inner[None, :, 2:] # sum rotations and mirrored + tot[:, :, 2] %= 2 * pi # clamp rot + tot[:, :, 3] %= 2 # clamp mirrored if tensor: return tot diff --git a/masque/utils/vertices.py b/masque/utils/vertices.py index 5a5df9f..5fddd52 100644 --- a/masque/utils/vertices.py +++ b/masque/utils/vertices.py @@ -18,23 +18,13 @@ def remove_duplicate_vertices(vertices: ArrayLike, closed_path: bool = True) -> `vertices` with no consecutive duplicates. This may be a view into the original array. 
""" vertices = numpy.asarray(vertices) - if vertices.shape[0] <= 1: - return vertices duplicates = (vertices == numpy.roll(vertices, -1, axis=0)).all(axis=1) if not closed_path: duplicates[-1] = False - - result = vertices[~duplicates] - if result.shape[0] == 0 and vertices.shape[0] > 0: - return vertices[:1] - return result + return vertices[~duplicates] -def remove_colinear_vertices( - vertices: ArrayLike, - closed_path: bool = True, - preserve_uturns: bool = False, - ) -> NDArray[numpy.float64]: +def remove_colinear_vertices(vertices: ArrayLike, closed_path: bool = True) -> NDArray[numpy.float64]: """ Given a list of vertices, remove any superflous vertices (i.e. those which lie along the line formed by their neighbors) @@ -43,40 +33,24 @@ def remove_colinear_vertices( vertices: Nx2 ndarray of vertices closed_path: If `True`, the vertices are assumed to represent an implicitly closed path. If `False`, the path is assumed to be open. Default `True`. - preserve_uturns: If `True`, colinear vertices that correspond to a 180 degree - turn (a "spike") are preserved. Default `False`. Returns: `vertices` with colinear (superflous) vertices removed. May be a view into the original array. """ - vertices = remove_duplicate_vertices(vertices, closed_path=closed_path) + vertices = remove_duplicate_vertices(vertices) # Check for dx0/dy0 == dx1/dy1 - dv = numpy.roll(vertices, -1, axis=0) - vertices - if not closed_path: - dv[-1] = 0 - # dxdy[i] is based on dv[i] and dv[i-1] - # slopes_equal[i] refers to vertex i - dxdy = dv * numpy.roll(dv, 1, axis=0)[:, ::-1] + dv = numpy.roll(vertices, -1, axis=0) - vertices # [y1-y0, y2-y1, ...] 
+    dxdy = dv * numpy.roll(dv, 1, axis=0)[:, ::-1]      # [[dx0*(dy_-1), (dx_-1)*dy0], dx1*dy0, dy1*dx0]]
 
     dxdy_diff = numpy.abs(numpy.diff(dxdy, axis=1))[:, 0]
     err_mult = 2 * numpy.abs(dxdy).sum(axis=1) + 1e-40
     slopes_equal = (dxdy_diff / err_mult) < 1e-15
-
-    if preserve_uturns:
-        # Only merge if segments are in the same direction (avoid collapsing u-turns)
-        dot_prod = (dv * numpy.roll(dv, 1, axis=0)).sum(axis=1)
-        slopes_equal &= (dot_prod > 0)
-
     if not closed_path:
         slopes_equal[[0, -1]] = False
 
-    if slopes_equal.all() and vertices.shape[0] > 0:
-        # All colinear, keep the first and last
-        return vertices[[0, vertices.shape[0] - 1]]
-
     return vertices[~slopes_equal]
 
 
@@ -84,7 +58,7 @@ def poly_contains_points(
         vertices: ArrayLike,
         points: ArrayLike,
         include_boundary: bool = True,
-        ) -> NDArray[numpy.bool_]:
+        ) -> NDArray[numpy.int_]:
     """
     Tests whether the provided points are inside the implicitly closed polygon
     described by the provided list of vertices.
@@ -103,13 +77,13 @@ def poly_contains_points(
     vertices = numpy.asarray(vertices, dtype=float)
 
     if points.size == 0:
-        return numpy.zeros(0, dtype=bool)
+        return numpy.zeros(0, dtype=numpy.int8)
 
     min_bounds = numpy.min(vertices, axis=0)[None, :]
     max_bounds = numpy.max(vertices, axis=0)[None, :]
 
     trivially_outside = ((points < min_bounds).any(axis=1)
-                         | (points > max_bounds).any(axis=1))
+                        | (points > max_bounds).any(axis=1))      # noqa: E128
 
     nontrivial = ~trivially_outside
     if trivially_outside.all():
@@ -127,10 +101,10 @@ def poly_contains_points(
     dv = numpy.roll(verts, -1, axis=0) - verts
 
     is_left = (dv[:, 0] * (ntpts[..., 1] - verts[:, 1])    # >0 if left of dv, <0 if right, 0 if on the line
-               - dv[:, 1] * (ntpts[..., 0] - verts[:, 0]))
+             - dv[:, 1] * (ntpts[..., 0] - verts[:, 0]))   # noqa: E128
 
     winding_number = ((upward & (is_left > 0)).sum(axis=0)
-                      - (downward & (is_left < 0)).sum(axis=0))
+                    - (downward & (is_left < 0)).sum(axis=0))    # noqa: E128
 
     nontrivial_inside = winding_number != 0    # filter nontrivial points based on winding number
 
     if include_boundary:
diff --git a/pyproject.toml b/pyproject.toml
index ad03fa9..9a29065 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,3 +1,7 @@
+[build-system]
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
 [project]
 name = "masque"
 description = "Lithography mask library"
@@ -42,40 +46,17 @@ dependencies = [
     "klamath~=1.4",
     ]
 
-[dependency-groups]
-dev = [
-    "masque[arrow]",
-    "masque[oasis]",
-    "masque[dxf]",
-    "masque[svg]",
-    "masque[visualize]",
-    "masque[text]",
-    "masque[manhattanize]",
-    "masque[manhattanize_slow]",
-    "masque[boolean]",
-    "matplotlib>=3.10.8",
-    "pytest>=9.0.2",
-    "ruff>=0.15.5",
-    "mypy>=1.19.1",
-    ]
-
-[build-system]
-requires = ["hatchling"]
-build-backend = "hatchling.build"
 
 [tool.hatch.version]
 path = "masque/__init__.py"
 
 [project.optional-dependencies]
-arrow = ["pyarrow", "cffi"]
 oasis = ["fatamorgana~=0.11"]
-dxf = ["ezdxf~=1.4"]
+dxf = ["ezdxf~=1.0.2"]
 svg = ["svgwrite"]
 visualize = ["matplotlib"]
 text = ["matplotlib", "freetype-py"]
-manhattanize = ["scikit-image"]
-manhattanize_slow = ["float_raster"]
-boolean = ["pyclipper"]
+manhatanize_slow = ["float_raster"]
 
 
 [tool.ruff]
@@ -106,20 +87,10 @@ lint.ignore = [
     "PLR09",     # Too many xxx
     "PLR2004",   # magic number
     "PLC0414",   # import x as x
-#   "PLC0415",   # non-top-level import
-    "PLW1641",   # missing __hash__ with total_ordering
     "TRY003",    # Long exception message
     ]
 
 
 [tool.pytest.ini_options]
 addopts = "-rsXx"
 testpaths = ["masque"]
-filterwarnings = [
-    "ignore::DeprecationWarning:ezdxf.*",
-    ]
 
-[tool.mypy]
-mypy_path = "stubs"
-python_version = "3.11"
-strict = false
-check_untyped_defs = true
diff --git a/stubs/ezdxf/__init__.pyi b/stubs/ezdxf/__init__.pyi
deleted file mode 100644
index f25475f..0000000
--- a/stubs/ezdxf/__init__.pyi
+++ /dev/null
@@ -1,13 +0,0 @@
-from typing import Any, TextIO
-from collections.abc import Iterable
-from .layouts import Modelspace, BlockRecords
-
-class Drawing:
-    blocks: BlockRecords
-    @property
-    def layers(self) -> Iterable[Any]: ...
-    def modelspace(self) -> Modelspace: ...
-    def write(self, stream: TextIO) -> None: ...
-
-def new(version: str = ..., setup: bool = ...) -> Drawing: ...
-def read(stream: TextIO) -> Drawing: ...
diff --git a/stubs/ezdxf/entities.pyi b/stubs/ezdxf/entities.pyi
deleted file mode 100644
index 2c6efa9..0000000
--- a/stubs/ezdxf/entities.pyi
+++ /dev/null
@@ -1,18 +0,0 @@
-from typing import Any
-from collections.abc import Iterable, Sequence
-
-class DXFEntity:
-    def dxfattribs(self) -> dict[str, Any]: ...
-    def dxftype(self) -> str: ...
-
-class LWPolyline(DXFEntity):
-    def get_points(self) -> Iterable[tuple[float, ...]]: ...
-
-class Polyline(DXFEntity):
-    def points(self) -> Iterable[Any]: ...    # has .xyz
-
-class Text(DXFEntity):
-    def get_placement(self) -> tuple[int, tuple[float, float, float]]: ...
-    def set_placement(self, p: Sequence[float], align: int = ...) -> Text: ...
-
-class Insert(DXFEntity): ...
diff --git a/stubs/ezdxf/enums.pyi b/stubs/ezdxf/enums.pyi
deleted file mode 100644
index 0dcf600..0000000
--- a/stubs/ezdxf/enums.pyi
+++ /dev/null
@@ -1,4 +0,0 @@
-from enum import IntEnum
-
-class TextEntityAlignment(IntEnum):
-    BOTTOM_LEFT = ...
diff --git a/stubs/ezdxf/layouts.pyi b/stubs/ezdxf/layouts.pyi
deleted file mode 100644
index c9d12ad..0000000
--- a/stubs/ezdxf/layouts.pyi
+++ /dev/null
@@ -1,21 +0,0 @@
-from typing import Any
-from collections.abc import Iterator, Sequence, Iterable
-from .entities import DXFEntity
-
-class BaseLayout:
-    def __iter__(self) -> Iterator[DXFEntity]: ...
-    def add_lwpolyline(self, points: Iterable[Sequence[float]], dxfattribs: dict[str, Any] = ...) -> Any: ...
-    def add_text(self, text: str, dxfattribs: dict[str, Any] = ...) -> Any: ...
-    def add_blockref(self, name: str, insert: Any, dxfattribs: dict[str, Any] = ...) -> Any: ...
-
-class Modelspace(BaseLayout):
-    @property
-    def name(self) -> str: ...
-
-class BlockLayout(BaseLayout):
-    @property
-    def name(self) -> str: ...
-
-class BlockRecords:
-    def new(self, name: str) -> BlockLayout: ...
-    def __iter__(self) -> Iterator[BlockLayout]: ...
diff --git a/stubs/pyclipper/__init__.pyi b/stubs/pyclipper/__init__.pyi
deleted file mode 100644
index 08d77c8..0000000
--- a/stubs/pyclipper/__init__.pyi
+++ /dev/null
@@ -1,46 +0,0 @@
-from typing import Any
-from collections.abc import Iterable, Sequence
-import numpy
-from numpy.typing import NDArray
-
-
-# Basic types for Clipper integer coordinates
-Path = Sequence[tuple[int, int]]
-Paths = Sequence[Path]
-
-# Types for input/output floating point coordinates
-FloatPoint = tuple[float, float] | NDArray[numpy.floating]
-FloatPath = Sequence[FloatPoint] | NDArray[numpy.floating]
-FloatPaths = Iterable[FloatPath]
-
-# Constants
-PT_SUBJECT: int
-PT_CLIP: int
-
-PT_UNION: int
-PT_INTERSECTION: int
-PT_DIFFERENCE: int
-PT_XOR: int
-
-PFT_EVENODD: int
-PFT_NONZERO: int
-PFT_POSITIVE: int
-PFT_NEGATIVE: int
-
-# Scaling functions
-def scale_to_clipper(paths: FloatPaths, scale: float = ...) -> Paths: ...
-def scale_from_clipper(paths: Path | Paths, scale: float = ...) -> Any: ...
-
-class PolyNode:
-    Contour: Path
-    Childs: list[PolyNode]
-    Parent: PolyNode
-    IsHole: bool
-
-class Pyclipper:
-    def __init__(self) -> None: ...
-    def AddPath(self, path: Path, poly_type: int, closed: bool) -> None: ...
-    def AddPaths(self, paths: Paths, poly_type: int, closed: bool) -> None: ...
-    def Execute(self, clip_type: int, subj_fill_type: int = ..., clip_fill_type: int = ...) -> Paths: ...
-    def Execute2(self, clip_type: int, subj_fill_type: int = ..., clip_fill_type: int = ...) -> PolyNode: ...
-    def Clear(self) -> None: ...
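As an aside on the `remove_colinear_vertices` hunk above: the `+` side reduces the function to the cross-product slope test, dropping the `preserve_uturns` and all-colinear special cases. The idea can be sketched standalone; `drop_colinear` is a hypothetical helper, not masque's implementation, and it uses a plain `numpy.isclose` tolerance instead of the relative-error check (`err_mult`) in the real code:

```python
import numpy

def drop_colinear(vertices, closed_path=True):
    """Drop vertices that lie on the line through their neighbors.

    Vertex i is colinear when the segments entering and leaving it are
    parallel, i.e. the 2D cross product dv[i-1] x dv[i] is zero.
    """
    vertices = numpy.asarray(vertices, dtype=float)
    dv = numpy.roll(vertices, -1, axis=0) - vertices    # dv[i] = v[i+1] - v[i] (wraps at the end)
    dv_prev = numpy.roll(dv, 1, axis=0)                 # dv[i-1], the segment entering vertex i
    cross = dv_prev[:, 0] * dv[:, 1] - dv_prev[:, 1] * dv[:, 0]
    colinear = numpy.isclose(cross, 0)
    if not closed_path:
        colinear[[0, -1]] = False    # endpoints of an open path are always kept
    return vertices[~colinear]

# A square traced with one extra (colinear) vertex along the bottom edge
square_with_midpoint = [[0, 0], [1, 0], [2, 0], [2, 2], [0, 2]]
print(drop_colinear(square_with_midpoint))    # (1, 0) is removed; the four corners remain
```

Comparing `dx0 * dy1` against `dx1 * dy0` avoids any division by `dx`, so vertical segments need no special-casing.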