Compare commits

...

286 commits

Author SHA1 Message Date
27e1c23c33 [svg] avoid mutating the original library 2026-04-20 20:51:47 -07:00
5c1050b0ff [arrow] add lazy arrow reader 2026-04-20 20:51:47 -07:00
37d462525c [shapes] move to per-shape purpose-built _from_raw constructors 2026-04-20 20:51:47 -07:00
3f63599abe [gdsii_arrow] more performance work 2026-04-20 20:51:47 -07:00
27f4f0e86e [Polygon / PolyCollection] add raw constructors 2026-04-20 20:51:47 -07:00
ec78031565 [RectCollection] add a RectCollection shape 2026-04-20 20:51:47 -07:00
7130d26112 enable annotations=None by default 2026-04-20 20:51:47 -07:00
28562f73f6 [gdsii_arrow] further improvements to speed 2026-04-20 20:51:47 -07:00
d387066228 [Label / Ref / Grid] add raw constructors 2026-04-20 20:51:47 -07:00
9df42000b7 [gdsii] add some profiling helpers 2026-04-20 20:51:47 -07:00
09fec67a21 [gdsii_arrow] misc correctness work 2026-04-20 20:51:47 -07:00
18f12792c4 add some missing deps 2026-04-20 20:51:47 -07:00
bd0c8c9d16 [gdsii_arrow] add gdsii_arrow 2026-04-20 20:51:47 -07:00
950d144ead [AutoTool] rework two-L routing to avoid some bugs with incorrect transitions 2026-04-17 20:41:37 -07:00
d95ddbb6b9 [Arc] return clearer errors when working with an invalid arclength 2026-04-16 19:39:42 -07:00
bdc4dfdd06 [Pather] fix using trees when append=True 2026-04-09 16:32:25 -07:00
6cf9600193 [PortList.measure_travel] add a convenience wrapper for measuring internal travel 2026-04-09 11:41:21 -07:00
e6f5136357 [PathTool] add native S-bend 2026-04-08 23:48:48 -07:00
778b3d9be7 [Builder / RenderPather] BREAKING remove aliases to old names 2026-04-08 23:08:26 -07:00
02f0833fb3 [pather] handle paths without existing ports 2026-04-08 22:33:07 -07:00
47f150f579 [curves.euler] clean up nearly-duplicate points 2026-04-08 18:07:54 -07:00
84106dc355 [set_dead] improve handling of dead ports 2026-04-08 17:41:50 -07:00
429e687666 [docs] add migration guide 2026-04-06 15:38:03 -07:00
c501a8ff99 [referenced_patterns] don't visit tops twice 2026-04-06 15:30:37 -07:00
fd2698c503 [docs / examples] Update docs and examples 2026-04-02 12:19:51 -07:00
8100d8095a [Pather] improve bounds handling for bundles 2026-04-02 12:18:03 -07:00
2c5243237e [Pather] rework pather internals -- split route planning vs strategy selection 2026-04-02 11:34:49 -07:00
cf0a245143 [dxf] ignore unreferenced internal dxf blocks 2026-04-02 10:09:38 -07:00
bbe3586ba9 [Pather] fix trace_into() for straight connections 2026-04-02 09:55:27 -07:00
e071bd89b0 [tests] clean up some over-specific tests 2026-04-02 00:40:18 -07:00
06ed2ce54a [Pather] avoid repeated resolve and non-atomic breaks 2026-04-02 00:11:26 -07:00
0f2b4d713b [Pattern] make plug/place atomic wrt. annotation conflicts 2026-04-01 23:34:40 -07:00
524503031c [ILibrary / LazyLibrary] allow mapping a name to itself 2026-04-01 22:59:18 -07:00
ce7bf5ce70 [ILibrary / LazyLibrary] raise a LibraryError instead of KeyError 2026-04-01 22:58:30 -07:00
f0a4b08a31 [PortList] find_transform requires a non-empty connection map 2026-04-01 22:49:35 -07:00
b3a1489258 [PortPather] complain if the user gives ambiguous port names 2026-04-01 22:47:22 -07:00
d366db5a62 [utils.transform] better input validation in normalize_mirror and apply_transform 2026-04-01 21:59:27 -07:00
20f37ea0f7 [ell] validate spacing length 2026-04-01 21:58:56 -07:00
6fd73b9d46 [ell] fix crash when ccw=None but spacing is non-scalar 2026-04-01 21:58:21 -07:00
32744512e0 [boolean] more work towards getting boolean ops working 2026-04-01 21:28:33 -07:00
75a9114709 [bezier] validate weights 2026-04-01 21:16:03 -07:00
df578d7764 [PolyCollection] copy vertex offsets when making normalized form 2026-04-01 21:15:44 -07:00
786716fc62 [preflight] document that preflight doesn't copy the library 2026-04-01 20:58:10 -07:00
a82365ec8c [svg] fix duplicate svg ids 2026-04-01 20:57:35 -07:00
28be89f047 [gdsii] make sure iterable is supported 2026-04-01 20:56:59 -07:00
afc49f945d [DeferredDict] add setdefault(), pop(), popitem(), copy() 2026-04-01 20:14:53 -07:00
ce46cc18dc [tmpfile] delete the temporary file if an error occurs 2026-04-01 20:12:24 -07:00
7c50f95fde [ILibrary] update docs for add() 2026-04-01 20:00:46 -07:00
ae314cce93 [ILibraryView] child_order shouldn't leak graphlib.CycleError 2026-04-01 19:59:59 -07:00
09a95a6608 [ILibraryView] fix assignment during dfs() 2026-04-01 19:57:29 -07:00
fbe138d443 [data_to_ports] warn on invalid angle 2026-04-01 19:22:16 -07:00
4b416745da [repetition.Grid] check for invalid displacements or counts 2026-04-01 19:21:47 -07:00
0830dce50c [data_to_ports] don't leave the pattern dirty if we error out part-way 2026-04-01 19:10:50 -07:00
ac87179da2 [data_to_ports] warn that repetitions are not expanded 2026-04-01 19:01:47 -07:00
f0eea0382b [Ref] get_bounds_single should ignore repetition 2026-04-01 19:00:59 -07:00
0c9b435e94 [PortList.check_ports] Check for duplicate map_in/map_out values 2026-04-01 19:00:19 -07:00
f461222852 [ILibrary.add] respect mutate_other=False even without duplicate keys 2026-04-01 18:58:01 -07:00
9767ee4e62 [Pattern] improve atomicity of place(), plug(), interface() 2026-03-31 23:00:35 -07:00
395ad4df9d [PortList] plugged() failure shouldn't dirty the ports 2026-03-31 22:38:27 -07:00
35b42c397b [Pattern.append] don't dirty pattern if append() fails 2026-03-31 22:37:16 -07:00
6a7b3b2259 [PorList] Error if multiple ports are renamed to the same name 2026-03-31 22:15:48 -07:00
8d50f497f1 [repetition / others] copies should get their own repetitions 2026-03-31 22:15:21 -07:00
2176d56b4c [Arc] Error out on zero radius 2026-03-31 22:03:42 -07:00
f1e25debec [Ellipse] force radii to float dtype 2026-03-31 22:03:19 -07:00
4b07bb9e25 [OASIS] raise PatternError for unsupported caps 2026-03-31 21:42:49 -07:00
2952e6ef8f [Arc / Ellipse / Circle] gracefully handle large arclen 2026-03-31 21:42:16 -07:00
c303a0c114 [Ellipse / Arc] improve bounds calculation 2026-03-31 21:41:49 -07:00
f34b9b2f5c [Text] fixup bounds and normalized form 2026-03-31 21:22:35 -07:00
89cdd23f00 [Arc / Ellipse] make radii hashable 2026-03-31 21:22:15 -07:00
620b001af5 [ILibrary] fix dedup messing up rotations 2026-03-31 21:21:16 -07:00
46a3559391 [dxf] fix dxf repetition load 2026-03-31 21:19:29 -07:00
08421d6a54 [OASIS] repeated property keys should be merged, not overwritten 2026-03-31 19:00:38 -07:00
462a05a665 [Library] fix dedup()
- use consistent deduplicated target name
- remove shape indices per dedup
2026-03-31 18:58:37 -07:00
2b29e46b93 [Pather] fix port rename/deletion tracking 2026-03-31 18:49:41 -07:00
2e0b64bdab [Ref / Label] make equality safe for unrelated types 2026-03-31 17:51:02 -07:00
20c845a881 [Tool] avoid passing port_names down 2026-03-31 17:12:41 -07:00
707a16fe64 [RenderStep] fix mirroring a planned path 2026-03-31 17:11:26 -07:00
932565d531 [Repetition] fix ordering 2026-03-31 17:10:19 -07:00
e7f847d4c7 [Pather] make two-L path planning atomic (don't error out with only one half drawn) 2026-03-31 09:28:48 -07:00
3beadd2bf0 [Path] preserve cap extensions in normalized form, and scale them with scale() 2026-03-31 09:24:22 -07:00
1bcf5901d6 [Path] preserve width from normalized form 2026-03-31 09:23:19 -07:00
56e401196a [PathTool] fix pathtool L-shape 2026-03-31 00:25:45 -07:00
83ec64158a [AutoTool] fix exact s-bend validation 2026-03-31 00:24:52 -07:00
aa7007881f [pack2d] bin-packing fixes 2026-03-31 00:16:58 -07:00
d03fafcaf6 [ILibraryView] don't fail on nested dangling ref 2026-03-30 23:34:31 -07:00
d3be6aeba3 [PortList] add_port_pair requires unique port names 2026-03-30 23:33:33 -07:00
ffbe15c465 [Port / PortList] raise PortError on missing port name 2026-03-30 23:32:50 -07:00
b44c962e07 [Pattern] improve error handling in place() 2026-03-30 22:11:50 -07:00
20bd0640e1 [Library] improve handling of dangling refs 2026-03-30 22:10:26 -07:00
4ae8115139 [DeferredDict] implement get/items/values for deferreddict 2026-03-30 22:07:21 -07:00
c2ef3e4217 [test] data_to_ports should accurately preserve ports from a scaled ref 2026-03-30 21:19:10 -07:00
c32168dc64 [ILibraryView / Pattern] flatten() should raise PatternError if asked to preserve ports from a repeated ref 2026-03-30 21:17:33 -07:00
b843ffb4d3 [ILibraryView / Pattern] flatten() shouldn't drop ports-only patterns if flatten_ports=True 2026-03-30 21:12:20 -07:00
9adfcac437 [Ref] don't shadow ref property 2026-03-30 21:07:13 -07:00
26cc0290b9 [Abstract] respect ref scale 2026-03-30 21:06:51 -07:00
548b51df47 [Port] fix printing of None rotation 2026-03-30 20:25:45 -07:00
06f8611a90 [svg] fix rotation in svg 2026-03-30 20:24:24 -07:00
9ede16df5d [dxf] fix reading Polyline 2026-03-30 20:22:40 -07:00
add82e955d update dev deps 2026-03-30 19:39:25 -07:00
cfec9e8c76 [euler_bend] speed up integration 2026-03-10 00:47:50 -07:00
2275bf415a [Pattern] improve error message when attempting to reference a Pattern 2026-03-10 00:31:58 -07:00
fa3dfa1e74 [Pattern] improve clarity of .copy()->.deepcopy() 2026-03-10 00:31:11 -07:00
75dc391540 [pack2d] don't place rejects 2026-03-10 00:29:51 -07:00
feb5d87cf4 [repetition.Arbitrary] fix zero-sized bounds 2026-03-10 00:29:10 -07:00
5f91bd9c6c [BREAKING][Ref / Label / Pattern] Make rotate/mirror consistent intrinsic transformations
offset and repetition are extrinsic; use rotate_around() and flip() to
alter both
mirror() and rotate() only affect the object's intrinsic properties
2026-03-09 23:34:39 -07:00
db22237369 [PathCap] clean up comment 2026-03-09 11:20:04 -07:00
a6ea5c08e6 [repetition.Grid] drop b_vector=None handling (guaranteed to be zeros now) 2026-03-09 11:19:42 -07:00
3792248cd1 [dxf] improve dxf reader (ezdxf 1.4 related LWPolyLine changes) 2026-03-09 11:16:30 -07:00
e8083cc24c [dxf] hide ezdxf warnings directly 2026-03-09 03:37:42 -07:00
d307589995 [ports2data] add note about using id rather than name 2026-03-09 03:29:19 -07:00
ea93a7ef37 [remove_colinear_vertices / Path] add preserve_uturns and use it for paths 2026-03-09 03:28:31 -07:00
495babf837 [Path] revert endcap changes to avoid double-counting 2026-03-09 03:27:39 -07:00
5d20a061fd [Path / Polygon] improve normalized_form approach to follow documented order 2026-03-09 02:42:13 -07:00
25b8fe8448 [Path.to_polygons] Use linalg.solve() where possible; fallback to lstsq if singular 2026-03-09 02:41:15 -07:00
f154303bef [remove_colinear_vertices] treat unclosed paths correctly 2026-03-09 02:38:33 -07:00
5596e2b1af [tests] cover scale-aware transform 2026-03-09 02:35:35 -07:00
6c42049b23 [PortList] actually raise the error 2026-03-09 02:34:57 -07:00
da20922224 [apply_transform] include scale in transform 2026-03-09 02:34:11 -07:00
b8ee4bb05d [ell] fix set_rotation check 2026-03-09 02:32:20 -07:00
169f66cc85 [rotation_matrix_2d] improve manhattan angle detection
modulo causes issues with negative numbers
2026-03-09 01:16:54 -07:00
a38c5bb085 [ports2data] deal with cycles better 2026-03-09 01:15:42 -07:00
0ad89d6d95 [DeferredDict] capture value in set_const 2026-03-09 01:10:26 -07:00
6c96968341 [Path] improve robustness of intersection calculations 2026-03-09 01:09:37 -07:00
b7143e3287 [repetition.Grid] fix __le__ comparison of b_vector 2026-03-09 01:08:35 -07:00
0cce5e0586 [Ref] misc copy fixes -- don't deepcopy repetition or annotations in __copy__ 2026-03-09 01:07:50 -07:00
36cb86a15d [tests] clean unused imports 2026-03-09 00:20:29 -07:00
5e0936e15f [dxf] update ezdxf dep 2026-03-09 00:18:06 -07:00
a467a0baca [Path] simplify conditional 2026-03-09 00:17:50 -07:00
564ff10db3 [dxf] add roundtrip dxf test, enable refs and improve path handling 2026-03-09 00:17:23 -07:00
e261585894 [gdsii] Try to close files if able 2026-03-08 23:09:45 -07:00
f42114bf43 [gdsii] explicitly cast cap_extensions to int 2026-03-08 22:47:22 -07:00
5eb460ecb7 [repetition.Grid] disallow b_vector=None (except when initializing) 2026-03-08 22:43:58 -07:00
fb822829ec [Polygon] rect() should call rectangle() with positive width/height
no big deal, but this makes vertex order consistent
2026-03-08 22:42:48 -07:00
838c742651 [Path] Improve comparisons: compare vertices 2026-03-08 22:41:37 -07:00
9a76ce5b66 [Path] cap_extensions=None should mean [0, 0] when using custom extensions 2026-03-08 22:41:11 -07:00
2019fc0d74 [Path] Circular cap extensions should translate to square, not empty 2026-03-08 22:40:08 -07:00
e3f8d28529 [Path] improve __lt__ for endcaps 2026-03-08 22:37:30 -07:00
9296011d4b [Ref] deepcopy annotations and repetitions 2026-03-08 22:34:39 -07:00
92d0140093 [Pattern] fix pattern comparisons 2026-03-08 22:33:59 -07:00
c4dc9f9573 [oasis] comment and code cleanup 2026-03-08 22:32:16 -07:00
0b8e11e8bf [dxf] improve manhattan check robustness 2026-03-08 22:31:18 -07:00
5989e45906 [apply_transforms] fix handling of rotations while mirrored 2026-03-08 21:38:47 -07:00
7eec2b7acf [LazyLibrary] report full cycle when one is detected 2026-03-08 21:18:54 -07:00
2a6458b1ac [repetitions.Arbitrary] reassign to displacements when scaling or mirroring to trigger re-sort 2026-03-08 20:43:33 -07:00
9ee3c7ff89 [ILibrary] make referenced_patterns more robust to cyclical dependencies 2026-03-08 20:01:00 -07:00
3bedab2301 [ports2data] Make port label parsing more robust 2026-03-08 19:58:56 -07:00
4eb1d8d486 [gdsii] fix missing paren in message 2026-03-08 19:57:49 -07:00
3ceeba23b8 [tests] move imports into functions 2026-03-08 19:00:20 -07:00
963103b859 [Pattern / Library] add resolve_repeated_refs 2026-03-08 15:15:53 -07:00
e5a6aab940 [dxf] improve repetition handling 2026-03-08 15:15:28 -07:00
042941c838 [DeferredDict] improve handling of constants 2026-03-08 15:05:08 -07:00
0f63acbad0 [AutoSlots] deduplicate slots entries 2026-03-08 15:01:27 -07:00
a0d7d0ed26 [annotations] fix annotations_eq
-e
2026-03-08 14:56:13 -07:00
d32a5ee762 [dxf] fix typos 2026-03-08 14:53:28 -07:00
19dafad157 [remove_duplicate_vertices] improve handling of degenerate shapes 2026-03-08 10:24:25 -07:00
5cb608734d [poly_contains_points] consistently return boolean arrays 2026-03-08 10:17:52 -07:00
d0b48e6bfc [tests] fix some tests 2026-03-08 10:15:09 -07:00
ef5c8c715e [Pather] add auto_render_append arg 2026-03-08 10:12:43 -07:00
049864ddc7 [manhattanize_fast] Improve handling of grids smaller than the shape 2026-03-08 10:10:46 -07:00
3bf7efc404 [Polygon] fix offset error messages 2026-03-08 09:48:03 -07:00
74fa377450 [repetition.Arbitrary] fix equality check 2026-03-08 09:47:50 -07:00
c3581243c8 [Pather] Major pathing rework / Consolidate RenderPather, Pather, and Builder 2026-03-08 00:18:47 -08:00
338c123fb1 [pattern] speed up visualize() 2026-03-07 23:57:12 -08:00
a89f07c441 [Port] add describe() for logging 2026-03-07 23:36:14 -08:00
bb7f4906af [ILibrary] add .resolve() 2026-03-07 23:35:47 -08:00
2513c7f8fd [pattern.visualize] cleanup 2026-03-07 10:32:41 -08:00
ad4e9af59d [svg] add annotate_ports arg 2026-03-07 10:32:22 -08:00
46555dbd4d [pattern.visualize] add options for file output and port visualization 2026-03-07 10:22:54 -08:00
26e6a44559 [readme] clean up todos 2026-03-07 00:48:50 -08:00
32681edb47 [tests] fixup tests related to pather api changes 2026-03-07 00:48:22 -08:00
84f37195ad [Pather / RenderPather / Tool] Rename path->trace in more locations 2026-03-07 00:33:18 -08:00
0189756df4 [Pather/RenderPather] Add U-bend to trace_into 2026-03-07 00:03:07 -08:00
1070815730 [AutoTool] add U-bend 2026-03-06 23:51:56 -08:00
8a45c6d8d6 [tutorial] update pather and renderpather tutorials to new syntax 2026-03-06 23:31:44 -08:00
9d6fb985d8 [Pather/RenderPather/PathTool] Add updated pather tests 2026-03-06 23:09:59 -08:00
69ac25078c [Pather/RenderPather/Tool/PortPather] Add U-bends 2026-03-06 22:58:32 -08:00
babbe78daa [Pather/RenderPather/PortPather] Rework pathing verbs *BREAKING CHANGE* 2026-03-06 22:58:03 -08:00
16875e9cd6 [RenderPather / PathTool] Improve support for port transformations
So that moving a port while in the middle of planning a path doesn't
break everything
2026-03-06 13:07:06 -08:00
4332cf14c0 [ezdxf] add stubs 2026-02-16 20:48:26 -08:00
ff8ca92963 cleanup 2026-02-16 20:48:15 -08:00
ed021e3d81 [Pattern] fix mirror_elements and change arg name to axis 2026-02-16 19:23:08 -08:00
07a25ec290 [Mirrorable / Flippable] clarify docs 2026-02-16 18:53:31 -08:00
504f89796c Add ruff and mypy to dev deps 2026-02-16 18:08:40 -08:00
0f49924aa6 Add ezdxf stubs 2026-02-16 18:04:16 -08:00
ebfe1b559c misc cleanup (mostly type-related) 2026-02-16 17:58:34 -08:00
7ad59d6b89 [boolean] Add basic boolean functionality (boolean() and Polygon.boolean()) 2026-02-16 17:42:19 -08:00
5d040061f4 [set_dead] improve docs 2026-02-16 13:57:16 -08:00
f42e720c68 [set_dead / skip_geometry] Improve dead pathers so more "broken" layouts can be successfully executed 2026-02-16 13:44:56 -08:00
cf822c7dcf [Port] add more logging to aid in debug 2026-02-16 12:23:40 -08:00
59e996e680 [tutorial] include a repetition and update docs 2026-02-15 20:05:38 -08:00
abf236a046 [mirror / flip_across] improve documentation 2026-02-15 19:46:47 -08:00
d40bdb1cb2 add 'dev' dependency group and 'manhattanize' optional dep 2026-02-15 19:23:02 -08:00
5e08579498 [tests] add round-trip file tests 2026-02-15 16:44:17 -08:00
c18e5b8d3e [OASIS] cleanup 2026-02-15 16:43:46 -08:00
48f7569c1f [traits] Formalize Flippable and Pivotable depending on Positionable 2026-02-15 14:34:10 -08:00
8a56679884 Clean up types/imports 2026-02-15 12:40:47 -08:00
1cce6c1f70 [Tests] cleanup 2026-02-15 12:36:13 -08:00
d9adb4e1b9 [Tools] fixup imports 2026-02-15 12:35:58 -08:00
1de76bff47 [tests] Add machine-generated test suite 2026-02-15 01:41:31 -08:00
9bb0d5190d [Arc] improve some edge cases when calculating arclengths 2026-02-15 01:37:53 -08:00
ad49276345 [Arc] improve bounding box edge cases 2026-02-15 01:35:43 -08:00
fe70d0574b [Arc] Improve handling of full rings 2026-02-15 01:34:56 -08:00
36fed84249 [PolyCollection] fix slicing 2026-02-15 01:31:15 -08:00
278f0783da [PolyCollection] gracefully handle empty PolyCollections 2026-02-15 01:26:06 -08:00
72f462d077 [AutoTool] Enable running AutoTool without any bends in the list 2026-02-15 01:18:21 -08:00
66d6fae2bd [AutoTool] Fix error handling for ccw=None 2026-02-15 01:15:07 -08:00
2b7ad00204 [Port] add custom __deepcopy__ 2026-02-15 00:57:47 -08:00
2d63e72802 fixup! [Mirrorable / Flippable] Bifurcate mirror into flip (relative to line) vs mirror (relative to own offset/origin) 2026-02-15 00:49:34 -08:00
51ced2fe83 [Text] use translate instead of offset 2026-02-15 00:07:43 -08:00
19fac463e4 [Shape] fix annotation 2026-02-15 00:07:27 -08:00
44986bac67 [Mirrorable / Flippable] Bifurcate mirror into flip (relative to line) vs mirror (relative to own offset/origin) 2026-02-15 00:05:53 -08:00
accad3db9f Prefer [1 - axis] for clarity 2026-02-14 19:20:50 -08:00
05098c0c13 [remove_colinear_vertices] keep two vertices if all were colinear 2026-02-14 19:15:54 -08:00
f64b080b15 [repetition.Arbitrary] fix mirroring 2026-02-14 19:10:01 -08:00
54f3b273bc [Label] don't drop annotations when copying 2026-02-14 18:53:23 -08:00
add0600bac [RenderPather] warn about unrendered paths on deletion 2026-02-14 17:13:22 -08:00
737d41d592 [examples] expand port_pather tutorial 2026-02-14 17:06:29 -08:00
395244ee83 [examples] some cleanup 2026-02-14 16:58:24 -08:00
43ccd8de2f [examples] type annotations 2026-02-14 16:57:34 -08:00
dfa0259997 [examples] clean up imports 2026-02-14 16:57:11 -08:00
37418d2137 [examples] fixup examples and add port_pather example 2026-02-14 16:07:19 -08:00
d8702af5b9 misc doc updates 2026-02-01 15:04:34 -08:00
49e3917a6e [remove_duplicate_vertices] remove the last vertex rather than the first
to better match docs
2026-01-19 22:20:09 -08:00
48034b7a30 [Polygon.rect] raise a PatternError when given the wrong number of args
instead of assert
2026-01-19 22:20:09 -08:00
28e2864ce1 [dxf] make sure layer tuple contents are ints 2026-01-19 22:20:09 -08:00
ba2bc2b444 [dxf] don't need to add polygon offset since it's zero 2026-01-19 22:20:09 -08:00
05b73066ea update TODO in readme 2026-01-19 22:20:09 -08:00
f7138ee8e4 [PortPather] generalize to multi-port functions where possible 2026-01-19 22:20:09 -08:00
dca0df940b [Pattern] use 1-axis instead of axis-1 2026-01-19 22:20:09 -08:00
c18249c4d5 [AutoTool] S-bend to L-bend fallback does not work yet, should throw an error 2026-01-19 22:20:09 -08:00
fc963cfbfc [Pather / RenderPather] Fix handling of jog polarity 2026-01-19 22:20:09 -08:00
c366add952 [Port] mirror() should not mirror port position, only orientation 2026-01-19 22:20:09 -08:00
84629ea614 minor readme cleanup 2026-01-19 22:20:09 -08:00
fdd776f4d7 [RenderPather.plug] fix ok_connections param 2026-01-19 22:20:09 -08:00
0f8078127d cleanup 2026-01-19 22:20:09 -08:00
48ccf3e148 [PortPather] add some more port-related convenience functions 2026-01-19 22:20:09 -08:00
88a3d261aa [AutoTool / SimpleTool] allow choice between rotating or mirroring bends 2026-01-19 22:20:09 -08:00
c1c83afc98 [Library.flatten] add dangling_ok param 2026-01-19 22:20:09 -08:00
f8a82336f6 [PatherMixin] add thru arg to path_into and rework portlist inheritance 2026-01-19 22:20:09 -08:00
62a030dd14 [PortPather] add rename_to and rename_from 2026-01-19 22:20:09 -08:00
7ca3dd5b09 [PatherMixin] add at() for generating PortPather 2026-01-19 22:20:09 -08:00
d46be245c6 add missing float_raster dep for manhattanize_slow 2026-01-19 22:20:09 -08:00
54cddaddd9 [PortPather] add PortPather 2026-01-19 22:20:09 -08:00
1c7ee9bef4 [RenderPather] whitespace 2026-01-19 22:20:09 -08:00
7c928a59fa [plug()] rename inherit_name arg to thru and allow passing a string
Breaking change

Affects Pattern, Builder, Pather, RenderPather
2026-01-19 22:20:09 -08:00
bc8c0ee580 add some whitespace 2026-01-19 22:20:09 -08:00
d3216c680c [Tool / AutoTool / Pather / RenderPather / PatherMixin] add support for S-bends 2026-01-19 22:20:09 -08:00
3593b4aec7 [Port] add Port.measure_travel() 2026-01-19 22:20:09 -08:00
e8e630bb2f [Tool / Pather] fix some doc typos 2026-01-19 22:20:09 -08:00
feba7c699d [Tool / AutoTool] clarify some docstrings 2026-01-19 22:20:09 -07:00
70559308a1 [Pather] clarify a variable name 2026-01-19 22:20:09 -08:00
021142533d [AutoTool] enable S-bends 2026-01-19 22:20:09 -08:00
b1c838c8fd [AutoTool / SimpleTool] remove append arg 2026-01-19 22:20:09 -08:00
7c033edc21 [SimpleTool/AutoTool] clarify some error messages 2026-01-19 22:20:09 -08:00
601773d17e [AutoTool/SimpleTool/BasicTool] Rename BasicTool->SimpleTool and remove transition handling. Export AutoTool and SimpleTool at top level. 2026-01-19 22:20:09 -08:00
cf92cc06b3 [AutoTool] pass in kwargs to straight fn call 2026-01-19 22:20:09 -08:00
fcb622441b [AutoTool] consolidate duplicate code for path() and render() 2026-01-19 22:20:09 -08:00
2e57724095 [AutoTool] add add_complementary_transitions() 2026-01-19 22:20:09 -08:00
a69860ad9c [AutoTool] Use more dataclasses to clarify internal code 2026-01-19 22:20:09 -08:00
82f3e7ab8f [PolyCollection] rename setter arg to placate linter 2026-01-19 22:20:09 -08:00
a308b1515a [format_stacktrace] suppress linter 2026-01-19 22:20:09 -08:00
38d4c4b6af [AutoTool] support min/max length for straight segments 2026-01-19 22:20:09 -08:00
4822ae8708 [BasicTool/AutoTool] fix port orientation for straight segments when using RenderPather 2026-01-19 22:20:09 -08:00
d00899bb39 [AutoTool] Add first pass for AutoTool 2026-01-19 22:20:09 -08:00
a908fadfc3 [RenderPather] add wrapped label/ref/polygon/rect functions 2026-01-19 22:20:09 -08:00
92875cfdb6 [Pather/RenderPather/PatherMixin] clean up imports 2026-01-19 22:20:09 -08:00
11306dbb56 [Pather / RenderPather] move common functionality into PatherMixin; redo hierarchy
- (BREAKING change) Pather.mpath no longer wraps the whole bus into a
container, since this has no equivalent in RenderPather. Possible this
functionality will return in the future
- Removed `tool_port_names` arg from Pather functions
- In general RenderPather should be much closer to Pather now
2026-01-19 22:20:09 -08:00
fc9d4c6ba2 [pather] code style changes 2026-01-19 22:20:09 -08:00
8fa1d0479c [error] also exclude concurrent.futures.process from traces 2026-01-19 22:20:09 -08:00
a4b93419b4 [error] also exclude frames starting with '<frozen' 2026-01-19 22:20:09 -08:00
549193534f [file.svg] use logger.warning over warnings.warn (for flexibility) 2026-01-19 22:20:09 -08:00
6a494b99a0 [ports] make port mismatch deltas more obvious 2026-01-19 22:20:09 -08:00
705a1cef78 [error, ports] Make stack traces more directly reflect the location of the issue 2026-01-19 22:20:09 -07:00
b8ab3b91f5 misc cleanup: variable naming, typing, comments 2026-01-19 22:20:09 -08:00
34a43a707c [Path / PolyCollection / Polygon] fix order of rotation/offset 2026-01-19 22:20:09 -08:00
aca49dc7e3 [Polygon / Path / PolyCollection] Force polygon/path offset to (0, 0)
And disallow setting it.

This offset was basically just a footgun.
2026-01-19 22:20:09 -08:00
e231fa89cb [Polygon.rect] use floats more explicitly 2026-01-19 22:20:09 -08:00
0c04bf8ea3 Various type-checking improvements 2026-01-19 22:20:09 -08:00
e5f0c85560 [BasicTool] enable straight to handle trees (not just flat patterns) 2026-01-19 22:20:09 -08:00
2961ae5471 [builder.tools] Handle in_ptype=None 2026-01-19 22:20:09 -08:00
ba05e40f84 [traits.annotatable] Don't break when setting annotations to None 2026-01-19 22:20:09 -08:00
314910d363 [shapes] Don't create empty dicts for annotations 2026-01-19 22:20:09 -08:00
fbe804750b [PolyCollection] add PolyCollection shape
based on ndarrays of vertices and offsets
2026-01-19 22:20:09 -08:00
aee0d5b619 [utils.curves] ignore re-import of trapezoid 2026-01-19 22:20:09 -07:00
e00d82bbc4 allow annotations to be None
breaking change, but properties are seldom used by anyone afaik
2026-01-19 22:20:09 -08:00
4d74eea253 [file.gdsii] attributes may have key=126 2025-10-12 23:34:39 -07:00
108 changed files with 15388 additions and 3046 deletions

MIGRATION.md Normal file

@@ -0,0 +1,299 @@
# Migration Guide
This guide covers changes between the git tag `release` and the current tree.
At `release`, `masque.__version__` was `3.3`; the current tree reports `3.4`.
Most downstream changes are in `masque/builder/*`, but there are a few other
API changes that may require code updates.
## Routing API: renamed and consolidated
The routing helpers were consolidated into a single implementation in
`masque/builder/pather.py`.
The biggest migration point is that the old routing verbs were renamed:
| Old API | New API |
| --- | --- |
| `Pather.path(...)` | `Pather.trace(...)` |
| `Pather.path_to(...)` | `Pather.trace_to(...)` |
| `Pather.mpath(...)` | `Pather.trace(...)` / `Pather.trace_to(...)` with multiple ports |
| `Pather.pathS(...)` | `Pather.jog(...)` |
| `Pather.pathU(...)` | `Pather.uturn(...)` |
| `Pather.path_into(...)` | `Pather.trace_into(...)` |
| `Pather.path_from(src, dst)` | `Pather.at(src).trace_into(dst)` |
| `RenderPather.path(...)` | `Pather(..., auto_render=False).trace(...)` |
| `RenderPather.path_to(...)` | `Pather(..., auto_render=False).trace_to(...)` |
| `RenderPather.mpath(...)` | `Pather(..., auto_render=False).trace(...)` / `Pather(..., auto_render=False).trace_to(...)` |
| `RenderPather.pathS(...)` | `Pather(..., auto_render=False).jog(...)` |
| `RenderPather.pathU(...)` | `Pather(..., auto_render=False).uturn(...)` |
| `RenderPather.path_into(...)` | `Pather(..., auto_render=False).trace_into(...)` |
| `RenderPather.path_from(src, dst)` | `Pather(..., auto_render=False).at(src).trace_into(dst)` |
There are also new convenience wrappers:
- `straight(...)` for `trace_to(..., ccw=None, ...)`
- `ccw(...)` for `trace_to(..., ccw=True, ...)`
- `cw(...)` for `trace_to(..., ccw=False, ...)`
- `jog(...)` for S-bends
- `uturn(...)` for U-bends
Important: `Pather.path()` is no longer the routing API. It now forwards to
`Pattern.path()` and creates a geometric `Path` element. Any old routing code
that still calls `pather.path(...)` must be renamed.
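Because old call sites won't raise an error (they now draw geometry instead of routing), it can help to audit for them before upgrading. A quick regex sketch, not part of masque; it assumes your pather variables have `pather` somewhere in their names:

```python
import re

# Match old routing verbs called on a *pather*-named variable, e.g.
# "pather.path(", "rpather.mpath(", "my_pather.pathU("
OLD_VERBS = re.compile(
    r'\b\w*pather\w*\.(path|path_to|mpath|pathS|pathU|path_into|path_from)\('
)

def find_old_calls(source: str) -> list[tuple[int, str]]:
    """Return (line_number, line) pairs that still use old routing verbs."""
    return [
        (num, line.strip())
        for num, line in enumerate(source.splitlines(), start=1)
        if OLD_VERBS.search(line)
    ]
```

Run it over each `.py` file and rename the hits according to the table above.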
### Common rewrites
```python
# old
pather.path('VCC', False, 6_000)
pather.path_to('VCC', None, x=0)
pather.mpath(['GND', 'VCC'], True, xmax=-10_000, spacing=5_000)
pather.pathS('VCC', offset=-2_000, length=8_000)
pather.pathU('VCC', offset=4_000, length=5_000)
pather.path_into('src', 'dst')
pather.path_from('src', 'dst')
# new
pather.cw('VCC', 6_000)
pather.straight('VCC', x=0)
pather.ccw(['GND', 'VCC'], xmax=-10_000, spacing=5_000)
pather.jog('VCC', offset=-2_000, length=8_000)
pather.uturn('VCC', offset=4_000, length=5_000)
pather.trace_into('src', 'dst')
pather.at('src').trace_into('dst')
```
If you prefer the more explicit spelling, `trace(...)` and `trace_to(...)`
remain the underlying primitives:
```python
pather.trace('VCC', False, 6_000)
pather.trace_to('VCC', None, x=0)
```
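If renaming every call site at once is impractical, a thin forwarding wrapper can keep old call sites working during a staged migration. A hypothetical sketch (the `shim` helper below is not part of masque; it just applies the verb mapping from the table above):

```python
# Old routing verb -> new routing verb, per the rename table above.
_RENAMED_VERBS = {
    'path': 'trace',
    'path_to': 'trace_to',
    'mpath': 'trace',
    'pathS': 'jog',
    'pathU': 'uturn',
    'path_into': 'trace_into',
}

def shim(pather):
    """Wrap a pather so the old verb names still resolve.
    Hypothetical migration aid, not part of masque."""
    class _Shim:
        def __getattr__(self, name):
            # Renamed verbs forward to their new spellings; everything
            # else (ports, plug, place, ...) passes through unchanged.
            return getattr(pather, _RENAMED_VERBS.get(name, name))
    return _Shim()
```

Note that while shimmed, `.path(...)` means routing again, so the new geometric `Pather.path()` is unreachable through the wrapper; drop the shim once the renames are done.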
## `PortPather` and `.at(...)`
Routing can now be written in a fluent style via `.at(...)`, which returns a
`PortPather`.
```python
(rpather.at('VCC')
.trace(False, length=6_000)
.trace_to(None, x=0)
)
```
This is additive, not required for migration. Existing code can stay with the
non-fluent `Pather` methods after renaming the verbs above.
Old `PortPather` helper names were also cleaned up:
| Old API | New API |
| --- | --- |
| `save_copy(...)` | `mark(...)` |
| `rename_to(...)` | `rename(...)` |
Example:
```python
# old
pp.save_copy('branch')
pp.rename_to('feed')
# new
pp.mark('branch')
pp.rename('feed')
```
## Imports and module layout
`Pather` now provides the remaining builder/routing surface in
`masque/builder/pather.py`. The old module files
`masque/builder/builder.py` and `masque/builder/renderpather.py` were removed.
Update imports like this:
```python
# old
from masque.builder.builder import Builder
from masque.builder.renderpather import RenderPather
# new
from masque.builder import Pather
builder = Pather(...)
deferred = Pather(..., auto_render=False)
```
Top-level imports from `masque` also continue to work.
`Pather` now defaults to `auto_render=True`, so plain construction replaces the
old `Builder` behavior. Use `Pather(..., auto_render=False)` where you
previously used `RenderPather`.
## `BasicTool` was replaced
`BasicTool` is no longer exported. Use:
- `SimpleTool` for the simple "one straight generator + one bend cell" case
- `AutoTool` if you need transitions, multiple candidate straights/bends, or
S-bends/U-bends
### Old `BasicTool`
```python
from masque.builder.tools import BasicTool
tool = BasicTool(
straight=(make_straight, 'input', 'output'),
bend=(lib.abstract('bend'), 'input', 'output'),
transitions={
'm2wire': (lib.abstract('via'), 'top', 'bottom'),
},
)
```
### New `AutoTool`
```python
from masque.builder.tools import AutoTool
tool = AutoTool(
straights=[
AutoTool.Straight(
ptype='m1wire',
fn=make_straight,
in_port_name='input',
out_port_name='output',
),
],
bends=[
AutoTool.Bend(
abstract=lib.abstract('bend'),
in_port_name='input',
out_port_name='output',
clockwise=True,
),
],
sbends=[],
transitions={
('m2wire', 'm1wire'): AutoTool.Transition(
lib.abstract('via'),
'top',
'bottom',
),
},
default_out_ptype='m1wire',
)
```
The key differences are:
- `BasicTool` -> `SimpleTool` or `AutoTool`
- `straight=(fn, in_name, out_name)` -> `straights=[AutoTool.Straight(...)]`
- `bend=(abstract, in_name, out_name)` -> `bends=[AutoTool.Bend(...)]`
- transition keys are now `(external_ptype, internal_ptype)` tuples
- transitions use `AutoTool.Transition(...)` instead of raw tuples
If your old `BasicTool` usage did not rely on transitions or multiple routing
options, `SimpleTool` is the closest replacement.
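The transition re-keying can be done mechanically. A sketch of a hypothetical helper (`upgrade_transition_keys` is not part of masque) that pairs each old external-ptype key with the ptype of the tool's own straights, matching the `('m2wire', 'm1wire')` key above; the values still need wrapping in `AutoTool.Transition(...)` afterwards:

```python
def upgrade_transition_keys(old_transitions, internal_ptype):
    """Re-key an old BasicTool-style transitions dict ({external_ptype: ...})
    into the new {(external_ptype, internal_ptype): ...} form.

    `internal_ptype` is the ptype the tool's own straights use
    (e.g. 'm1wire' in the example above).
    """
    return {
        (external_ptype, internal_ptype): transition
        for external_ptype, transition in old_transitions.items()
    }
```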
## Custom `Tool` subclasses
If you maintain your own `Tool` subclass, the interface changed:
- `Tool.path(...)` became `Tool.traceL(...)`
- `Tool.traceS(...)` and `Tool.traceU(...)` were added for native S/U routes
- `planL()` / `planS()` / `planU()` remain the planning hooks used by deferred rendering
In practice, a minimal old implementation like:
```python
class MyTool(Tool):
    def path(self, ccw, length, **kwargs):
        ...
```
should now become:
```python
class MyTool(Tool):
    def traceL(self, ccw, length, **kwargs):
        ...
```
If you do not implement `traceS()` or `traceU()`, the unified pather will
either fall back to the planning hooks or synthesize those routes from simpler
steps where possible.
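During an incremental migration you can also keep an old-style tool alive behind a thin adapter. This is a hypothetical shim, not a masque class; it assumes the old `path()` signature matches what `traceL()` receives:

```python
class LegacyToolShim:
    """Hypothetical adapter: expose an old-style `path()` tool through the
    new `traceL()` entry point. Illustration only, not part of masque."""
    def __init__(self, old_tool):
        self._old = old_tool

    def traceL(self, ccw, length, **kwargs):
        # Forward unchanged; the old `path()` already produced an L-route.
        return self._old.path(ccw, length, **kwargs)


class OldTool:
    """Stand-in for a pre-migration tool implementing `path()`."""
    def path(self, ccw, length, **kwargs):
        return ('L', ccw, length)

shim = LegacyToolShim(OldTool())
print(shim.traceL(True, 10.0))  # ('L', True, 10.0)
```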
## Transform semantics changed
The other major user-visible change is that `mirror()` and `rotate()` are now
treated more consistently as intrinsic transforms on low-level objects.
The practical migration rule is:
- use `mirror()` / `rotate()` when you want to change the object relative to its
own origin
- use `flip_across(...)`, `rotate_around(...)`, or container-level transforms
when you want to move the object in its parent coordinate system
### Example: `Port`
Old behavior:
```python
port.mirror(0) # changed both offset and orientation
```
New behavior:
```python
port.mirror(0) # changes orientation only
port.flip_across(axis=0) # old "mirror in the parent pattern" behavior
```
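The distinction is easiest to see with bare coordinates. The sketch below is schematic plain Python, not masque's API (masque's axis conventions and angle handling may differ); it only illustrates "orientation-only" versus "parent-frame" mirroring:

```python
def mirror_orientation_only(offset, rotation):
    # Intrinsic mirror across the x-axis: the object re-orients in place.
    x, y = offset
    return (x, y), -rotation

def flip_across_x(offset, rotation):
    # Parent-frame flip across the x-axis: the offset is reflected as well.
    x, y = offset
    return (x, -y), -rotation

port = ((3.0, 2.0), 1.5)
print(mirror_orientation_only(*port))  # ((3.0, 2.0), -1.5) -- stays put
print(flip_across_x(*port))            # ((3.0, -2.0), -1.5) -- moves in parent frame
```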
### What to audit
Check code that calls:
- `Port.mirror(...)`
- `Ref.rotate(...)`
- `Ref.mirror(...)`
- `Label.rotate_around(...)` / `Label.mirror(...)`
If that code expected offsets or repetition grids to move automatically, it
needs updating. For whole-pattern transforms, prefer calling `Pattern.mirror()`
or `Pattern.rotate_around(...)` at the container level.
## Other user-facing changes
### DXF environments
If you install the DXF extra, the supported `ezdxf` baseline moved from
`~=1.0.2` to `~=1.4`. Any pinned environments should be updated accordingly.
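For projects that pin the extra in `pyproject.toml`, the bump might look like this (illustrative fragment; the table and extra names depend on your own project layout):

```toml
[project.optional-dependencies]
# old: dxf = ["ezdxf~=1.0.2"]
dxf = ["ezdxf~=1.4"]
```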
### New exports
These are additive, but available now from `masque` and `masque.builder`:
- `PortPather`
- `SimpleTool`
- `AutoTool`
- `boolean`
## Minimal migration checklist
If your code uses the routing stack, do these first:
1. Replace `path`/`path_to`/`mpath`/`path_into` calls with
`trace`/`trace_to`/multi-port `trace`/`trace_into`.
2. Replace `BasicTool` with `SimpleTool` or `AutoTool`.
3. Fix imports that still reference `masque.builder.builder` or
`masque.builder.renderpather`.
4. Audit any low-level `mirror()` usage, especially on `Port` and `Ref`.
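Step 3 above is amenable to a mechanical sweep. This is a rough plain-Python sketch, not a supplied migration tool; it only rewrites the two removed module paths and leaves everything else untouched:

```python
import re

# Hypothetical one-off rewriter for the two removed module paths.
_PATTERN = re.compile(r'from masque\.builder\.(?:builder|renderpather) import \w+')

def rewrite_imports(source: str) -> str:
    return _PATTERN.sub('from masque.builder import Pather', source)

print(rewrite_imports('from masque.builder.builder import Builder'))
# from masque.builder import Pather
```

You still need to rename the constructed objects by hand (`Builder(...)` becomes `Pather(...)`, and `RenderPather(...)` becomes `Pather(..., auto_render=False)`).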
If your code only uses `Pattern`, `Library`, `place()`, and `plug()` without the
routing helpers, you may not need any changes beyond the transform audit and any
stale imports.

@ -37,6 +37,55 @@ A layout consists of a hierarchy of `Pattern`s stored in a single `Library`.
Each `Pattern` can contain `Ref`s pointing at other patterns, `Shape`s, `Label`s, and `Port`s.
Library / Pattern hierarchy:
```
+---------------------------------------------------------------------+
|                               Library                               |
|                                                                     |
|  Name: "MyChip"              ...>   Name: "Transistor"              |
|  +---------------------------+  :  +---------------------------+    |
|  | [Pattern]                 |  :  | [Pattern]                 |    |
|  |                           |  :  |                           |    |
|  | shapes: {...}             |  :  | shapes: {                 |    |
|  | ports: {...}              |  :  |   "Si": [<Polygon>, ...]  |    |
|  |                           |  :  |   "M1": [<Polygon>, ...]} |    |
|  | refs:                     |  :  | ports: {G, S, D}          |    |
|  | "Transistor": [Ref, Ref]  |..:  +---------------------------+    |
|  +---------------------------+                                      |
|                                                                     |
|  # (`refs` keys resolve to Patterns within the Library)             |
+---------------------------------------------------------------------+
```
Pattern internals:
```
+----------------------------------------------------------------+
|                           [Pattern]                            |
|                                                                |
| shapes: {                                                      |
|     (1, 0): [Polygon, Circle, ...],   # Geometry by layer      |
|     (2, 0): [Path, ...]                                        |
|     "M1" : [Path, ...]                                         |
|     "M2" : [Polygon, ...]                                      |
| }                                                              |
|                                                                |
| refs: {          # Key sets target name, Ref sets transform    |
|     "my_cell": [                                               |
|         Ref(offset=(0,0), rotation=0),                         |
|         Ref(offset=(10,0), rotation=R90, repetition=Grid(...)) |
|     ]                                                          |
| }                                                              |
|                                                                |
| ports: {                                                       |
|     "in": Port(offset=(0,0), rotation=0, ptype="M1"),          |
|     "out": Port(offset=(10,0), rotation=R180, ptype="wg")      |
| }                                                              |
|                                                                |
+----------------------------------------------------------------+
```
`masque` departs from several "classic" GDSII paradigms:
- A `Pattern` object does not store its own name. A name is only assigned when the pattern is placed
into a `Library`, which is effectively a name->`Pattern` mapping.
@ -96,7 +145,7 @@ References are accomplished by listing the target's name, not its `Pattern` obje
in order to create a reference, but they also need to access the pattern's ports.
* One way to provide this data is through an `Abstract`, generated via
`Library.abstract()` or through a `Library.abstract_view()`.
* Another way is use `Builder.place()` or `Builder.plug()`, which automatically creates
* Another way is to use `Pather.place()` or `Pather.plug()`, which automatically create
an `Abstract` from its internally-referenced `Library`.
@ -133,7 +182,7 @@ tree = make_tree(...)
# To reference this cell in our layout, we have to add all its children to our `library` first:
top_name = tree.top() # get the name of the topcell
name_mapping = library.add(tree) # add all patterns from `tree`, renaming elgible conflicting patterns
name_mapping = library.add(tree) # add all patterns from `tree`, renaming eligible conflicting patterns
new_name = name_mapping.get(top_name, top_name) # get the new name for the cell (in case it was auto-renamed)
my_pattern.ref(new_name, ...) # instantiate the cell
@ -144,8 +193,8 @@ my_pattern.ref(new_name, ...) # instantiate the cell
# In practice, you may do lots of
my_pattern.ref(lib << make_tree(...), ...)
# With a `Builder` and `place()`/`plug()` the `lib <<` portion can be implicit:
my_builder = Builder(library=lib, ...)
# With a `Pather` and `place()`/`plug()` the `lib <<` portion can be implicit:
my_builder = Pather(library=lib, ...)
...
my_builder.place(make_tree(...))
```
@ -176,7 +225,7 @@ my_pattern.place(library << make_tree(...), ...)
### Quickly add geometry, labels, or refs:
The long form for adding elements can be overly verbose:
Adding elements can be overly verbose:
```python3
my_pattern.shapes[layer].append(Polygon(vertices, ...))
my_pattern.labels[layer] += [Label('my text')]
@ -228,9 +277,6 @@ my_pattern.ref(_make_my_subpattern(), offset=..., ...)
## TODO
* Better interface for polygon operations (e.g. with `pyclipper`)
- de-embedding
- boolean ops
* Tests tests tests
* check renderpather
* pather and renderpather examples
* PolyCollection & arrow-based read/write
* Bus-to-bus connections?
* tuple / string layer auto-translation

@ -6,7 +6,7 @@ from masque.file import gdsii
from masque import Arc, Pattern
def main():
def main() -> None:
pat = Pattern()
layer = (0, 0)
pat.shapes[layer].extend([

@ -0,0 +1,5 @@
from masque.file.gdsii_perf import main

if __name__ == '__main__':
    raise SystemExit(main())

@ -1,7 +1,5 @@
import numpy
from pyclipper import (
Pyclipper, PT_CLIP, PT_SUBJECT, CT_UNION, CT_INTERSECTION, PFT_NONZERO,
scale_to_clipper, scale_from_clipper,
Pyclipper, PT_SUBJECT, CT_UNION, PFT_NONZERO,
)
p = Pyclipper()
p.AddPaths([
@ -12,8 +10,8 @@ p.AddPaths([
], PT_SUBJECT, closed=True)
#p.Execute2?
#p.Execute?
p.Execute(PT_UNION, PT_NONZERO, PT_NONZERO)
p.Execute(CT_UNION, PT_NONZERO, PT_NONZERO)
p.Execute(CT_UNION, PFT_NONZERO, PFT_NONZERO)
p.Execute(CT_UNION, PFT_NONZERO, PFT_NONZERO)
p.Execute(CT_UNION, PFT_NONZERO, PFT_NONZERO)
p = Pyclipper()

@ -0,0 +1,131 @@
from __future__ import annotations

import argparse
import importlib
import json
import time
from pathlib import Path
from typing import Any

from masque import LibraryError


READERS: dict[str, tuple[str, tuple[str, ...]]] = {
    'gdsii': ('masque.file.gdsii', ('readfile',)),
    'gdsii_arrow': ('masque.file.gdsii_arrow', ('readfile', 'arrow_import', 'arrow_convert')),
}


def _summarize_library(path: Path, elapsed_s: float, info: dict[str, object], lib: object) -> dict[str, object]:
    assert hasattr(lib, '__len__')
    assert hasattr(lib, 'tops')
    tops = lib.tops()       # type: ignore[no-any-return, attr-defined]
    try:
        unique_top = lib.top()      # type: ignore[no-any-return, attr-defined]
    except LibraryError:
        unique_top = None
    return {
        'path': str(path),
        'elapsed_s': elapsed_s,
        'library_name': info['name'],
        'cell_count': len(lib),     # type: ignore[arg-type]
        'topcells': tops,
        'topcell': unique_top,
    }


def _summarize_arrow_import(path: Path, elapsed_s: float, arrow_arr: Any) -> dict[str, object]:
    libarr = arrow_arr[0]
    return {
        'path': str(path),
        'elapsed_s': elapsed_s,
        'arrow_rows': len(arrow_arr),
        'library_name': libarr['lib_name'].as_py(),
        'cell_count': len(libarr['cells']),
        'layer_count': len(libarr['layers']),
    }


def _profile_stage(module: Any, stage: str, path: Path) -> dict[str, object]:
    start = time.perf_counter()
    if stage == 'readfile':
        lib, info = module.readfile(path)
        elapsed_s = time.perf_counter() - start
        return _summarize_library(path, elapsed_s, info, lib)
    if stage == 'arrow_import':
        if hasattr(module, 'readfile_arrow'):
            libarr, _info = module.readfile_arrow(path)
            elapsed_s = time.perf_counter() - start
            return {
                'path': str(path),
                'elapsed_s': elapsed_s,
                'arrow_rows': 1,
                'library_name': libarr['lib_name'].as_py(),
                'cell_count': len(libarr['cells']),
                'layer_count': len(libarr['layers']),
            }
        arrow_arr = module._read_to_arrow(path)
        elapsed_s = time.perf_counter() - start
        return _summarize_arrow_import(path, elapsed_s, arrow_arr)
    if stage == 'arrow_convert':
        arrow_arr = module._read_to_arrow(path)
        libarr = arrow_arr[0]
        start = time.perf_counter()
        lib, info = module.read_arrow(libarr)
        elapsed_s = time.perf_counter() - start
        return _summarize_library(path, elapsed_s, info, lib)
    raise ValueError(f'Unsupported stage {stage!r}')


def build_arg_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description='Profile GDS readers with a stable end-to-end workload.')
    parser.add_argument('--reader', choices=sorted(READERS), required=True)
    parser.add_argument('--stage', default='readfile')
    parser.add_argument('--path', type=Path, required=True)
    parser.add_argument('--warmup', type=int, default=1)
    parser.add_argument('--repeat', type=int, default=1)
    parser.add_argument('--output-json', type=Path)
    return parser


def main(argv: list[str] | None = None) -> int:
    parser = build_arg_parser()
    args = parser.parse_args(argv)
    module_name, stages = READERS[args.reader]
    if args.stage not in stages:
        parser.error(f'reader {args.reader!r} only supports stages: {", ".join(stages)}')
    module = importlib.import_module(module_name)
    path = args.path.expanduser().resolve()
    for _ in range(args.warmup):
        _profile_stage(module, args.stage, path)
    runs = []
    for _ in range(args.repeat):
        runs.append(_profile_stage(module, args.stage, path))
    payload = {
        'reader': args.reader,
        'stage': args.stage,
        'warmup': args.warmup,
        'repeat': args.repeat,
        'runs': runs,
    }
    rendered = json.dumps(payload, indent=2, sort_keys=True)
    if args.output_json is not None:
        args.output_json.parent.mkdir(parents=True, exist_ok=True)
        args.output_json.write_text(rendered + '\n')
    print(rendered)
    return 0


if __name__ == '__main__':
    raise SystemExit(main())

@ -11,7 +11,7 @@ from masque.file import gdsii, dxf, oasis
def main():
def main() -> None:
lib = Library()
cell_name = 'ellip_grating'

@ -1,6 +1,12 @@
masque Tutorial
===============
These examples are meant to be read roughly in order.
- Start with `basic_shapes.py` for the core `Pattern` / GDS concepts.
- Then read `devices.py` and `library.py` for hierarchical composition and libraries.
- Read the `pather*` tutorials separately when you want routing helpers.
Contents
--------
@ -8,24 +14,30 @@ Contents
* Draw basic geometry
* Export to GDS
- [devices](devices.py)
* Build hierarchical photonic-crystal example devices
* Reference other patterns
* Add ports to a pattern
* Snap ports together to build a circuit
* Use `Pather` to snap ports together into a circuit
* Check for dangling references
- [library](library.py)
* Continue from `devices.py` using a lazy library
* Create a `LazyLibrary`, which loads / generates patterns only when they are first used
* Explore alternate ways of specifying a pattern for `.plug()` and `.place()`
* Design a pattern which is meant to plug into an existing pattern (via `.interface()`)
- [pather](pather.py)
* Use `Pather` to route individual wires and wire bundles
* Use `BasicTool` to generate paths
* Use `BasicTool` to automatically transition between path types
- [renderpather](rendpather.py)
* Use `RenderPather` and `PathTool` to build a layout similar to the one in [pather](pather.py),
* Use `AutoTool` to generate paths
* Use `AutoTool` to automatically transition between path types
- [renderpather](renderpather.py)
* Use `Pather(auto_render=False)` and `PathTool` to build a layout similar to the one in [pather](pather.py),
but using `Path` shapes instead of `Polygon`s.
- [port_pather](port_pather.py)
* Use `PortPather` and the `.at()` syntax for more concise routing
* Advanced port manipulation and connections
Additionaly, [pcgen](pcgen.py) is a utility module for generating photonic crystal lattices.
Additionally, [pcgen](pcgen.py) is a utility module used by `devices.py` for generating
photonic-crystal lattices; it is support code rather than a step-by-step tutorial.
Running
@ -37,3 +49,6 @@ cd examples/tutorial
python3 basic_shapes.py
klayout -e basic_shapes.gds
```
Some tutorials depend on outputs from earlier ones. In particular, `library.py`
expects `circuit.gds`, which is generated by `devices.py`.

@ -1,12 +1,9 @@
from collections.abc import Sequence
import numpy
from numpy import pi
from masque import (
layer_t, Pattern, Label, Port,
Circle, Arc, Polygon,
)
from masque import layer_t, Pattern, Circle, Arc, Ref
from masque.repetition import Grid
import masque.file.gdsii
@ -39,6 +36,45 @@ def hole(
return pat
def hole_array(
        radius: float,
        num_x: int = 5,
        num_y: int = 3,
        pitch: float = 2000,
        layer: layer_t = (1, 0),
        ) -> tuple[Pattern, Pattern]:
    """
    Generate an array of circular holes using `Repetition`.

    Args:
        radius: Circle radius.
        num_x, num_y: Number of holes in x and y.
        pitch: Center-to-center spacing.
        layer: Layer to draw the holes on.

    Returns:
        Tuple of (pattern containing the grid of hole references,
        single-hole pattern).
    """
    # First, make a pattern for a single hole
    hpat = hole(radius, layer)

    # Now, create a pattern that references it multiple times using a Grid
    pat = Pattern()
    pat.refs['hole'] = [
        Ref(
            offset=(0, 0),
            repetition=Grid(a_vector=(pitch, 0), a_count=num_x,
                            b_vector=(0, pitch), b_count=num_y),
        )]

    # We can also add transformed references (rotation, mirroring, etc.)
    pat.refs['hole'].append(
        Ref(offset=(0, -pitch), rotation=pi / 4, mirrored=True)
        )

    return pat, hpat
def triangle(
radius: float,
layer: layer_t = (1, 0),
@ -60,9 +96,7 @@ def triangle(
]) * radius
pat = Pattern()
pat.shapes[layer].extend([
Polygon(offset=(0, 0), vertices=vertices),
])
pat.polygon(layer, vertices=vertices)
return pat
@ -111,9 +145,13 @@ def main() -> None:
lib['smile'] = smile(1000)
lib['triangle'] = triangle(1000)
# Use a Grid to make many holes efficiently
lib['grid'], lib['hole'] = hole_array(1000)
masque.file.gdsii.writefile(lib, 'basic_shapes.gds', **GDS_OPTS)
lib['triangle'].visualize()
lib['grid'].visualize(lib)
if __name__ == '__main__':

@ -1,11 +1,19 @@
"""
Tutorial: building hierarchical devices with `Pattern`, `Port`, and `Pather`.
This file uses photonic-crystal components as the concrete example, so some of
the geometry-generation code is domain-specific. The tutorial value is in the
Masque patterns around it: creating reusable cells, annotating ports, composing
hierarchy with references, and snapping ports together to build a larger circuit.
"""
from collections.abc import Sequence, Mapping
import numpy
from numpy import pi
from masque import (
layer_t, Pattern, Ref, Label, Builder, Port, Polygon,
Library, ILibraryView,
layer_t, Pattern, Ref, Pather, Port, Polygon,
Library,
)
from masque.utils import ports2data
from masque.file.gdsii import writefile, check_valid_names
@ -64,9 +72,9 @@ def perturbed_l3(
Provided sequence should have same length as `shifts_a`.
xy_size: `(x, y)` number of mirror periods in each direction; total size is
`2 * n + 1` holes in each direction. Default (10, 10).
perturbed_radius: radius of holes perturbed to form an upwards-driected beam
perturbed_radius: radius of holes perturbed to form an upwards-directed beam
(multiplicative factor). Default 1.1.
trench width: Width of the undercut trenches. Default 1200.
trench_width: Width of the undercut trenches. Default 1200.
Returns:
`Pattern` object representing the L3 design.
@ -79,14 +87,15 @@ def perturbed_l3(
shifts_a=shifts_a,
shifts_r=shifts_r)
# Build L3 cavity, using references to the provided hole pattern
# Build the cavity by instancing the supplied `hole` pattern many times.
# Using references keeps the pattern compact even though it contains many holes.
pat = Pattern()
pat.refs[hole] += [
Ref(scale=r, offset=(lattice_constant * x,
lattice_constant * y))
for x, y, r in xyr]
# Add rectangular undercut aids
# Add rectangular undercut aids based on the referenced hole extents.
min_xy, max_xy = pat.get_bounds_nonempty(hole_lib)
trench_dx = max_xy[0] - min_xy[0]
@ -95,7 +104,7 @@ def perturbed_l3(
Polygon.rect(ymax=min_xy[1], xmin=min_xy[0], lx=trench_dx, ly=trench_width),
]
# Ports are at outer extents of the device (with y=0)
# Define the interface in Masque terms: two ports at the left/right extents.
extent = lattice_constant * xy_size[0]
pat.ports = dict(
input=Port((-extent, 0), rotation=0, ptype='pcwg'),
@ -125,17 +134,17 @@ def waveguide(
Returns:
`Pattern` object representing the waveguide.
"""
# Generate hole locations
# Generate the normalized lattice locations for the line defect.
xy = pcgen.waveguide(length=length, num_mirror=mirror_periods)
# Build the pattern
# Build the pattern by placing repeated references to the same hole cell.
pat = Pattern()
pat.refs[hole] += [
Ref(offset=(lattice_constant * x,
lattice_constant * y))
for x, y in xy]
# Ports are at outer edges, with y=0
# Publish the device interface as two ports at the outer edges.
extent = lattice_constant * length / 2
pat.ports = dict(
left=Port((-extent, 0), rotation=0, ptype='pcwg'),
@ -164,17 +173,17 @@ def bend(
`Pattern` object representing the waveguide bend.
Ports are named 'left' (input) and 'right' (output).
"""
# Generate hole locations
# Generate the normalized lattice locations for the bend.
xy = pcgen.wgbend(num_mirror=mirror_periods)
# Build the pattern
pat= Pattern()
# Build the pattern by instancing the shared hole cell.
pat = Pattern()
pat.refs[hole] += [
Ref(offset=(lattice_constant * x,
lattice_constant * y))
for x, y in xy]
# Figure out port locations.
# Publish the bend interface as two ports.
extent = lattice_constant * mirror_periods
pat.ports = dict(
left=Port((-extent, 0), rotation=0, ptype='pcwg'),
@ -203,17 +212,17 @@ def y_splitter(
`Pattern` object representing the y-splitter.
Ports are named 'in', 'top', and 'bottom'.
"""
# Generate hole locations
# Generate the normalized lattice locations for the splitter.
xy = pcgen.y_splitter(num_mirror=mirror_periods)
# Build pattern
# Build the pattern by instancing the shared hole cell.
pat = Pattern()
pat.refs[hole] += [
Ref(offset=(lattice_constant * x,
lattice_constant * y))
for x, y in xy]
# Determine port locations
# Publish the splitter interface as one input and two outputs.
extent = lattice_constant * mirror_periods
pat.ports = {
'in': Port((-extent, 0), rotation=0, ptype='pcwg'),
@ -227,13 +236,13 @@ def y_splitter(
def main(interactive: bool = True) -> None:
# Generate some basic hole patterns
# First make a couple of reusable primitive cells.
shape_lib = {
'smile': basic_shapes.smile(RADIUS),
'hole': basic_shapes.hole(RADIUS),
}
# Build some devices
# Then build a small library of higher-level devices from those primitives.
a = LATTICE_CONSTANT
devices = {}
@ -245,22 +254,23 @@ def main(interactive: bool = True) -> None:
devices['ysplit'] = y_splitter(lattice_constant=a, hole='hole', mirror_periods=5)
devices['l3cav'] = perturbed_l3(lattice_constant=a, hole='smile', hole_lib=shape_lib, xy_size=(4, 10)) # uses smile :)
# Turn our dict of devices into a Library.
# This provides some convenience functions in the future!
# Turn the device mapping into a `Library`.
# That gives us convenience helpers for hierarchy inspection and abstract views.
lib = Library(devices)
#
# Build a circuit
#
# Create a `Builder`, and add the circuit to our library as "my_circuit".
circ = Builder(library=lib, name='my_circuit')
# Create a `Pather`, and register the resulting top cell as "my_circuit".
circ = Pather(library=lib, name='my_circuit')
# Start by placing a waveguide. Call its ports "in" and "signal".
# Start by placing a waveguide and renaming its ports to match the circuit-level
# names we want to use while assembling the design.
circ.place('wg10', offset=(0, 0), port_map={'left': 'in', 'right': 'signal'})
# Extend the signal path by attaching the "left" port of a waveguide.
# Since there is only one other port ("right") on the waveguide we
# are attaching (wg10), it automatically inherits the name "signal".
# Extend the signal path by attaching another waveguide.
# Because `wg10` only has one unattached port left after the plug, Masque can
# infer that it should keep the name `signal`.
circ.plug('wg10', {'signal': 'left'})
# We could have done the following instead:
@ -268,8 +278,8 @@ def main(interactive: bool = True) -> None:
# lib['my_circuit'] = circ_pat
# circ_pat.place(lib.abstract('wg10'), ...)
# circ_pat.plug(lib.abstract('wg10'), ...)
# but `Builder` lets us omit some of the repetition of `lib.abstract(...)`, and uses similar
# syntax to `Pather` and `RenderPather`, which add wire/waveguide routing functionality.
# but `Pather` removes some repeated `lib.abstract(...)` boilerplate and keeps
# the assembly code focused on port-level intent.
# Attach a y-splitter to the signal path.
# Since the y-splitter has 3 ports total, we can't auto-inherit the
@ -281,13 +291,10 @@ def main(interactive: bool = True) -> None:
circ.plug('wg05', {'signal1': 'left'})
circ.plug('wg05', {'signal2': 'left'})
# Add a bend to both ports.
# Our bend's ports "left" and "right" refer to the original counterclockwise
# orientation. We want the bends to turn in opposite directions, so we attach
# the "right" port to "signal1" to bend clockwise, and the "left" port
# to "signal2" to bend counterclockwise.
# We could also use `mirrored=(True, False)` to mirror one of the devices
# and then use same device port on both paths.
# Add a bend to both branches.
# Our bend primitive is defined with a specific orientation, so choosing which
# port to plug determines whether the path turns clockwise or counterclockwise.
# We could also mirror one instance instead of using opposite ports.
circ.plug('bend0', {'signal1': 'right'})
circ.plug('bend0', {'signal2': 'left'})
@ -296,29 +303,26 @@ def main(interactive: bool = True) -> None:
circ.plug('l3cav', {'signal1': 'input'})
circ.plug('wg10', {'signal1': 'left'})
# "signal2" just gets a single of equivalent length
# `signal2` gets a single waveguide of equivalent overall length.
circ.plug('wg28', {'signal2': 'left'})
# Now we bend both waveguides back towards each other
# Now bend both branches back towards each other.
circ.plug('bend0', {'signal1': 'right'})
circ.plug('bend0', {'signal2': 'left'})
circ.plug('wg05', {'signal1': 'left'})
circ.plug('wg05', {'signal2': 'left'})
# To join the waveguides, we attach a second y-junction.
# We plug "signal1" into the "bot" port, and "signal2" into the "top" port.
# The remaining port gets named "signal_out".
# This operation would raise an exception if the ports did not line up
# correctly (i.e. they required different rotations or translations of the
# y-junction device).
# To join the branches, attach a second y-junction.
# This succeeds only if both chosen ports agree on the same translation and
# rotation for the inserted device; otherwise Masque raises an exception.
circ.plug('ysplit', {'signal1': 'bot', 'signal2': 'top'}, {'in': 'signal_out'})
# Finally, add some more waveguide to "signal_out".
circ.plug('wg10', {'signal_out': 'left'})
# We can also add text labels for our circuit's ports.
# They will appear at the uppermost hierarchy level, while the individual
# device ports will appear further down, in their respective cells.
# Bake the top-level port metadata into labels so it survives GDS export.
# These labels appear on the circuit cell; individual child devices keep their
# own port labels in their own cells.
ports_to_data(circ.pattern)
# Check if we forgot to include any patterns... ooops!
@ -330,12 +334,12 @@ def main(interactive: bool = True) -> None:
lib.add(shape_lib)
assert not lib.dangling_refs()
# We can visualize the design. Usually it's easier to just view the GDS.
# We can visualize the design directly, though opening the written GDS is often easier.
if interactive:
print('Visualizing... this step may be slow')
circ.pattern.visualize(lib)
#Write out to GDS, only keeping patterns referenced by our circuit (including itself)
# Write out only the subtree reachable from our top cell.
subtree = lib.subtree('my_circuit') # don't include wg90, which we don't use
check_valid_names(subtree.keys())
writefile(subtree, 'circuit.gds', **GDS_OPTS)

@ -1,23 +1,28 @@
"""
Tutorial: using `LazyLibrary` and `Pather.interface()`.
This example assumes you have already read `devices.py` and generated the
`circuit.gds` file it writes. The goal here is not the photonic-crystal geometry
itself, but rather how Masque lets you mix lazily loaded GDS content with
python-generated devices inside one library.
"""
from typing import Any
from collections.abc import Sequence, Callable
from pprint import pformat
import numpy
from numpy import pi
from masque import Pattern, Builder, LazyLibrary
from masque import Pather, LazyLibrary
from masque.file.gdsii import writefile, load_libraryfile
import pcgen
import basic_shapes
import devices
from devices import ports_to_data, data_to_ports
from devices import data_to_ports
from basic_shapes import GDS_OPTS
def main() -> None:
# Define a `LazyLibrary`, which provides lazy evaluation for generating
# patterns and lazy-loading of GDS contents.
# A `LazyLibrary` delays work until a pattern is actually needed.
# That applies both to GDS cells we load from disk and to python callables
# that generate patterns on demand.
lib = LazyLibrary()
#
@ -27,9 +32,9 @@ def main() -> None:
# Scan circuit.gds and prepare to lazy-load its contents
gds_lib, _properties = load_libraryfile('circuit.gds', postprocess=data_to_ports)
# Add it into the device library by providing a way to read port info
# This maintains the lazy evaluation from above, so no patterns
# are actually read yet.
# Add those cells into our lazy library.
# Nothing is read yet; we are only registering how to fetch and postprocess
# each pattern when it is first requested.
lib.add(gds_lib)
print('Patterns loaded from GDS into library:\n' + pformat(list(lib.keys())))
@ -44,8 +49,8 @@ def main() -> None:
hole = 'triangle',
)
# Triangle-based variants. These are defined here, but they won't run until they're
# retrieved from the library.
# Triangle-based variants. These lambdas are only recipes for building the
# patterns; they do not execute until someone asks for the cell.
lib['tri_wg10'] = lambda: devices.waveguide(length=10, mirror_periods=5, **opts)
lib['tri_wg05'] = lambda: devices.waveguide(length=5, mirror_periods=5, **opts)
lib['tri_wg28'] = lambda: devices.waveguide(length=28, mirror_periods=5, **opts)
@ -57,22 +62,22 @@ def main() -> None:
# Build a mixed waveguide with an L3 cavity in the middle
#
# Immediately start building from an instance of the L3 cavity
circ2 = Builder(library=lib, ports='tri_l3cav')
# Start a new design by copying the ports from an existing library cell.
# This gives `circ2` the same external interface as `tri_l3cav`.
circ2 = Pather(library=lib, ports='tri_l3cav')
# First way to get abstracts is `lib.abstract(name)`
# We can use this syntax directly with `Pattern.plug()` and `Pattern.place()` as well as through `Builder`.
# First way to specify what we are plugging in: request an explicit abstract.
# This works with `Pattern` methods directly as well as with `Pather`.
circ2.plug(lib.abstract('wg10'), {'input': 'right'})
# Second way to get abstracts is to use an AbstractView
# This also works directly with `Pattern.plug()` / `Pattern.place()`.
# Second way: use an `AbstractView`, which behaves like a mapping of names
# to abstracts.
abstracts = lib.abstract_view()
circ2.plug(abstracts['wg10'], {'output': 'left'})
# Third way to specify an abstract works by automatically getting
# it from the library already within the Builder object.
# This wouldn't work if we only had a `Pattern` (not a `Builder`).
# Just pass the pattern name!
# Third way: let `Pather` resolve a pattern name through its own library.
# This shorthand is convenient, but it is specific to helpers that already
# carry a library reference.
circ2.plug('tri_wg10', {'input': 'right'})
circ2.plug('tri_wg10', {'output': 'left'})
@ -81,13 +86,15 @@ def main() -> None:
#
# Build a device that could plug into our mixed_wg_cav and joins the two ports
# Build a second device that is explicitly designed to mate with `circ2`.
#
# We'll be designing against an existing device's interface...
circ3 = Builder.interface(source=circ2)
# `Pather.interface()` makes a new pattern whose ports mirror an existing
# design's external interface. That is useful when you want to design an
# adapter, continuation, or mating structure.
circ3 = Pather.interface(source=circ2)
# ... that lets us continue from where we left off.
# Continue routing outward from those inherited ports.
circ3.plug('tri_bend0', {'input': 'right'})
circ3.plug('tri_bend0', {'input': 'left'}, mirrored=True) # mirror since no tri y-symmetry
circ3.plug('tri_bend0', {'input': 'right'})

@ -1,10 +1,9 @@
"""
Manual wire routing tutorial: Pather and BasicTool
Manual wire routing tutorial: Pather and AutoTool
"""
from collections.abc import Callable
from numpy import pi
from masque import Pather, RenderPather, Library, Pattern, Port, layer_t, map_layers
from masque.builder.tools import BasicTool, PathTool
from masque import Pather, Library, Pattern, Port, layer_t
from masque.builder.tools import AutoTool, Tool
from masque.file.gdsii import writefile
from basic_shapes import GDS_OPTS
@ -107,31 +106,29 @@ def map_layer(layer: layer_t) -> layer_t:
'M2': (20, 0),
'V1': (30, 0),
}
return layer_mapping.get(layer, layer)
if isinstance(layer, str):
return layer_mapping.get(layer, layer)
return layer
#
# Now we can start building up our library (collection of static cells) and pathing tools.
#
# If any of the operations below are confusing, you can cross-reference against the `RenderPather`
# tutorial, which handles some things more explicitly (e.g. via placement) and simplifies others
# (e.g. geometry definition).
#
def main() -> None:
def prepare_tools() -> tuple[Library, Tool, Tool]:
"""
Create some basic library elements and tools for drawing M1 and M2
"""
# Build some patterns (static cells) using the above functions and store them in a library
library = Library()
library['pad'] = make_pad()
library['m1_bend'] = make_bend(layer='M1', ptype='m1wire', width=M1_WIDTH)
library['m2_bend'] = make_bend(layer='M2', ptype='m2wire', width=M2_WIDTH)
library['v1_via'] = make_via(
layer_top='M2',
layer_via='V1',
layer_bot='M1',
width_top=M2_WIDTH,
width_via=V1_WIDTH,
width_bot=M1_WIDTH,
ptype_bot='m1wire',
ptype_top='m2wire',
layer_top = 'M2',
layer_via = 'V1',
layer_bot = 'M1',
width_top = M2_WIDTH,
width_via = V1_WIDTH,
width_bot = M1_WIDTH,
ptype_bot = 'm1wire',
ptype_top = 'm2wire',
)
#
@ -140,53 +137,79 @@ def main() -> None:
# M2_tool will route on M2, using wires with M2_WIDTH
# Both tools are able to automatically transition from the other wire type (with a via)
#
# Note that while we use BasicTool for this tutorial, you can define your own `Tool`
# Note that while we use AutoTool for this tutorial, you can define your own `Tool`
# with arbitrary logic inside -- e.g. with single-use bends, complex transition rules,
# transmission line geometry, or other features.
#
M1_tool = BasicTool(
straight = (
# First, we need a function which takes in a length and spits out an M1 wire
lambda length: make_straight_wire(layer='M1', ptype='m1wire', width=M1_WIDTH, length=length),
'input', # When we get a pattern from make_straight_wire, use the port named 'input' as the input
'output', # and use the port named 'output' as the output
),
bend = (
library.abstract('m1_bend'), # When we need a bend, we'll reference the pattern we generated earlier
'input', # To orient it clockwise, use the port named 'input' as the input
'output', # and 'output' as the output
),
M1_tool = AutoTool(
# First, we need a function which takes in a length and spits out an M1 wire
straights = [
AutoTool.Straight(
ptype = 'm1wire',
fn = lambda length: make_straight_wire(layer='M1', ptype='m1wire', width=M1_WIDTH, length=length),
in_port_name = 'input', # When we get a pattern from make_straight_wire, use the port named 'input' as the input
out_port_name = 'output', # and use the port named 'output' as the output
),
],
bends = [
AutoTool.Bend(
abstract = library.abstract('m1_bend'), # When we need a bend, we'll reference the pattern we generated earlier
in_port_name = 'input',
out_port_name = 'output',
clockwise = True,
),
],
transitions = { # We can automate transitions for different (normally incompatible) port types
'm2wire': ( # For example, when we're attaching to a port with type 'm2wire'
('m2wire', 'm1wire'): AutoTool.Transition( # For example, when we're attaching to a port with type 'm2wire'
library.abstract('v1_via'), # we can place a V1 via
'top', # using the port named 'top' as the input (i.e. the M2 side of the via)
'bottom', # and using the port named 'bottom' as the output
),
},
sbends = [],
default_out_ptype = 'm1wire', # Unless otherwise requested, we'll default to trying to stay on M1
)
M2_tool = BasicTool(
straight = (
M2_tool = AutoTool(
straights = [
# Again, we use make_straight_wire, but this time we set parameters for M2
lambda length: make_straight_wire(layer='M2', ptype='m2wire', width=M2_WIDTH, length=length),
'input',
'output',
),
bend = (
library.abstract('m2_bend'), # and we use an M2 bend
'input',
'output',
),
AutoTool.Straight(
ptype = 'm2wire',
fn = lambda length: make_straight_wire(layer='M2', ptype='m2wire', width=M2_WIDTH, length=length),
in_port_name = 'input',
out_port_name = 'output',
),
],
bends = [
# and we use an M2 bend
AutoTool.Bend(
abstract = library.abstract('m2_bend'),
in_port_name = 'input',
out_port_name = 'output',
),
],
transitions = {
'm1wire': (
('m1wire', 'm2wire'): AutoTool.Transition(
library.abstract('v1_via'), # We still use the same via,
'bottom', # but the input port is now 'bottom'
'top', # and the output port is now 'top'
),
},
sbends = [],
default_out_ptype = 'm2wire', # We default to trying to stay on M2
)
return library, M1_tool, M2_tool
#
# Now we can start building up our library (collection of static cells) and pathing tools.
#
# If any of the operations below are confusing, you can cross-reference against the deferred
# `Pather` tutorial, which handles some things more explicitly (e.g. via placement) and simplifies
# others (e.g. geometry definition).
#
def main() -> None:
library, M1_tool, M2_tool = prepare_tools()
#
# Create a new pather which writes to `library` and uses `M2_tool` as its default tool.
@ -203,27 +226,25 @@ def main() -> None:
# Path VCC forward (in this case south) and turn clockwise 90 degrees (ccw=False)
# The total distance forward (including the bend's forward component) must be 6um
pather.path('VCC', ccw=False, length=6_000)
pather.cw('VCC', 6_000)
# Now path VCC to x=0. This time, don't include any bend (ccw=None).
# Now path VCC to x=0. This time, don't include any bend.
# Note that if we tried y=0 here, we would get an error since the VCC port is facing in the x-direction.
pather.path_to('VCC', ccw=None, x=0)
pather.straight('VCC', x=0)
# Path GND forward by 5um, turning clockwise 90 degrees.
# This time we use shorthand (bool(0) == False) and omit the parameter labels
# Note that although ccw=0 is equivalent to ccw=False, ccw=None is not!
pather.path('GND', 0, 5_000)
pather.cw('GND', 5_000)
# This time, path GND until it matches the current x-coordinate of VCC. Don't place a bend.
pather.path_to('GND', None, x=pather['VCC'].offset[0])
pather.straight('GND', x=pather['VCC'].offset[0])
# Now, start using M1_tool for GND.
# Since we have defined an M2-to-M1 transition for BasicPather, we don't need to place one ourselves.
# Since we have defined an M2-to-M1 transition for Pather, we don't need to place one ourselves.
# If we wanted to place our via manually, we could add `pather.plug('v1_via', {'GND': 'top'})` here
# and achieve the same result without having to define any transitions in M1_tool.
# Note that even though we have changed the tool used for GND, the via doesn't get placed until
# the next time we draw a path on GND (the pather.mpath() statement below).
pather.retool(M1_tool, keys=['GND'])
# the next time we route GND (the `pather.ccw()` call below).
pather.retool(M1_tool, keys='GND')
# Bundle together GND and VCC, and path the bundle forward and counterclockwise.
# Pick the distance so that the leading/outermost wire (in this case GND) ends up at x=-10_000.
@ -231,7 +252,7 @@ def main() -> None:
#
# Since we recently retooled GND, its path starts with a via down to M1 (included in the distance
# calculation), and its straight segment and bend will be drawn using M1 while VCC's are drawn with M2.
pather.mpath(['GND', 'VCC'], ccw=True, xmax=-10_000, spacing=5_000)
pather.ccw(['GND', 'VCC'], xmax=-10_000, spacing=5_000)
# Now use M1_tool as the default tool for all ports/signals.
# Since VCC does not have an explicitly assigned tool, it will now transition down to M1.
@ -241,38 +262,37 @@ def main() -> None:
# The total extension (travel distance along the forward direction) for the longest segment (in
# this case the segment being added to GND) should be exactly 50um.
# After turning, the wire pitch should be reduced to only 1.2um.
pather.mpath(['GND', 'VCC'], ccw=True, emax=50_000, spacing=1_200)
pather.ccw(['GND', 'VCC'], emax=50_000, spacing=1_200)
# Make a U-turn with the bundle and expand back out to 4.5um wire pitch.
# Here, emin specifies the travel distance for the shortest segment. For the first mpath() call
# that applies to VCC, and for teh second call, that applies to GND; the relative lengths of the
# Here, emin specifies the travel distance for the shortest segment. For the first call
# that applies to VCC, and for the second call, that applies to GND; the relative lengths of the
# segments depend on their starting positions and their ordering within the bundle.
pather.mpath(['GND', 'VCC'], ccw=False, emin=1_000, spacing=1_200)
pather.mpath(['GND', 'VCC'], ccw=False, emin=2_000, spacing=4_500)
pather.cw(['GND', 'VCC'], emin=1_000, spacing=1_200)
pather.cw(['GND', 'VCC'], emin=2_000, spacing=4_500)
# Now, set the default tool back to M2_tool. Note that GND remains on M1 since it has been
# explicitly assigned a tool. We could `del pather.tools['GND']` to force it to use the default.
# explicitly assigned a tool.
pather.retool(M2_tool)
# Now path both ports to x=-28_000.
# When ccw is not None, xmin constrains the trailing/innermost port to stop at the target x coordinate,
# However, with ccw=None, all ports stop at the same coordinate, and so specifying xmin= or xmax= is
# With ccw=None, all ports stop at the same coordinate, and so specifying xmin= or xmax= is
# equivalent.
pather.mpath(['GND', 'VCC'], None, xmin=-28_000)
pather.straight(['GND', 'VCC'], xmin=-28_000)
# Further extend VCC out to x=-50_000, and specify that we would like to get an output on M1.
# This results in a via at the end of the wire (instead of having one at the start, as we got
# when using pather.retool()).
pather.path_to('VCC', None, -50_000, out_ptype='m1wire')
pather.straight('VCC', x=-50_000, out_ptype='m1wire')
# Now extend GND out to x=-50_000, using M2 for a portion of the path.
# We can use `pather.toolctx()` to temporarily retool, instead of calling `retool()` twice.
with pather.toolctx(M2_tool, keys=['GND']):
pather.path_to('GND', None, -40_000)
pather.path_to('GND', None, -50_000)
with pather.toolctx(M2_tool, keys='GND'):
pather.straight('GND', x=-40_000)
pather.straight('GND', x=-50_000)
# Save the pather's pattern into our library
library['Pather_and_BasicTool'] = pather.pattern
library['Pather_and_AutoTool'] = pather.pattern
# Convert from text-based layers to numeric layers for GDS, and output the file
library.map_layers(map_layer)


@ -2,7 +2,7 @@
Routines for creating normalized 2D lattices and common photonic crystal
cavity designs.
"""
from collection.abc import Sequence
from collections.abc import Sequence
import numpy
from numpy.typing import ArrayLike, NDArray
@ -50,7 +50,7 @@ def triangular_lattice(
elif origin == 'corner':
pass
else:
raise Exception(f'Invalid value for `origin`: {origin}')
raise ValueError(f'Invalid value for `origin`: {origin}')
return xy[xy[:, 0].argsort(), :]
@ -197,12 +197,12 @@ def ln_defect(
`[[x0, y0], [x1, y1], ...]` for all the holes
"""
if defect_length % 2 != 1:
raise Exception('defect_length must be odd!')
p = triangular_lattice([2 * d + 1 for d in mirror_dims])
raise ValueError('defect_length must be odd!')
pp = triangular_lattice([2 * dd + 1 for dd in mirror_dims])
half_length = numpy.floor(defect_length / 2)
hole_nums = numpy.arange(-half_length, half_length + 1)
holes_to_keep = numpy.in1d(p[:, 0], hole_nums, invert=True)
return p[numpy.logical_or(holes_to_keep, p[:, 1] != 0), ]
holes_to_keep = numpy.isin(pp[:, 0], hole_nums, invert=True)
return pp[numpy.logical_or(holes_to_keep, pp[:, 1] != 0), :]
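`numpy.in1d` is deprecated in favor of `numpy.isin`, and the defect-removal logic above reduces to a boolean mask over the hole coordinates. A minimal standalone illustration with toy coordinates (not an actual lattice):

```python
import numpy

# Toy [x, y] hole coordinates; remove holes with x in hole_nums along the y == 0 row
pp = numpy.array([[-1, 0], [0, 0], [1, 0], [0, 1]])
hole_nums = numpy.arange(-1, 2)                        # -1, 0, 1

keep = numpy.isin(pp[:, 0], hole_nums, invert=True)    # True where x is NOT a defect index
result = pp[numpy.logical_or(keep, pp[:, 1] != 0), :]  # off-row holes always survive

print(result.tolist())    # [[0, 1]]
```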
def ln_shift_defect(
@ -248,7 +248,7 @@ def ln_shift_defect(
for sign in (-1, 1):
x_val = sign * (x_removed + ind + 1)
which = numpy.logical_and(xyr[:, 0] == x_val, xyr[:, 1] == 0)
xyr[which, ] = (x_val + numpy.sign(x_val) * shifts_a[ind], 0, shifts_r[ind])
xyr[which, :] = (x_val + numpy.sign(x_val) * shifts_a[ind], 0, shifts_r[ind])
return xyr
@ -309,7 +309,7 @@ def l3_shift_perturbed_defect(
# which holes should be perturbed? (xs[[3, 7]], ys[1]) and (xs[[2, 6]], ys[2])
perturbed_holes = ((xs[a], ys[b]) for a, b in ((3, 1), (7, 1), (2, 2), (6, 2)))
for row in xyr:
if numpy.fabs(row) in perturbed_holes:
row[2] = perturbed_radius
for xy in perturbed_holes:
which = (numpy.fabs(xyr[:, :2]) == xy).all(axis=1)
xyr[which, 2] = perturbed_radius
return xyr
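The old loop tested `numpy.fabs(row) in perturbed_holes`, which compares a 3-element row against 2-tuples from a single-pass generator and never matches; the replacement builds a vectorized mask per target coordinate. A toy sketch of that masked-assignment pattern:

```python
import numpy

# [x, y, radius] rows; shrink the radius of holes at |x|,|y| == (3,1) and (2,2)
xyr = numpy.array([
    [ 3.0, 1.0, 0.30],
    [-3.0, 1.0, 0.30],
    [ 2.0, 2.0, 0.30],
    [ 5.0, 1.0, 0.30],
])
perturbed_radius = 0.27

for xy in [(3.0, 1.0), (2.0, 2.0)]:
    which = (numpy.fabs(xyr[:, :2]) == xy).all(axis=1)  # rows matching |x|, |y|
    xyr[which, 2] = perturbed_radius

print(xyr[:, 2].tolist())    # [0.27, 0.27, 0.27, 0.3]
```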


@ -0,0 +1,169 @@
"""
PortPather tutorial: Using .at() syntax
"""
from masque import Pather, Pattern, Port, R90
from masque.file.gdsii import writefile
from basic_shapes import GDS_OPTS
from pather import map_layer, prepare_tools
def main() -> None:
# Reuse the same patterns (pads, bends, vias) and tools as in pather.py
library, M1_tool, M2_tool = prepare_tools()
# Create a deferred Pather and place some initial pads (same as Pather tutorial)
rpather = Pather(library, tools=M2_tool, auto_render=False)
rpather.place('pad', offset=(18_000, 30_000), port_map={'wire_port': 'VCC'})
rpather.place('pad', offset=(18_000, 60_000), port_map={'wire_port': 'GND'})
rpather.pattern.label(layer='M2', string='VCC', offset=(18e3, 30e3))
rpather.pattern.label(layer='M2', string='GND', offset=(18e3, 60e3))
#
# Routing with .at() chaining
#
# The .at(port_name) method returns a PortPather object which wraps the Pather
# and remembers the selected port(s). This allows method chaining.
# Route VCC: 6um South, then West to x=0.
# (Note: since the port points North into the pad, trace() moves South by default)
(rpather.at('VCC')
.trace(False, length=6_000) # Move South, turn West (Clockwise)
.trace_to(None, x=0) # Continue West to x=0
)
# Route GND: 5um South, then West to match VCC's x-coordinate.
rpather.at('GND').trace(False, length=5_000).trace_to(None, x=rpather['VCC'].x)
#
# Tool management and manual plugging
#
# We can use .retool() to change the tool for specific ports.
# We can also use .plug() directly on a PortPather.
# Manually add a via to GND and switch to M1_tool for subsequent segments
(rpather.at('GND')
.plug('v1_via', 'top')
.retool(M1_tool) # this only retools the 'GND' port
)
# We can also pass multiple ports to .at(), and then route them together.
# Here we bundle them, turn South, and retool both to M1 (VCC gets an auto-via).
(rpather.at(['GND', 'VCC'])
.trace(True, xmax=-10_000, spacing=5_000) # Move West to -10k, turn South
.retool(M1_tool) # Retools both GND and VCC
.trace(True, emax=50_000, spacing=1_200) # Turn East; 50um extension for the longest segment
.trace(False, emin=1_000, spacing=1_200) # U-turn back South
.trace(False, emin=2_000, spacing=4_500) # U-turn back West
)
# Retool VCC back to M2 and move both to x=-28k
rpather.at('VCC').retool(M2_tool)
rpather.at(['GND', 'VCC']).trace(None, xmin=-28_000)
# Final segments to -50k
rpather.at('VCC').trace_to(None, x=-50_000, out_ptype='m1wire')
with rpather.at('GND').toolctx(M2_tool):
rpather.at('GND').trace_to(None, x=-40_000)
rpather.at('GND').trace_to(None, x=-50_000)
#
# Branching with mark and fork
#
# .mark(new_name) creates a port copy and keeps the original selected.
# .fork(new_name) creates a port copy and selects the new one.
# Create a tap on GND
(rpather.at('GND')
.trace(None, length=5_000) # Move GND further West
.mark('GND_TAP') # Mark this location for a later branch
.jog(offset=-10_000, length=10_000) # Continue GND with an S-bend
)
# Branch VCC and follow the new branch
(rpather.at('VCC')
.trace(None, length=5_000)
.fork('VCC_BRANCH') # We are now manipulating 'VCC_BRANCH'
.trace(True, length=5_000) # VCC_BRANCH turns South
)
# The original 'VCC' port remains at x=-55k, y=VCC.y
#
# Port set management: add, drop, rename, delete
#
# Route the GND_TAP we saved earlier.
(rpather.at('GND_TAP')
.retool(M1_tool)
.trace(True, length=10_000) # Turn South
.rename('GND_FEED') # Give it a more descriptive name
.retool(M1_tool) # Re-apply tool to the new name
)
# We can manage the active set of ports in a PortPather
pp = rpather.at(['VCC_BRANCH', 'GND_FEED'])
pp.select('GND') # Now tracking 3 ports
pp.deselect('VCC_BRANCH') # Now tracking 2 ports: GND_FEED, GND
pp.trace(None, each=5_000) # Move both 5um forward (length > transition size)
# We can also delete ports from the pather entirely
rpather.at('VCC').delete() # VCC is gone (we have VCC_BRANCH instead)
#
# Advanced Connections: trace_into
#
# trace_into routes FROM the selected port TO a target port.
# Create a destination component
dest_ports = {
'in_A': Port((0, 0), rotation=R90, ptype='m2wire'),
'in_B': Port((5_000, 0), rotation=R90, ptype='m2wire')
}
library['dest'] = Pattern(ports=dest_ports)
# Place dest so that its ports are to the West and South of our current wires.
# Rotating by pi/2 makes the ports face West (i.e. their direction vectors point East).
rpather.place('dest', offset=(-100_000, -100_000), rotation=R90, port_map={'in_A': 'DEST_A', 'in_B': 'DEST_B'})
# Connect GND_FEED to DEST_A
# Since GND_FEED is moving South and DEST_A faces West, a single bend will suffice.
rpather.at('GND_FEED').trace_into('DEST_A')
# Connect VCC_BRANCH to DEST_B
rpather.at('VCC_BRANCH').trace_into('DEST_B')
#
# Direct Port Transformations and Metadata
#
(rpather.at('GND')
.set_ptype('m1wire') # Change metadata
.translate((1000, 0)) # Shift the port 1um East
.rotate(R90 / 2) # Rotate it 45 degrees
.set_rotation(R90) # Force it to face West
)
# Demonstrate .plugged() to acknowledge a manual connection
# (Normally used when you place components so their ports perfectly overlap)
rpather.add_port_pair(offset=(0, 0), names=('TMP1', 'TMP2'))
rpather.at('TMP1').plugged('TMP2') # Removes both ports
#
# Rendering and Saving
#
# Since we deferred auto-rendering, we must call .render() to generate the geometry.
rpather.render()
library['PortPather_Tutorial'] = rpather.pattern
library.map_layers(map_layer)
writefile(library, 'port_pather.gds', **GDS_OPTS)
print("Tutorial complete. Output written to port_pather.gds")
if __name__ == '__main__':
main()
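The chaining above works because each PortPather method returns the wrapper itself. A masque-free sketch of that fluent pattern (class and method names here are illustrative, not masque's actual implementation):

```python
class FluentRouter:
    """Toy router that records operations; each method returns self to enable chaining."""
    def __init__(self, port):
        self.port = port
        self.ops = []

    def trace(self, ccw, **kwargs):
        self.ops.append(('trace', ccw, kwargs))
        return self          # returning self is what allows .trace(...).retool(...)

    def retool(self, tool):
        self.ops.append(('retool', tool))
        return self

router = FluentRouter('VCC')
router.trace(False, length=6_000).retool('M1_tool').trace(None, x=0)
print(len(router.ops))    # 3
```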


@ -1,8 +1,7 @@
"""
Manual wire routing tutorial: RenderPather an PathTool
Manual wire routing tutorial: deferred Pather and PathTool
"""
from collections.abc import Callable
from masque import RenderPather, Library, Pattern, Port, layer_t, map_layers
from masque import Pather, Library
from masque.builder.tools import PathTool
from masque.file.gdsii import writefile
@ -12,9 +11,9 @@ from pather import M1_WIDTH, V1_WIDTH, M2_WIDTH, map_layer, make_pad, make_via
def main() -> None:
#
# To illustrate the advantages of using `RenderPather`, we use `PathTool` instead
# of `BasicTool`. `PathTool` lacks some sophistication (e.g. no automatic transitions)
# but when used with `RenderPather`, it can consolidate multiple routing steps into
# To illustrate deferred routing with `Pather`, we use `PathTool` instead
# of `AutoTool`. `PathTool` lacks some sophistication (e.g. no automatic transitions)
# but when used with `Pather(auto_render=False)`, it can consolidate multiple routing steps into
# a single `Path` shape.
#
# We'll try to nearly replicate the layout from the `Pather` tutorial; see `pather.py`
@ -25,66 +24,68 @@ def main() -> None:
library = Library()
library['pad'] = make_pad()
library['v1_via'] = make_via(
layer_top='M2',
layer_via='V1',
layer_bot='M1',
width_top=M2_WIDTH,
width_via=V1_WIDTH,
width_bot=M1_WIDTH,
ptype_bot='m1wire',
ptype_top='m2wire',
layer_top = 'M2',
layer_via = 'V1',
layer_bot = 'M1',
width_top = M2_WIDTH,
width_via = V1_WIDTH,
width_bot = M1_WIDTH,
ptype_bot = 'm1wire',
ptype_top = 'm2wire',
)
# `PathTool` is more limited than `BasicTool`. It only generates one type of shape
# `PathTool` is more limited than `AutoTool`. It only generates one type of shape
# (`Path`), so it only needs to know what layer to draw on, what width to draw with,
# and what port type to present.
M1_ptool = PathTool(layer='M1', width=M1_WIDTH, ptype='m1wire')
M2_ptool = PathTool(layer='M2', width=M2_WIDTH, ptype='m2wire')
rpather = RenderPather(tools=M2_ptool, library=library)
rpather = Pather(tools=M2_ptool, library=library, auto_render=False)
# As in the pather tutorial, we make soem pads and labels...
# As in the pather tutorial, we make some pads and labels...
rpather.place('pad', offset=(18_000, 30_000), port_map={'wire_port': 'VCC'})
rpather.place('pad', offset=(18_000, 60_000), port_map={'wire_port': 'GND'})
rpather.pattern.label(layer='M2', string='VCC', offset=(18e3, 30e3))
rpather.pattern.label(layer='M2', string='GND', offset=(18e3, 60e3))
# ...and start routing the signals.
rpather.path('VCC', ccw=False, length=6_000)
rpather.path_to('VCC', ccw=None, x=0)
rpather.path('GND', 0, 5_000)
rpather.path_to('GND', None, x=rpather['VCC'].offset[0])
rpather.cw('VCC', 6_000)
rpather.straight('VCC', x=0)
rpather.cw('GND', 5_000)
rpather.straight('GND', x=rpather.pattern['VCC'].x)
# `PathTool` doesn't know how to transition between metal layers, so we have to
# `plug` the via into the GND wire ourselves.
rpather.plug('v1_via', {'GND': 'top'})
rpather.retool(M1_ptool, keys=['GND'])
rpather.mpath(['GND', 'VCC'], ccw=True, xmax=-10_000, spacing=5_000)
rpather.retool(M1_ptool, keys='GND')
rpather.ccw(['GND', 'VCC'], xmax=-10_000, spacing=5_000)
# Same thing on the VCC wire when it goes down to M1.
rpather.plug('v1_via', {'VCC': 'top'})
rpather.retool(M1_ptool)
rpather.mpath(['GND', 'VCC'], ccw=True, emax=50_000, spacing=1_200)
rpather.mpath(['GND', 'VCC'], ccw=False, emin=1_000, spacing=1_200)
rpather.mpath(['GND', 'VCC'], ccw=False, emin=2_000, spacing=4_500)
rpather.ccw(['GND', 'VCC'], emax=50_000, spacing=1_200)
rpather.cw(['GND', 'VCC'], emin=1_000, spacing=1_200)
rpather.cw(['GND', 'VCC'], emin=2_000, spacing=4_500)
# And again when VCC goes back up to M2.
rpather.plug('v1_via', {'VCC': 'bottom'})
rpather.retool(M2_ptool)
rpather.mpath(['GND', 'VCC'], None, xmin=-28_000)
rpather.straight(['GND', 'VCC'], xmin=-28_000)
# Finally, since PathTool has no conception of transitions, we can't
# just ask it to transition to an 'm1wire' port at the end of the final VCC segment.
# Instead, we have to calculate the via size ourselves, and adjust the final position
# to account for it.
via_size = abs(
library['v1_via'].ports['top'].offset[0]
- library['v1_via'].ports['bottom'].offset[0]
)
rpather.path_to('VCC', None, -50_000 + via_size)
v1pat = library['v1_via']
via_size = abs(v1pat.ports['top'].x - v1pat.ports['bottom'].x)
# alternatively, via_size = v1pat.ports['top'].measure_travel(v1pat.ports['bottom'])[0][0]
# would take into account the port orientations if we didn't already know they're along x
rpather.straight('VCC', x=-50_000 + via_size)
rpather.plug('v1_via', {'VCC': 'top'})
# Render the path we defined
rpather.render()
library['RenderPather_and_PathTool'] = rpather.pattern
library['Deferred_Pather_and_PathTool'] = rpather.pattern
# Convert from text-based layers to numeric layers for GDS, and output the file


@ -42,6 +42,7 @@ from .error import (
from .shapes import (
Shape as Shape,
Polygon as Polygon,
RectCollection as RectCollection,
Path as Path,
Circle as Circle,
Arc as Arc,
@ -55,6 +56,7 @@ from .pattern import (
map_targets as map_targets,
chain_elements as chain_elements,
)
from .utils.boolean import boolean as boolean
from .library import (
ILibraryView as ILibraryView,
@ -72,13 +74,13 @@ from .ports import (
)
from .abstract import Abstract as Abstract
from .builder import (
Builder as Builder,
Tool as Tool,
Pather as Pather,
RenderPather as RenderPather,
RenderStep as RenderStep,
BasicTool as BasicTool,
SimpleTool as SimpleTool,
AutoTool as AutoTool,
PathTool as PathTool,
PortPather as PortPather,
)
from .utils import (
ports2data as ports2data,


@ -8,22 +8,20 @@ from numpy.typing import ArrayLike
from .ref import Ref
from .ports import PortList, Port
from .utils import rotation_matrix_2d
#if TYPE_CHECKING:
# from .builder import Builder, Tool
# from .library import ILibrary
from .traits import Mirrorable
logger = logging.getLogger(__name__)
class Abstract(PortList):
class Abstract(PortList, Mirrorable):
"""
An `Abstract` is a container for a name and associated ports.
When snapping a sub-component to an existing pattern, only the name and port info
are needed, not the geometry itself (so the full `Pattern` need not be loaded).
"""
# Alternate design option: do we want to store a Ref instead of just a name? then we can translate/rotate/mirror...
__slots__ = ('name', '_ports')
name: str
@ -48,8 +46,6 @@ class Abstract(PortList):
self.name = name
self.ports = copy.deepcopy(ports)
# TODO do we want to store a Ref instead of just a name? then we can translate/rotate/mirror...
def __repr__(self) -> str:
s = f'<Abstract {self.name} ['
for name, port in self.ports.items():
@ -88,7 +84,7 @@ class Abstract(PortList):
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self:
"""
Rotate the Abstract around the a location.
Rotate the Abstract around a pivot point.
Args:
pivot: (x, y) location to rotate around
@ -132,50 +128,18 @@ class Abstract(PortList):
port.rotate(rotation)
return self
def mirror_port_offsets(self, across_axis: int = 0) -> Self:
def mirror(self, axis: int = 0) -> Self:
"""
Mirror the offsets of all shapes, labels, and refs across an axis
Mirror the Abstract across an axis through its origin.
Args:
across_axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
axis: Axis to mirror across (0: x-axis, 1: y-axis).
Returns:
self
"""
for port in self.ports.values():
port.offset[across_axis - 1] *= -1
return self
def mirror_ports(self, across_axis: int = 0) -> Self:
"""
Mirror each port's rotation across an axis, relative to its
offset
Args:
across_axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
Returns:
self
"""
for port in self.ports.values():
port.mirror(across_axis)
return self
def mirror(self, across_axis: int = 0) -> Self:
"""
Mirror the Pattern across an axis
Args:
axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
Returns:
self
"""
self.mirror_ports(across_axis)
self.mirror_port_offsets(across_axis)
port.flip_across(axis=axis)
return self
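The removed `mirror_port_offsets()` relied on the index trick `offset[across_axis - 1] *= -1`: mirroring across the x axis (axis 0) negates y, and axis 1 negates x, since index `-1` wraps to the last element. A small standalone check of that arithmetic (hypothetical helper, not masque code):

```python
import numpy

def mirror_offset(offset, across_axis=0):
    """Negate the coordinate perpendicular to the mirror axis."""
    out = numpy.array(offset, dtype=float)
    out[across_axis - 1] *= -1    # axis 0 -> index -1 (y); axis 1 -> index 0 (x)
    return out

print(mirror_offset((3.0, 2.0), across_axis=0).tolist())   # [3.0, -2.0]
print(mirror_offset((3.0, 2.0), across_axis=1).tolist())   # [-3.0, 2.0]
```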
def apply_ref_transform(self, ref: Ref) -> Self:
@ -193,6 +157,8 @@ class Abstract(PortList):
self.mirror()
self.rotate_ports(ref.rotation)
self.rotate_port_offsets(ref.rotation)
if ref.scale != 1:
self.scale_by(ref.scale)
self.translate_ports(ref.offset)
return self
@ -210,6 +176,8 @@ class Abstract(PortList):
# TODO test undo_ref_transform
"""
self.translate_ports(-ref.offset)
if ref.scale != 1:
self.scale_by(1 / ref.scale)
self.rotate_port_offsets(-ref.rotation)
self.rotate_ports(-ref.rotation)
if ref.mirrored:


@ -1,10 +1,13 @@
from .builder import Builder as Builder
from .pather import Pather as Pather
from .renderpather import RenderPather as RenderPather
from .pather import (
Pather as Pather,
PortPather as PortPather,
)
from .utils import ell as ell
from .tools import (
Tool as Tool,
RenderStep as RenderStep,
BasicTool as BasicTool,
SimpleTool as SimpleTool,
AutoTool as AutoTool,
PathTool as PathTool,
)
)
from .logging import logged_op as logged_op


@ -1,443 +0,0 @@
"""
Simplified Pattern assembly (`Builder`)
"""
from typing import Self
from collections.abc import Iterable, Sequence, Mapping
import copy
import logging
from functools import wraps
from numpy.typing import ArrayLike
from ..pattern import Pattern
from ..library import ILibrary, TreeView
from ..error import BuildError
from ..ports import PortList, Port
from ..abstract import Abstract
logger = logging.getLogger(__name__)
class Builder(PortList):
"""
A `Builder` is a helper object used for snapping together multiple
lower-level patterns at their `Port`s.
The `Builder` mostly just holds context, in the form of a `Library`,
in addition to its underlying pattern. This simplifies some calls
to `plug` and `place`, by making the library implicit.
`Builder` can also be `set_dead()`, at which point further calls to `plug()`
and `place()` are ignored (intended for debugging).
Examples: Creating a Builder
===========================
- `Builder(library, ports={'A': port_a, 'C': port_c}, name='mypat')` makes
an empty pattern, adds the given ports, and places it into `library`
under the name `'mypat'`.
- `Builder(library)` makes an empty pattern with no ports. The pattern
is not added into `library` and must later be added with e.g.
`library['mypat'] = builder.pattern`
- `Builder(library, pattern=pattern, name='mypat')` uses an existing
pattern (including its ports) and sets `library['mypat'] = pattern`.
- `Builder.interface(other_pat, port_map=['A', 'B'], library=library)`
makes a new (empty) pattern, copies over ports 'A' and 'B' from
`other_pat`, and creates additional ports 'in_A' and 'in_B' facing
in the opposite directions. This can be used to build a device which
can plug into `other_pat` (using the 'in_*' ports) but which does not
itself include `other_pat` as a subcomponent.
- `Builder.interface(other_builder, ...)` does the same thing as
`Builder.interface(other_builder.pattern, ...)` but also uses
`other_builder.library` as its library by default.
Examples: Adding to a pattern
=============================
- `my_device.plug(subdevice, {'A': 'C', 'B': 'B'}, map_out={'D': 'myport'})`
instantiates `subdevice` into `my_device`, plugging ports 'A' and 'B'
of `my_device` into ports 'C' and 'B' of `subdevice`. The connected ports
are removed and any unconnected ports from `subdevice` are added to
`my_device`. Port 'D' of `subdevice` (unconnected) is renamed to 'myport'.
- `my_device.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport'
of `my_device`. If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out`,
argument is provided, and the `inherit_name` argument is not explicitly
set to `False`, the unconnected port of `wire` is automatically renamed to
'myport'. This allows easy extension of existing ports without changing
their names or having to provide `map_out` each time `plug` is called.
- `my_device.place(pad, offset=(10, 10), rotation=pi / 2, port_map={'A': 'gnd'})`
instantiates `pad` at the specified (x, y) offset and with the specified
rotation, adding its ports to those of `my_device`. Port 'A' of `pad` is
renamed to 'gnd' so that further routing can use this signal or net name
rather than the port name on the original `pad` device.
"""
__slots__ = ('pattern', 'library', '_dead')
pattern: Pattern
""" Layout of this device """
library: ILibrary
"""
Library from which patterns should be referenced
"""
_dead: bool
""" If True, plug()/place() are skipped (for debugging)"""
@property
def ports(self) -> dict[str, Port]:
return self.pattern.ports
@ports.setter
def ports(self, value: dict[str, Port]) -> None:
self.pattern.ports = value
def __init__(
self,
library: ILibrary,
*,
pattern: Pattern | None = None,
ports: str | Mapping[str, Port] | None = None,
name: str | None = None,
) -> None:
"""
Args:
library: The library from which referenced patterns will be taken
pattern: The pattern which will be modified by subsequent operations.
If `None` (default), a new pattern is created.
ports: Allows specifying the initial set of ports, if `pattern` does
not already have any ports (or is not provided). May be a string,
in which case it is interpreted as a name in `library`.
Default `None` (no ports).
name: If specified, `library[name]` is set to `self.pattern`.
"""
self._dead = False
self.library = library
if pattern is not None:
self.pattern = pattern
else:
self.pattern = Pattern()
if ports is not None:
if self.pattern.ports:
raise BuildError('Ports supplied for pattern with pre-existing ports!')
if isinstance(ports, str):
ports = library.abstract(ports).ports
self.pattern.ports.update(copy.deepcopy(dict(ports)))
if name is not None:
library[name] = self.pattern
@classmethod
def interface(
cls: type['Builder'],
source: PortList | Mapping[str, Port] | str,
*,
library: ILibrary | None = None,
in_prefix: str = 'in_',
out_prefix: str = '',
port_map: dict[str, str] | Sequence[str] | None = None,
name: str | None = None,
) -> 'Builder':
"""
Wrapper for `Pattern.interface()`, which returns a Builder instead.
Args:
source: A collection of ports (e.g. Pattern, Builder, or dict)
from which to create the interface. May be a pattern name if
`library` is provided.
library: Library from which existing patterns should be referenced,
and to which the new one should be added (if named). If not provided,
`source.library` must exist and will be used.
in_prefix: Prepended to port names for newly-created ports with
reversed directions compared to the current device.
out_prefix: Prepended to port names for ports which are directly
copied from the current device.
port_map: Specification for ports to copy into the new device:
- If `None`, all ports are copied.
- If a sequence, only the listed ports are copied
- If a mapping, the listed ports (keys) are copied and
renamed (to the values).
Returns:
The new builder, with an empty pattern and 2x as many ports as
listed in port_map.
Raises:
`PortError` if `port_map` contains port names not present in the
current device.
`PortError` if applying the prefixes results in duplicate port
names.
"""
if library is None:
if hasattr(source, 'library') and isinstance(source.library, ILibrary):
library = source.library
else:
raise BuildError('No library was given, and `source` has no usable `.library` either.')
if isinstance(source, str):
source = library.abstract(source).ports
pat = Pattern.interface(source, in_prefix=in_prefix, out_prefix=out_prefix, port_map=port_map)
new = Builder(library=library, pattern=pat, name=name)
return new
@wraps(Pattern.label)
def label(self, *args, **kwargs) -> Self:
self.pattern.label(*args, **kwargs)
return self
@wraps(Pattern.ref)
def ref(self, *args, **kwargs) -> Self:
self.pattern.ref(*args, **kwargs)
return self
@wraps(Pattern.polygon)
def polygon(self, *args, **kwargs) -> Self:
self.pattern.polygon(*args, **kwargs)
return self
@wraps(Pattern.rect)
def rect(self, *args, **kwargs) -> Self:
self.pattern.rect(*args, **kwargs)
return self
# Note: We're a superclass of `Pather`, where path() means something different...
#@wraps(Pattern.path)
#def path(self, *args, **kwargs) -> Self:
# self.pattern.path(*args, **kwargs)
# return self
def plug(
self,
other: Abstract | str | Pattern | TreeView,
map_in: dict[str, str],
map_out: dict[str, str | None] | None = None,
*,
mirrored: bool = False,
inherit_name: bool = True,
set_rotation: bool | None = None,
append: bool = False,
ok_connections: Iterable[tuple[str, str]] = (),
) -> Self:
"""
Wrapper around `Pattern.plug` which allows a string for `other`.
The `Builder`'s library is used to dereference the string (or `Abstract`, if
one is passed with `append=True`). If a `TreeView` is passed, it is first
added into `self.library`.
Args:
other: An `Abstract`, string, `Pattern`, or `TreeView` describing the
device to be instantiated. If it is a `TreeView`, it is first
added into `self.library`, after which the topcell is plugged;
an equivalent statement is `self.plug(self.library << other, ...)`.
map_in: dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in `other`.
mirrored: Enables mirroring `other` across the x axis prior to
connecting any ports.
inherit_name: If `True`, and `map_in` specifies only a single port,
and `map_out` is `None`, and `other` has only two ports total,
then automatically renames the output port of `other` to the
name of the port from `self` that appears in `map_in`. This
makes it easy to extend a device with simple 2-port devices
(e.g. wires) without providing `map_out` each time `plug` is
called. See "Examples" above for more info. Default `True`.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
ok_connections: Set of "allowed" ptype combinations. Identical
ptypes are always allowed to connect, as is `'unk'` with
any other ptype. Non-allowed ptype connections will emit a
warning. Order is ignored, i.e. `(a, b)` is equivalent to
`(b, a)`.
Returns:
self
Raises:
`PortError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other`'s ports.
`PortError` if there are any duplicate names after `map_in` and `map_out`
are applied.
`PortError` if the specified port mapping is not achievable (the ports
do not line up)
"""
if self._dead:
logger.error('Skipping plug() since device is dead')
return self
if not isinstance(other, str | Abstract | Pattern):
# We got a Tree; add it into self.library and grab an Abstract for it
other = self.library << other
if isinstance(other, str):
other = self.library.abstract(other)
if append and isinstance(other, Abstract):
other = self.library[other.name]
self.pattern.plug(
other=other,
map_in=map_in,
map_out=map_out,
mirrored=mirrored,
inherit_name=inherit_name,
set_rotation=set_rotation,
append=append,
ok_connections=ok_connections,
)
return self
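The `inherit_name` rule can be modeled in isolation. This is a hedged sketch with a made-up helper name; the real renaming happens inside `Pattern.plug`:

```python
def resolve_map_out(map_in, map_out, other_port_names, inherit_name=True):
    """When plugging a simple 2-port device through a single connection,
    the surviving port of `other` inherits the name of the `self` port it
    was plugged into (sketch of the inherit_name behavior)."""
    if (inherit_name
            and map_out is None
            and len(map_in) == 1
            and len(other_port_names) == 2):
        (self_port, other_in), = map_in.items()
        (other_out,) = set(other_port_names) - {other_in}
        return {other_out: self_port}
    return map_out if map_out is not None else {}
```

Plugging a wire with ports `A`/`B` into `self`'s port `out` via `map_in={'out': 'A'}` renames the wire's `B` back to `out`, so chained `plug()` calls can keep addressing the same port name.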
def place(
self,
other: Abstract | str | Pattern | TreeView,
*,
offset: ArrayLike = (0, 0),
rotation: float = 0,
pivot: ArrayLike = (0, 0),
mirrored: bool = False,
port_map: dict[str, str | None] | None = None,
skip_port_check: bool = False,
append: bool = False,
) -> Self:
"""
Wrapper around `Pattern.place` which allows a string or `TreeView` for `other`.
The `Builder`'s library is used to dereference the string (or `Abstract`, if
one is passed with `append=True`). If a `TreeView` is passed, it is first
added into `self.library`.
Args:
other: An `Abstract`, string, `Pattern`, or `TreeView` describing the
device to be instantiated. If it is a `TreeView`, it is first
added into `self.library`, after which the topcell is placed;
an equivalent statement is `self.place(self.library << other, ...)`.
offset: Offset at which to place the instance. Default (0, 0).
rotation: Rotation applied to the instance before placement. Default 0.
pivot: Rotation is applied around this pivot point (default (0, 0)).
Rotation is applied prior to translation (`offset`).
mirrored: Whether the instance should be mirrored across the x axis.
Mirroring is applied before translation and rotation.
port_map: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in the instantiated device. New names can be
`None`, which will delete those ports.
skip_port_check: Can be used to skip the internal call to `check_ports`,
in case it has already been performed elsewhere.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
Returns:
self
Raises:
`PortError` if any ports specified in `port_map` do not
exist in `other.ports`.
`PortError` if there are any duplicate names after `port_map`
is applied.
"""
if self._dead:
logger.error('Skipping place() since device is dead')
return self
if not isinstance(other, str | Abstract | Pattern):
# We got a Tree; add it into self.library and grab an Abstract for it
other = self.library << other
if isinstance(other, str):
other = self.library.abstract(other)
if append and isinstance(other, Abstract):
other = self.library[other.name]
self.pattern.place(
other=other,
offset=offset,
rotation=rotation,
pivot=pivot,
mirrored=mirrored,
port_map=port_map,
skip_port_check=skip_port_check,
append=append,
)
return self
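The documented transform order (mirror across the x axis, rotate about `pivot`, then translate by `offset`) can be sketched for a single point. This is an illustrative model of the ordering only, not the library's internals:

```python
from math import cos, sin, pi

def place_point(point, offset=(0.0, 0.0), rotation=0.0,
                pivot=(0.0, 0.0), mirrored=False):
    x, y = point
    if mirrored:            # mirror across the x axis first
        y = -y
    px, py = pivot          # then rotate around the pivot point
    dx, dy = x - px, y - py
    x = px + dx * cos(rotation) - dy * sin(rotation)
    y = py + dx * sin(rotation) + dy * cos(rotation)
    return (x + offset[0], y + offset[1])   # translate last
```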
def translate(self, offset: ArrayLike) -> Self:
"""
Translate the pattern and all ports.
Args:
offset: (x, y) distance to translate by
Returns:
self
"""
self.pattern.translate_elements(offset)
return self
def rotate_around(self, pivot: ArrayLike, angle: float) -> Self:
"""
Rotate the pattern and all ports.
Args:
angle: angle (radians, counterclockwise) to rotate by
pivot: location to rotate around
Returns:
self
"""
self.pattern.rotate_around(pivot, angle)
for port in self.ports.values():
port.rotate_around(pivot, angle)
return self
def mirror(self, axis: int = 0) -> Self:
"""
Mirror the pattern and all ports across the specified axis.
Args:
axis: Axis to mirror across (x=0, y=1)
Returns:
self
"""
self.pattern.mirror(axis)
return self
def set_dead(self) -> Self:
"""
Disallows further changes through `plug()` or `place()`.
This is meant for debugging:
```
dev.plug(a, ...)
dev.set_dead() # added for debug purposes
dev.plug(b, ...) # usually raises an error, but now skipped
dev.plug(c, ...) # also skipped
dev.pattern.visualize() # shows the device as of the set_dead() call
```
Returns:
self
"""
self._dead = True
return self
def __repr__(self) -> str:
s = f'<Builder {self.pattern} L({len(self.library)})>'
return s

120
masque/builder/logging.py Normal file

@ -0,0 +1,120 @@
"""
Logging and operation decorators for Pather
"""
from typing import TYPE_CHECKING, Any
from collections.abc import Iterator, Sequence, Callable
import logging
from functools import wraps
import inspect
import numpy
from contextlib import contextmanager
if TYPE_CHECKING:
from .pather import Pather
logger = logging.getLogger(__name__)
def _format_log_args(**kwargs) -> str:
arg_strs = []
for k, v in kwargs.items():
if isinstance(v, str | int | float | bool | None):
arg_strs.append(f"{k}={v}")
elif isinstance(v, numpy.ndarray):
arg_strs.append(f"{k}={v.tolist()}")
elif isinstance(v, list | tuple) and len(v) <= 10:
arg_strs.append(f"{k}={v}")
else:
arg_strs.append(f"{k}=...")
return ", ".join(arg_strs)
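The truncation rules above can be demonstrated with a numpy-free stand-in (an assumed sketch; the real function additionally converts `numpy.ndarray` values via `.tolist()`):

```python
def summarize(**kwargs):
    # Scalars and short sequences are printed; everything else becomes "..."
    parts = []
    for k, v in kwargs.items():
        if isinstance(v, (str, int, float, bool, type(None))):
            parts.append(f"{k}={v}")
        elif isinstance(v, (list, tuple)) and len(v) <= 10:
            parts.append(f"{k}={v}")
        else:
            parts.append(f"{k}=...")
    return ", ".join(parts)

print(summarize(length=40.0, ccw=None, pts=list(range(100))))
# length=40.0, ccw=None, pts=...
```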
class PatherLogger:
"""
Encapsulates state for Pather diagnostic logging.
"""
debug: bool
indent: int
depth: int
def __init__(self, debug: bool = False) -> None:
self.debug = debug
self.indent = 0
self.depth = 0
def _log(self, module_name: str, msg: str) -> None:
if self.debug and self.depth <= 1:
log_obj = logging.getLogger(module_name)
log_obj.info(' ' * self.indent + msg)
@contextmanager
def log_operation(
self,
pather: 'Pather',
op: str,
portspec: str | Sequence[str] | None = None,
**kwargs: Any,
) -> Iterator[None]:
if not self.debug or self.depth > 0:
self.depth += 1
try:
yield
finally:
self.depth -= 1
return
target = f"({portspec})" if portspec else ""
module_name = pather.__class__.__module__
self._log(module_name, f"Operation: {op}{target} {_format_log_args(**kwargs)}")
before_ports = {name: port.copy() for name, port in pather.ports.items()}
self.depth += 1
self.indent += 1
try:
yield
finally:
after_ports = pather.ports
for name in sorted(after_ports.keys()):
if name not in before_ports or after_ports[name] != before_ports[name]:
self._log(module_name, f"Port {name}: {pather.ports[name].describe()}")
for name in sorted(before_ports.keys()):
if name not in after_ports:
self._log(module_name, f"Port {name}: removed")
self.indent -= 1
self.depth -= 1
def logged_op(
portspec_getter: Callable[[dict[str, Any]], str | Sequence[str] | None] | None = None,
) -> Callable[[Callable[..., Any]], Callable[..., Any]]:
"""
Decorator to wrap Pather methods with logging.
"""
def decorator(func: Callable[..., Any]) -> Callable[..., Any]:
sig = inspect.signature(func)
@wraps(func)
def wrapper(self: 'Pather', *args: Any, **kwargs: Any) -> Any:
logger_obj = getattr(self, '_logger', None)
if logger_obj is None or not logger_obj.debug:
return func(self, *args, **kwargs)
bound = sig.bind(self, *args, **kwargs)
bound.apply_defaults()
all_args = bound.arguments
# remove 'self' from logged args
logged_args = {k: v for k, v in all_args.items() if k != 'self'}
ps = portspec_getter(all_args) if portspec_getter else None
# Remove portspec from logged_args if it's there to avoid duplicate arg to log_operation
logged_args.pop('portspec', None)
with logger_obj.log_operation(self, func.__name__, ps, **logged_args):
if getattr(self, '_dead', False) and func.__name__ in ('plug', 'place'):
logger.warning(f"Skipping geometry for {func.__name__}() since device is dead")
return func(self, *args, **kwargs)
return wrapper
return decorator
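The core trick in `logged_op` is binding `*args`/`**kwargs` to parameter names via `inspect.signature`, so arguments can be logged uniformly regardless of how they were passed. A minimal self-contained sketch (`route` is a made-up function, and `last_args` a stand-in for the logging call):

```python
import inspect
from functools import wraps

def log_args(func):
    """Capture the fully-bound argument dict on each call (sketch)."""
    sig = inspect.signature(func)
    @wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        bound.apply_defaults()              # fill in unpassed defaults
        wrapper.last_args = dict(bound.arguments)
        return func(*args, **kwargs)
    return wrapper

@log_args
def route(portspec, length, ccw=None):
    return length
```

Positional and keyword call styles now produce the same logged mapping, which is what lets `logged_op` extract `portspec` and strip `self` consistently.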

File diff suppressed because it is too large


@ -1,703 +0,0 @@
"""
Pather with batched (multi-step) rendering
"""
from typing import Self
from collections.abc import Sequence, Mapping, MutableMapping
import copy
import logging
from collections import defaultdict
from pprint import pformat
import numpy
from numpy import pi
from numpy.typing import ArrayLike
from ..pattern import Pattern
from ..library import ILibrary
from ..error import PortError, BuildError
from ..ports import PortList, Port
from ..abstract import Abstract
from ..utils import SupportsBool
from .tools import Tool, RenderStep
from .utils import ell
logger = logging.getLogger(__name__)
class RenderPather(PortList):
"""
`RenderPather` is an alternative to `Pather` which uses the `path`/`path_to`/`mpath`
functions to plan out wire paths without incrementally generating the layout. Instead,
it waits until `render` is called, at which point it draws all the planned segments
simultaneously. This allows it to e.g. draw each wire using a single `Path` or
`Polygon` shape instead of multiple rectangles.
`RenderPather` calls out to `Tool.planL` and `Tool.render` to provide tool-specific
dimensions and build the final geometry for each wire. `Tool.planL` provides the
output port data (relative to the input) for each segment. The tool, input and output
ports are placed into a `RenderStep`, and a sequence of `RenderStep`s is stored for
each port. When `render` is called, it bundles `RenderStep`s into batches which use
the same `Tool`, and passes each batch to the relevant tool's `Tool.render` to build
the geometry.
See `Pather` for routing examples. After routing is complete, `render` must be called
to generate the final geometry.
"""
__slots__ = ('pattern', 'library', 'paths', 'tools', '_dead', )
pattern: Pattern
""" Layout of this device """
library: ILibrary
""" Library from which patterns should be referenced """
_dead: bool
""" If True, plug()/place() are skipped (for debugging) """
paths: defaultdict[str, list[RenderStep]]
""" Per-port list of operations, to be used by `render` """
tools: dict[str | None, Tool]
"""
Tool objects are used to dynamically generate new single-use Devices
(e.g. wires or waveguides) to be plugged into this device.
"""
@property
def ports(self) -> dict[str, Port]:
return self.pattern.ports
@ports.setter
def ports(self, value: dict[str, Port]) -> None:
self.pattern.ports = value
def __init__(
self,
library: ILibrary,
*,
pattern: Pattern | None = None,
ports: str | Mapping[str, Port] | None = None,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
name: str | None = None,
) -> None:
"""
Args:
library: The library from which referenced patterns will be taken,
and where new patterns (e.g. generated by the `tools`) will be placed.
pattern: The pattern which will be modified by subsequent operations.
If `None` (default), a new pattern is created.
ports: Allows specifying the initial set of ports, if `pattern` does
not already have any ports (or is not provided). May be a string,
in which case it is interpreted as a name in `library`.
Default `None` (no ports).
tools: A mapping of {port: tool} which specifies what `Tool` should be used
to generate waveguide or wire segments when `path`/`path_to`/`mpath`
are called. Relies on `Tool.planL` and `Tool.render` implementations.
name: If specified, `library[name]` is set to `self.pattern`.
"""
self._dead = False
self.paths = defaultdict(list)
self.library = library
if pattern is not None:
self.pattern = pattern
else:
self.pattern = Pattern()
if ports is not None:
if self.pattern.ports:
raise BuildError('Ports supplied for pattern with pre-existing ports!')
if isinstance(ports, str):
if library is None:
raise BuildError('Ports given as a string, but `library` was `None`!')
ports = library.abstract(ports).ports
self.pattern.ports.update(copy.deepcopy(dict(ports)))
if name is not None:
if library is None:
raise BuildError('Name was supplied, but no library was given!')
library[name] = self.pattern
if tools is None:
self.tools = {}
elif isinstance(tools, Tool):
self.tools = {None: tools}
else:
self.tools = dict(tools)
@classmethod
def interface(
cls: type['RenderPather'],
source: PortList | Mapping[str, Port] | str,
*,
library: ILibrary | None = None,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
in_prefix: str = 'in_',
out_prefix: str = '',
port_map: dict[str, str] | Sequence[str] | None = None,
name: str | None = None,
) -> 'RenderPather':
"""
Wrapper for `Pattern.interface()`, which returns a RenderPather instead.
Args:
source: A collection of ports (e.g. Pattern, Builder, or dict)
from which to create the interface. May be a pattern name if
`library` is provided.
library: Library from which existing patterns should be referenced,
and to which the new one should be added (if named). If not provided,
`source.library` must exist and will be used.
tools: `Tool`s which will be used by the pather for generating new wires
or waveguides (via `path`/`path_to`/`mpath`).
in_prefix: Prepended to port names for newly-created ports with
reversed directions compared to the current device.
out_prefix: Prepended to port names for ports which are directly
copied from the current device.
port_map: Specification for ports to copy into the new device:
- If `None`, all ports are copied.
- If a sequence, only the listed ports are copied
- If a mapping, the listed ports (keys) are copied and
renamed (to the values).
Returns:
The new `RenderPather`, with an empty pattern and 2x as many ports as
listed in port_map.
Raises:
`PortError` if `port_map` contains port names not present in the
current device.
`PortError` if applying the prefixes results in duplicate port
names.
"""
if library is None:
if hasattr(source, 'library') and isinstance(source.library, ILibrary):
library = source.library
else:
raise BuildError('No library was provided, and none is present in `source.library`')
if tools is None and hasattr(source, 'tools') and isinstance(source.tools, dict):
tools = source.tools
if isinstance(source, str):
source = library.abstract(source).ports
pat = Pattern.interface(source, in_prefix=in_prefix, out_prefix=out_prefix, port_map=port_map)
new = RenderPather(library=library, pattern=pat, name=name, tools=tools)
return new
def plug(
self,
other: Abstract | str,
map_in: dict[str, str],
map_out: dict[str, str | None] | None = None,
*,
mirrored: bool = False,
inherit_name: bool = True,
set_rotation: bool | None = None,
append: bool = False,
) -> Self:
"""
Wrapper for `Pattern.plug` which adds a `RenderStep` with opcode 'P'
for any affected ports. This separates any future `RenderStep`s on the
same port into a new batch, since the plugged device interferes with drawing.
Args:
other: An `Abstract`, string, or `Pattern` describing the device to be instantiated.
map_in: dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in `other`.
mirrored: Enables mirroring `other` across the x axis prior to
connecting any ports.
inherit_name: If `True`, and `map_in` specifies only a single port,
and `map_out` is `None`, and `other` has only two ports total,
then automatically renames the output port of `other` to the
name of the port from `self` that appears in `map_in`. This
makes it easy to extend a device with simple 2-port devices
(e.g. wires) without providing `map_out` each time `plug` is
called. See "Examples" above for more info. Default `True`.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
Returns:
self
Raises:
`PortError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other`'s ports.
`PortError` if there are any duplicate names after `map_in` and `map_out`
are applied.
`PortError` if the specified port mapping is not achievable (the ports
do not line up)
"""
if self._dead:
logger.error('Skipping plug() since device is dead')
return self
other_tgt: Pattern | Abstract
if isinstance(other, str):
other = self.library.abstract(other)
if append and isinstance(other, Abstract):
other_tgt = self.library[other.name]
else:
other_tgt = other
# get rid of plugged ports
for kk in map_in:
if kk in self.paths:
self.paths[kk].append(RenderStep('P', None, self.ports[kk].copy(), self.ports[kk].copy(), None))
plugged = map_in.values()
for name, port in other_tgt.ports.items():
if name in plugged:
continue
new_name = map_out.get(name, name) if map_out is not None else name
if new_name is not None and new_name in self.paths:
self.paths[new_name].append(RenderStep('P', None, port.copy(), port.copy(), None))
self.pattern.plug(
other=other_tgt,
map_in=map_in,
map_out=map_out,
mirrored=mirrored,
inherit_name=inherit_name,
set_rotation=set_rotation,
append=append,
)
return self
def place(
self,
other: Abstract | str,
*,
offset: ArrayLike = (0, 0),
rotation: float = 0,
pivot: ArrayLike = (0, 0),
mirrored: bool = False,
port_map: dict[str, str | None] | None = None,
skip_port_check: bool = False,
append: bool = False,
) -> Self:
"""
Wrapper for `Pattern.place` which adds a `RenderStep` with opcode 'P'
for any affected ports. This separates any future `RenderStep`s on the
same port into a new batch, since the placed device interferes with drawing.
Note that mirroring is applied before rotation; translation (`offset`) is applied last.
Args:
other: An `Abstract` or `Pattern` describing the device to be instantiated.
offset: Offset at which to place the instance. Default (0, 0).
rotation: Rotation applied to the instance before placement. Default 0.
pivot: Rotation is applied around this pivot point (default (0, 0)).
Rotation is applied prior to translation (`offset`).
mirrored: Whether the instance should be mirrored across the x axis.
Mirroring is applied before translation and rotation.
port_map: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in the instantiated pattern. New names can be
`None`, which will delete those ports.
skip_port_check: Can be used to skip the internal call to `check_ports`,
in case it has already been performed elsewhere.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
Returns:
self
Raises:
`PortError` if any ports specified in `port_map` do not
exist in `other.ports`.
`PortError` if there are any duplicate names after `port_map`
is applied.
"""
if self._dead:
logger.error('Skipping place() since device is dead')
return self
other_tgt: Pattern | Abstract
if isinstance(other, str):
other = self.library.abstract(other)
if append and isinstance(other, Abstract):
other_tgt = self.library[other.name]
else:
other_tgt = other
for name, port in other_tgt.ports.items():
new_name = port_map.get(name, name) if port_map is not None else name
if new_name is not None and new_name in self.paths:
self.paths[new_name].append(RenderStep('P', None, port.copy(), port.copy(), None))
self.pattern.place(
other=other_tgt,
offset=offset,
rotation=rotation,
pivot=pivot,
mirrored=mirrored,
port_map=port_map,
skip_port_check=skip_port_check,
append=append,
)
return self
def retool(
self,
tool: Tool,
keys: str | Sequence[str | None] | None = None,
) -> Self:
"""
Update the `Tool` which will be used when generating `Pattern`s for the ports
given by `keys`.
Args:
tool: The new `Tool` to use for the given ports.
keys: Which ports the tool should apply to. `None` indicates the default tool,
used when there is no matching entry in `self.tools` for the port in question.
Returns:
self
"""
if keys is None or isinstance(keys, str):
self.tools[keys] = tool
else:
for key in keys:
self.tools[key] = tool
return self
def path(
self,
portspec: str,
ccw: SupportsBool | None,
length: float,
**kwargs,
) -> Self:
"""
Plan a "wire"/"waveguide" extending from the port `portspec`, with the aim
of traveling exactly `length` distance.
The wire will travel `length` distance along the port's axis, and an unspecified
(tool-dependent) distance in the perpendicular direction. The output port will
be rotated (or not) based on the `ccw` parameter.
`RenderPather.render` must be called after all paths have been fully planned.
Args:
portspec: The name of the port into which the wire will be plugged.
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
length: The total distance from input to output, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
Returns:
self
Raises:
BuildError if `length` is too small to fit the bend (if a bend is present).
LibraryError if no valid name could be picked for the pattern.
"""
if self._dead:
logger.error('Skipping path() since device is dead')
return self
port = self.pattern[portspec]
in_ptype = port.ptype
port_rot = port.rotation
assert port_rot is not None # TODO allow manually setting rotation for RenderPather.path()?
tool = self.tools.get(portspec, self.tools[None])
# ask the tool for bend size (fill missing dx or dy), check feasibility, and get out_ptype
out_port, data = tool.planL(ccw, length, in_ptype=in_ptype, **kwargs)
# Update port
out_port.rotate_around((0, 0), pi + port_rot)
out_port.translate(port.offset)
step = RenderStep('L', tool, port.copy(), out_port.copy(), data)
self.paths[portspec].append(step)
self.pattern.ports[portspec] = out_port.copy()
return self
def path_to(
self,
portspec: str,
ccw: SupportsBool | None,
position: float | None = None,
*,
x: float | None = None,
y: float | None = None,
**kwargs,
) -> Self:
"""
Plan a "wire"/"waveguide" extending from the port `portspec`, with the aim
of ending exactly at a target position.
The wire will travel so that the output port will be placed at exactly the target
position along the input port's axis. There can be an unspecified (tool-dependent)
offset in the perpendicular direction. The output port will be rotated (or not)
based on the `ccw` parameter.
`RenderPather.render` must be called after all paths have been fully planned.
Args:
portspec: The name of the port into which the wire will be plugged.
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
position: The final port position, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
Only one of `position`, `x`, and `y` may be specified.
x: The final port position along the x axis.
`portspec` must refer to a horizontal port if `x` is passed, otherwise a
BuildError will be raised.
y: The final port position along the y axis.
`portspec` must refer to a vertical port if `y` is passed, otherwise a
BuildError will be raised.
Returns:
self
Raises:
BuildError if `position`, `x`, or `y` is too close to fit the bend (if a bend
is present).
BuildError if `x` or `y` is specified but does not match the axis of `portspec`.
BuildError if more than one of `x`, `y`, and `position` is specified.
"""
if self._dead:
logger.error('Skipping path_to() since device is dead')
return self
pos_count = sum(vv is not None for vv in (position, x, y))
if pos_count > 1:
raise BuildError('Only one of `position`, `x`, and `y` may be specified at once')
if pos_count < 1:
raise BuildError('One of `position`, `x`, and `y` must be specified')
port = self.pattern[portspec]
if port.rotation is None:
raise PortError(f'Port {portspec} has no rotation and cannot be used for path_to()')
if not numpy.isclose(port.rotation % (pi / 2), 0):
raise BuildError('path_to was asked to route from a non-manhattan port')
is_horizontal = numpy.isclose(port.rotation % pi, 0)
if is_horizontal:
if y is not None:
raise BuildError('Asked to path to y-coordinate, but port is horizontal')
if position is None:
position = x
else:
if x is not None:
raise BuildError('Asked to path to x-coordinate, but port is vertical')
if position is None:
position = y
x0, y0 = port.offset
if is_horizontal:
if numpy.sign(numpy.cos(port.rotation)) == numpy.sign(position - x0):
raise BuildError(f'path_to routing to behind source port: x0={x0:g} to {position:g}')
length = numpy.abs(position - x0)
else:
if numpy.sign(numpy.sin(port.rotation)) == numpy.sign(position - y0):
raise BuildError(f'path_to routing to behind source port: y0={y0:g} to {position:g}')
length = numpy.abs(position - y0)
return self.path(portspec, ccw, length, **kwargs)
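The coordinate handling above reduces to a small amount of arithmetic: pick the axis from the port rotation, refuse targets behind the port, and convert the target coordinate into a travel length. A standalone sketch (hypothetical helper mirroring the checks in `path_to`):

```python
from math import pi, cos, sin, isclose

def sign(v):
    return (v > 0) - (v < 0)

def travel_length(offset, rotation, position):
    """`offset` is the port's (x, y); `rotation` its facing in radians;
    `position` the target coordinate along the port's axis."""
    if not isclose(rotation % (pi / 2), 0.0, abs_tol=1e-12):
        raise ValueError('non-manhattan port')
    x0, y0 = offset
    horizontal = isclose(rotation % pi, 0.0, abs_tol=1e-12)
    src = x0 if horizontal else y0
    facing = cos(rotation) if horizontal else sin(rotation)
    if sign(facing) == sign(position - src):
        raise ValueError('target is behind the source port')
    return abs(position - src)
```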
def mpath(
self,
portspec: str | Sequence[str],
ccw: SupportsBool | None,
*,
spacing: float | ArrayLike | None = None,
set_rotation: float | None = None,
**kwargs,
) -> Self:
"""
`mpath` is a superset of `path` and `path_to` which can act on bundles or buses
of "wires" or "waveguides".
See `Pather.mpath` for details.
Args:
portspec: The names of the ports which are to be routed.
ccw: If `None`, the outputs should be along the same axis as the inputs.
Otherwise, cast to bool and turn 90 degrees counterclockwise if `True`
and clockwise otherwise.
spacing: Center-to-center distance between output ports along the input port's axis.
Must be provided if (and only if) `ccw` is not `None`.
set_rotation: If the provided ports have `rotation=None`, this can be used
to set a rotation for them.
Returns:
self
Raises:
BuildError if the implied length for any wire is too close to fit the bend
(if a bend is requested).
BuildError if `xmin`/`xmax` or `ymin`/`ymax` is specified but does not
match the axis of `portspec`.
BuildError if an incorrect bound type or spacing is specified.
"""
if self._dead:
logger.error('Skipping mpath() since device is dead')
return self
bound_types = set()
if 'bound_type' in kwargs:
bound_types.add(kwargs['bound_type'])
bound = kwargs['bound']
for bt in ('emin', 'emax', 'pmin', 'pmax', 'xmin', 'xmax', 'ymin', 'ymax', 'min_past_furthest'):
if bt in kwargs:
bound_types.add(bt)
bound = kwargs[bt]
if not bound_types:
raise BuildError('No bound type specified for mpath')
if len(bound_types) > 1:
raise BuildError(f'Too many bound types specified for mpath: {bound_types}')
bound_type = tuple(bound_types)[0]
if isinstance(portspec, str):
portspec = [portspec]
ports = self.pattern[tuple(portspec)]
extensions = ell(ports, ccw, spacing=spacing, bound=bound, bound_type=bound_type, set_rotation=set_rotation)
if len(ports) == 1:
# Not a bus, so having a container just adds noise to the layout
port_name = tuple(portspec)[0]
self.path(port_name, ccw, extensions[port_name])
else:
for port_name, length in extensions.items():
self.path(port_name, ccw, length)
return self
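The bound-resolution step at the top of `mpath` amounts to selecting exactly one bound keyword from the accepted spellings. A standalone sketch of that validation (assumed helper name):

```python
BOUND_NAMES = ('emin', 'emax', 'pmin', 'pmax', 'xmin', 'xmax',
               'ymin', 'ymax', 'min_past_furthest')

def pick_bound(**kwargs):
    """Return the single (bound_type, bound) pair implied by kwargs."""
    found = {}
    if 'bound_type' in kwargs:
        found[kwargs['bound_type']] = kwargs['bound']   # explicit bound_type=/bound=
    for name in BOUND_NAMES:
        if name in kwargs:
            found[name] = kwargs[name]                  # shorthand, e.g. xmax=10
    if not found:
        raise ValueError('No bound type specified')
    if len(found) > 1:
        raise ValueError(f'Too many bound types specified: {set(found)}')
    return next(iter(found.items()))
```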
def render(
self,
append: bool = True,
) -> Self:
"""
Generate the geometry which has been planned out with `path`/`path_to`/etc.
Args:
append: If `True`, the rendered geometry will be directly appended to
`self.pattern`. Note that it will not be flattened, so only one
layer of hierarchy is eliminated.
Returns:
self
"""
lib = self.library
tool_port_names = ('A', 'B')
pat = Pattern()
def render_batch(portspec: str, batch: list[RenderStep], append: bool) -> None:
assert batch[0].tool is not None
name = lib << batch[0].tool.render(batch, port_names=tool_port_names)
pat.ports[portspec] = batch[0].start_port.copy()
if append:
pat.plug(lib[name], {portspec: tool_port_names[0]}, append=append)
del lib[name] # NOTE if the rendered pattern has refs, those are now in `pat` but not flattened
else:
pat.plug(lib.abstract(name), {portspec: tool_port_names[0]}, append=append)
for portspec, steps in self.paths.items():
batch: list[RenderStep] = []
for step in steps:
appendable_op = step.opcode in ('L', 'S', 'U')
same_tool = batch and step.tool == batch[0].tool
# If we can't continue a batch, render it
if batch and (not appendable_op or not same_tool):
render_batch(portspec, batch, append)
batch = []
# batch is emptied already if we couldn't continue it
if appendable_op:
batch.append(step)
# Opcodes which break the batch go below this line
if not appendable_op and portspec in pat.ports:
del pat.ports[portspec]
# Render the final batch, if one remains
if batch:
render_batch(portspec, batch, append)
self.paths.clear()
pat.ports.clear()
self.pattern.append(pat)
return self
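The batching loop in `render` groups consecutive drawable steps that share a tool, flushing the batch whenever the tool changes or a non-drawable opcode (e.g. 'P') intervenes. Modeled on bare `(opcode, tool)` pairs (an illustrative sketch, not the real `RenderStep` handling):

```python
def batch_steps(steps, appendable=('L', 'S', 'U')):
    """Split a step sequence into runs of drawable opcodes sharing one tool."""
    batches, batch = [], []
    for opcode, tool in steps:
        drawable = opcode in appendable
        same_tool = bool(batch) and tool == batch[0][1]
        if batch and (not drawable or not same_tool):
            batches.append(batch)       # flush: tool changed or batch broken
            batch = []
        if drawable:
            batch.append((opcode, tool))
    if batch:
        batches.append(batch)           # flush the final batch
    return batches
```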
def translate(self, offset: ArrayLike) -> Self:
"""
Translate the pattern and all ports.
Args:
offset: (x, y) distance to translate by
Returns:
self
"""
self.pattern.translate_elements(offset)
return self
def rotate_around(self, pivot: ArrayLike, angle: float) -> Self:
"""
Rotate the pattern and all ports.
Args:
angle: angle (radians, counterclockwise) to rotate by
pivot: location to rotate around
Returns:
self
"""
self.pattern.rotate_around(pivot, angle)
return self
def mirror(self, axis: int) -> Self:
"""
Mirror the pattern and all ports across the specified axis.
Args:
axis: Axis to mirror across (x=0, y=1)
Returns:
self
"""
self.pattern.mirror(axis)
return self
def set_dead(self) -> Self:
"""
Disallows further changes through `plug()` or `place()`.
This is meant for debugging:
```
dev.plug(a, ...)
dev.set_dead() # added for debug purposes
dev.plug(b, ...) # usually raises an error, but now skipped
dev.plug(c, ...) # also skipped
dev.pattern.visualize() # shows the device as of the set_dead() call
```
Returns:
self
"""
self._dead = True
return self
def __repr__(self) -> str:
s = f'<Pather {self.pattern} L({len(self.library)}) {pformat(self.tools)}>'
return s

File diff suppressed because it is too large


@@ -46,7 +46,7 @@ def ell(
ccw: Turn direction. `True` means counterclockwise, `False` means clockwise,
and `None` means no bend. If `None`, spacing must remain `None` or `0` (default);
otherwise, spacing must be set to a non-`None` value.
bound_method: Method used for determining the travel distance; see diagram above.
bound_type: Method used for determining the travel distance; see diagram above.
Valid values are:
- 'min_extension' or 'emin':
The total extension value for the furthest-out port (B in the diagram).
@@ -64,7 +64,7 @@ def ell(
the x- and y- axes. If specifying a position, it is projected onto
the extension direction.
bound_value: Value associated with `bound_type`, see above.
bound: Value associated with `bound_type`, see above.
spacing: Distance between adjacent channels. Can be scalar, resulting in evenly
spaced channels, or a vector with length one less than `ports`, allowing
non-uniform spacing.
@@ -84,7 +84,7 @@ def ell(
raise BuildError('Empty port list passed to `ell()`')
if ccw is None:
if spacing is not None and not numpy.isclose(spacing, 0):
if spacing is not None and not numpy.allclose(spacing, 0):
raise BuildError('Spacing must be 0 or None when ccw=None')
spacing = 0
elif spacing is None:
@@ -106,7 +106,7 @@ def ell(
raise BuildError('Asked to find aggregation for ports that face in different directions:\n'
+ pformat(port_rotations))
else:
if set_rotation is not None:
if set_rotation is None:
raise BuildError('set_rotation must be specified if no ports have rotations!')
rotations = numpy.full_like(has_rotation, set_rotation, dtype=float)
@@ -132,8 +132,17 @@ def ell(
if spacing is None:
ch_offsets = numpy.zeros_like(y_order)
else:
spacing_arr = numpy.asarray(spacing, dtype=float).reshape(-1)
steps = numpy.zeros_like(y_order)
steps[1:] = spacing
if spacing_arr.size == 1:
steps[1:] = spacing_arr[0]
elif spacing_arr.size == len(ports) - 1:
steps[1:] = spacing_arr
else:
raise BuildError(
f'spacing must be scalar or have length {len(ports) - 1} for {len(ports)} ports; '
f'got length {spacing_arr.size}'
)
ch_offsets = numpy.cumsum(steps)[y_ind]
x_start = rot_offsets[:, 0]
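The spacing-handling hunk above turns a scalar or per-gap vector into cumulative channel offsets via `cumsum`. A minimal numpy sketch of the same idea (the real code additionally reorders the result by `y_ind`):

```python
import numpy

def channel_offsets(spacing, n_ports):
    """Turn scalar-or-vector spacing into cumulative channel offsets."""
    spacing_arr = numpy.asarray(spacing, dtype=float).reshape(-1)
    steps = numpy.zeros(n_ports)
    if spacing_arr.size == 1:
        steps[1:] = spacing_arr[0]      # uniform gap between channels
    elif spacing_arr.size == n_ports - 1:
        steps[1:] = spacing_arr         # one explicit gap per adjacent pair
    else:
        raise ValueError(f'spacing must be scalar or have length {n_ports - 1}')
    return numpy.cumsum(steps)          # real code also reorders by y_ind

assert channel_offsets(5.0, 4).tolist() == [0.0, 5.0, 10.0, 15.0]
assert channel_offsets([1.0, 2.0, 4.0], 4).tolist() == [0.0, 1.0, 3.0, 7.0]
```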


@@ -1,3 +1,10 @@
import traceback
import pathlib
MASQUE_DIR = str(pathlib.Path(__file__).parent)
class MasqueError(Exception):
"""
Parent exception for all Masque-related Exceptions
@@ -25,15 +32,64 @@ class BuildError(MasqueError):
"""
pass
class PortError(MasqueError):
"""
Exception raised by builder-related functions
Exception raised by port-related functions
"""
pass
class OneShotError(MasqueError):
"""
Exception raised when a function decorated with `@oneshot` is called more than once
"""
def __init__(self, func_name: str) -> None:
Exception.__init__(self, f'Function "{func_name}" with @oneshot was called more than once')
def format_stacktrace(
stacklevel: int = 1,
*,
skip_file_prefixes: tuple[str, ...] = (MASQUE_DIR,),
low_file_prefixes: tuple[str, ...] = ('<frozen', '<runpy', '<string>'),
low_file_suffixes: tuple[str, ...] = ('IPython/utils/py3compat.py', 'concurrent/futures/process.py'),
) -> str:
"""
Utility function for making nicer stack traces (e.g. excluding <frozen runpy> and similar)
Args:
stacklevel: Number of frames to remove from near this function (default is to
show caller but not ourselves). Similar to `warnings.warn` and `logging.warning`.
skip_file_prefixes: Indicates frames to ignore after counting stack levels; similar
to `warnings.warn` *TODO check if this is actually the same effect re:stacklevel*.
Forces stacklevel to max(2, stacklevel).
Default is to exclude anything within `masque`.
low_file_prefixes: Indicates frames to ignore on the other (entry-point) end of the stack,
based on prefixes on their filenames.
low_file_suffixes: Indicates frames to ignore on the other (entry-point) end of the stack,
based on suffixes on their filenames.
Returns:
Formatted trimmed stack trace
"""
if skip_file_prefixes:
stacklevel = max(2, stacklevel)
stack = traceback.extract_stack()
bad_inds = [ii + 1 for ii, frame in enumerate(stack)
if frame.filename.startswith(low_file_prefixes) or frame.filename.endswith(low_file_suffixes)]
first_ok = max([0] + bad_inds)
last_ok = -stacklevel - 1
while last_ok >= -len(stack) and stack[last_ok].filename.startswith(skip_file_prefixes):
last_ok -= 1
if selected := stack[first_ok:last_ok + 1]:
pass
elif selected := stack[:-stacklevel]:
pass # noqa: SIM114 # separate elif for clarity
else:
selected = stack
return ''.join(traceback.format_list(selected))
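`format_stacktrace()` trims frames from both ends of the stack based on filename prefixes and suffixes. A much-simplified standalone sketch of the core idea, using only stdlib `traceback` (names here are hypothetical, not the masque API):

```python
import traceback

def trimmed_stack(skip_prefixes=()):
    # Capture the current stack and drop frames from unwanted files,
    # similar in spirit to format_stacktrace() above (much simplified:
    # no stacklevel handling, no separate entry-point trimming).
    stack = traceback.extract_stack()[:-1]  # exclude this frame itself
    kept = [f for f in stack if not f.filename.startswith(tuple(skip_prefixes))]
    return ''.join(traceback.format_list(kept or stack))

def caller():
    return trimmed_stack(skip_prefixes=('<frozen',))

trace = caller()
assert 'caller' in trace
```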


@@ -16,7 +16,7 @@ import gzip
import numpy
import ezdxf
from ezdxf.enums import TextEntityAlignment
from ezdxf.entities import LWPolyline, Polyline, Text, Insert
from ezdxf.entities import LWPolyline, Polyline, Text, Insert, Solid, Trace
from .utils import is_gzipped, tmpfile
from .. import Pattern, Ref, PatternError, Label
@@ -55,8 +55,7 @@ def write(
tuple: (1, 2) -> '1.2'
str: '1.2' -> '1.2' (no change)
DXF does not support shape repetition (only block repeptition). Please call
library.wrap_repeated_shapes() before writing to file.
Shape repetitions are expanded into individual DXF entities.
Other functions you may want to call:
- `masque.file.oasis.check_valid_names(library.keys())` to check for invalid names
@@ -193,8 +192,37 @@ def read(
top_name, top_pat = _read_block(msp)
mlib = Library({top_name: top_pat})
blocks_by_name = {
bb.name: bb
for bb in lib.blocks
if not bb.is_any_layout
}
referenced: set[str] = set()
pending = [msp]
seen_blocks: set[str] = set()
while pending:
block = pending.pop()
block_name = getattr(block, 'name', None)
if block_name is not None and block_name in seen_blocks:
continue
if block_name is not None:
seen_blocks.add(block_name)
for element in block:
if not isinstance(element, Insert):
continue
target = element.dxfattribs().get('name')
if target is None or target in referenced:
continue
referenced.add(target)
if target in blocks_by_name:
pending.append(blocks_by_name[target])
for bb in lib.blocks:
if bb.name == '*Model_Space':
if bb.is_any_layout:
continue
if bb.name.startswith('_') and bb.name not in referenced:
continue
name, pat = _read_block(bb)
mlib[name] = pat
@@ -213,32 +241,60 @@ def _read_block(block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace) ->
if isinstance(element, LWPolyline | Polyline):
if isinstance(element, LWPolyline):
points = numpy.asarray(element.get_points())
elif isinstance(element, Polyline):
is_closed = element.closed
else:
points = numpy.asarray([pp.xyz for pp in element.points()])
is_closed = element.is_closed
attr = element.dxfattribs()
layer = attr.get('layer', DEFAULT_LAYER)
if points.shape[1] == 2:
raise PatternError('Invalid or unimplemented polygon?')
width = 0
if isinstance(element, LWPolyline):
# ezdxf 1.4+ get_points() returns (x, y, start_width, end_width, bulge)
if points.shape[1] >= 5:
if (points[:, 4] != 0).any():
raise PatternError('LWPolyline has bulge (not yet representable in masque!)')
if (points[:, 2] != points[:, 3]).any() or (points[:, 2] != points[0, 2]).any():
raise PatternError('LWPolyline has non-constant width (not yet representable in masque!)')
width = points[0, 2]
elif points.shape[1] == 3:
# width used to be in column 2
width = points[0, 2]
if points.shape[1] > 2:
if (points[0, 2] != points[:, 2]).any():
raise PatternError('PolyLine has non-constant width (not yet representable in masque!)')
if points.shape[1] == 4 and (points[:, 3] != 0).any():
raise PatternError('LWPolyLine has bulge (not yet representable in masque!)')
if width == 0:
width = attr.get('const_width', 0)
width = points[0, 2]
if width == 0:
width = attr.get('const_width', 0)
verts = points[:, :2]
if is_closed and (len(verts) < 2 or not numpy.allclose(verts[0], verts[-1])):
verts = numpy.vstack((verts, verts[0]))
shape: Path | Polygon
if width == 0 and len(points) > 2 and numpy.array_equal(points[0], points[-1]):
shape = Polygon(vertices=points[:-1, :2])
shape: Path | Polygon
if width == 0 and is_closed:
# Use Polygon if it has at least 3 unique vertices
shape_verts = verts[:-1] if len(verts) > 1 else verts
if len(shape_verts) >= 3:
shape = Polygon(vertices=shape_verts)
else:
shape = Path(width=width, vertices=points[:, :2])
shape = Path(width=width, vertices=verts)
else:
shape = Path(width=width, vertices=verts)
pat.shapes[layer].append(shape)
elif isinstance(element, Solid | Trace):
attr = element.dxfattribs()
layer = attr.get('layer', DEFAULT_LAYER)
points = numpy.array([element.get_dxf_attrib(f'vtx{i}') for i in range(4)
if element.has_dxf_attrib(f'vtx{i}')])
if len(points) >= 3:
# If vtx2 == vtx3, it's a triangle. ezdxf handles this.
if len(points) == 4 and numpy.allclose(points[2], points[3]):
verts = points[:3, :2]
# DXF Solid/Trace uses 0-1-3-2 vertex order for quadrilaterals!
elif len(points) == 4:
verts = points[[0, 1, 3, 2], :2]
else:
verts = points[:, :2]
pat.shapes[layer].append(Polygon(vertices=verts))
elif isinstance(element, Text):
args = dict(
offset=numpy.asarray(element.get_placement()[1])[:2],
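The vertex reordering in the Solid/Trace branch above matters because DXF SOLID/TRACE quads are stored in a "bowtie" order: walking vtx0→vtx1→vtx2→vtx3 crosses itself, while 0-1-3-2 traces the perimeter. A quick numpy check (the shoelace area of the crossed traversal is zero):

```python
import numpy

def signed_area(verts):
    # Shoelace formula; zero for a self-intersecting "bowtie" traversal
    xx, yy = verts[:, 0], verts[:, 1]
    return 0.5 * numpy.sum(xx * numpy.roll(yy, -1) - numpy.roll(xx, -1) * yy)

dxf_quad = numpy.array([
    [0.0, 0.0],  # vtx0
    [1.0, 0.0],  # vtx1
    [0.0, 1.0],  # vtx2
    [1.0, 1.0],  # vtx3
])
assert numpy.isclose(signed_area(dxf_quad), 0.0)                # 0-1-2-3 self-intersects
assert numpy.isclose(signed_area(dxf_quad[[0, 1, 3, 2]]), 1.0)  # perimeter order
```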
@@ -273,12 +329,57 @@ def _read_block(block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace) ->
)
if 'column_count' in attr:
args['repetition'] = Grid(
a_vector=(attr['column_spacing'], 0),
b_vector=(0, attr['row_spacing']),
a_count=attr['column_count'],
b_count=attr['row_count'],
col_spacing = attr['column_spacing']
row_spacing = attr['row_spacing']
col_count = attr['column_count']
row_count = attr['row_count']
local_x = numpy.array((col_spacing, 0.0))
local_y = numpy.array((0.0, row_spacing))
inv_rot = rotation_matrix_2d(-rotation)
candidates = (
(inv_rot @ local_x, inv_rot @ local_y, col_count, row_count),
(inv_rot @ local_y, inv_rot @ local_x, row_count, col_count),
)
repetition = None
for a_vector, b_vector, a_count, b_count in candidates:
rotated_a = rotation_matrix_2d(rotation) @ a_vector
rotated_b = rotation_matrix_2d(rotation) @ b_vector
if (numpy.isclose(rotated_a[1], 0, atol=1e-8)
and numpy.isclose(rotated_b[0], 0, atol=1e-8)
and numpy.isclose(rotated_a[0], col_spacing, atol=1e-8)
and numpy.isclose(rotated_b[1], row_spacing, atol=1e-8)
and a_count == col_count
and b_count == row_count):
repetition = Grid(
a_vector=a_vector,
b_vector=b_vector,
a_count=a_count,
b_count=b_count,
)
break
if (numpy.isclose(rotated_a[0], 0, atol=1e-8)
and numpy.isclose(rotated_b[1], 0, atol=1e-8)
and numpy.isclose(rotated_b[0], col_spacing, atol=1e-8)
and numpy.isclose(rotated_a[1], row_spacing, atol=1e-8)
and b_count == col_count
and a_count == row_count):
repetition = Grid(
a_vector=a_vector,
b_vector=b_vector,
a_count=a_count,
b_count=b_count,
)
break
if repetition is None:
repetition = Grid(
a_vector=inv_rot @ local_x,
b_vector=inv_rot @ local_y,
a_count=col_count,
b_count=row_count,
)
args['repetition'] = repetition
pat.ref(**args)
else:
logger.warning(f'Ignoring DXF element {element.dxftype()} (not implemented).')
@@ -303,15 +404,23 @@ def _mrefs_to_drefs(
elif isinstance(rep, Grid):
a = rep.a_vector
b = rep.b_vector if rep.b_vector is not None else numpy.zeros(2)
rotated_a = rotation_matrix_2d(-ref.rotation) @ a
rotated_b = rotation_matrix_2d(-ref.rotation) @ b
if rotated_a[1] == 0 and rotated_b[0] == 0:
# In masque, the grid basis vectors are NOT rotated by the reference's rotation.
# In DXF, the grid basis vectors are [column_spacing, 0] and [0, row_spacing],
# which ARE then rotated by the block reference's rotation.
# Therefore, we can only use a DXF array if ref.rotation is 0 (or a multiple of 90)
# AND the grid is already manhattan.
# Rotate basis vectors by the reference rotation to see where they end up in the DXF frame
rotated_a = rotation_matrix_2d(ref.rotation) @ a
rotated_b = rotation_matrix_2d(ref.rotation) @ b
if numpy.isclose(rotated_a[1], 0, atol=1e-8) and numpy.isclose(rotated_b[0], 0, atol=1e-8):
attribs['column_count'] = rep.a_count
attribs['row_count'] = rep.b_count
attribs['column_spacing'] = rotated_a[0]
attribs['row_spacing'] = rotated_b[1]
block.add_blockref(encoded_name, ref.offset, dxfattribs=attribs)
elif rotated_a[0] == 0 and rotated_b[1] == 0:
elif numpy.isclose(rotated_a[0], 0, atol=1e-8) and numpy.isclose(rotated_b[1], 0, atol=1e-8):
attribs['column_count'] = rep.b_count
attribs['row_count'] = rep.a_count
attribs['column_spacing'] = rotated_b[0]
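The comment block above is the crux: masque `Grid` basis vectors are not rotated by the reference, while a DXF array's spacings are rotated by the INSERT's rotation, so a DXF array is only usable when the rotated basis lands on a manhattan grid. A standalone sketch of that check (`rotation_matrix_2d` and `dxf_array_attribs` are redefined locally here for illustration; the real code uses masque's own helper):

```python
import numpy

def rotation_matrix_2d(theta):
    # Plain 2D rotation; stands in for the helper used in the diff above.
    return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
                        [numpy.sin(theta),  numpy.cos(theta)]])

def dxf_array_attribs(a, b, rotation, atol=1e-8):
    # A DXF INSERT array can only represent a grid that is manhattan
    # *after* the reference rotation is applied to the basis vectors.
    ra = rotation_matrix_2d(rotation) @ numpy.asarray(a, dtype=float)
    rb = rotation_matrix_2d(rotation) @ numpy.asarray(b, dtype=float)
    if numpy.isclose(ra[1], 0, atol=atol) and numpy.isclose(rb[0], 0, atol=atol):
        return {'column_spacing': ra[0], 'row_spacing': rb[1]}
    return None  # fall back to expanding the repetition manually

# A grid rotated by -90 degrees becomes manhattan again under a +90 degree reference
attribs = dxf_array_attribs(a=(0, -10), b=(5, 0), rotation=numpy.pi / 2)
assert attribs is not None
assert numpy.isclose(attribs['column_spacing'], 10)
assert numpy.isclose(attribs['row_spacing'], 5)
```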
@@ -344,16 +453,23 @@ def _shapes_to_elements(
for layer, sseq in shapes.items():
attribs = dict(layer=_mlayer2dxf(layer))
for shape in sseq:
displacements = [numpy.zeros(2)]
if shape.repetition is not None:
raise PatternError(
'Shape repetitions are not supported by DXF.'
' Please call library.wrap_repeated_shapes() before writing to file.'
)
displacements = shape.repetition.displacements
for polygon in shape.to_polygons():
xy_open = polygon.vertices + polygon.offset
xy_closed = numpy.vstack((xy_open, xy_open[0, :]))
block.add_lwpolyline(xy_closed, dxfattribs=attribs)
for dd in displacements:
if isinstance(shape, Path):
# Preserve the path as a polyline entity rather than converting to polygons.
# Note: DXF paths don't support endcaps well, so this is still a bit limited.
xy = shape.vertices + dd
attribs_path = {**attribs}
if shape.width > 0:
attribs_path['const_width'] = shape.width
block.add_lwpolyline(xy, dxfattribs=attribs_path)
else:
for polygon in shape.to_polygons():
xy_open = polygon.vertices + dd
block.add_lwpolyline(xy_open, close=True, dxfattribs=attribs)
def _labels_to_texts(
@@ -363,11 +479,17 @@ def _labels_to_texts(
for layer, lseq in labels.items():
attribs = dict(layer=_mlayer2dxf(layer))
for label in lseq:
xy = label.offset
block.add_text(
label.string,
dxfattribs=attribs
).set_placement(xy, align=TextEntityAlignment.BOTTOM_LEFT)
if label.repetition is None:
block.add_text(
label.string,
dxfattribs=attribs
).set_placement(label.offset, align=TextEntityAlignment.BOTTOM_LEFT)
else:
for dd in label.repetition.displacements:
block.add_text(
label.string,
dxfattribs=attribs
).set_placement(label.offset + dd, align=TextEntityAlignment.BOTTOM_LEFT)
def _mlayer2dxf(layer: layer_t) -> str:
@@ -376,5 +498,5 @@ def _mlayer2dxf(layer: layer_t) -> str:
if isinstance(layer, int):
return str(layer)
if isinstance(layer, tuple):
return f'{layer[0]}.{layer[1]}'
return f'{layer[0]:d}.{layer[1]:d}'
raise PatternError(f'Unknown layer type: {layer} ({type(layer)})')


@@ -21,6 +21,7 @@ Notes:
"""
from typing import IO, cast, Any
from collections.abc import Iterable, Mapping, Callable
from types import MappingProxyType
import io
import mmap
import logging
@@ -36,7 +37,7 @@ from klamath import records
from .utils import is_gzipped, tmpfile
from .. import Pattern, Ref, PatternError, LibraryError, Label, Shape
from ..shapes import Polygon, Path
from ..shapes import Polygon, Path, RectCollection
from ..repetition import Grid
from ..utils import layer_t, annotations_t
from ..library import LazyLibrary, Library, ILibrary, ILibraryView
@@ -52,6 +53,8 @@ path_cap_map = {
4: Path.Cap.SquareCustom,
}
RO_EMPTY_DICT: Mapping[int, bytes] = MappingProxyType({})
def rint_cast(val: ArrayLike) -> NDArray[numpy.int32]:
return numpy.rint(val).astype(numpy.int32)
@@ -79,7 +82,7 @@ def write(
datatype is chosen to be `shape.layer[1]` if available,
otherwise `0`
GDS does not support shape repetition (only cell repeptition). Please call
GDS does not support shape repetition (only cell repetition). Please call
`library.wrap_repeated_shapes()` before writing to file.
Other functions you may want to call:
@@ -320,26 +323,40 @@ def _gpath_to_mpath(gpath: klamath.library.Path, raw_mode: bool) -> tuple[layer_
else:
raise PatternError(f'Unrecognized path type: {gpath.path_type}')
mpath = Path(
vertices=gpath.xy.astype(float),
width=gpath.width,
cap=cap,
offset=numpy.zeros(2),
annotations=_properties_to_annotations(gpath.properties),
raw=raw_mode,
)
vertices = gpath.xy.astype(float)
annotations = _properties_to_annotations(gpath.properties)
cap_extensions = None
if cap == Path.Cap.SquareCustom:
mpath.cap_extensions = gpath.extension
cap_extensions = numpy.asarray(gpath.extension, dtype=float)
if raw_mode:
mpath = Path._from_raw(
vertices=vertices,
width=gpath.width,
cap=cap,
cap_extensions=cap_extensions,
annotations=annotations,
)
else:
mpath = Path(
vertices=vertices,
width=gpath.width,
cap=cap,
cap_extensions=cap_extensions,
offset=numpy.zeros(2),
annotations=annotations,
)
return gpath.layer, mpath
def _boundary_to_polygon(boundary: klamath.library.Boundary, raw_mode: bool) -> tuple[layer_t, Polygon]:
return boundary.layer, Polygon(
vertices=boundary.xy[:-1].astype(float),
offset=numpy.zeros(2),
annotations=_properties_to_annotations(boundary.properties),
raw=raw_mode,
)
vertices = boundary.xy[:-1].astype(float)
annotations = _properties_to_annotations(boundary.properties)
if raw_mode:
poly = Polygon._from_raw(vertices=vertices, annotations=annotations)
else:
poly = Polygon(vertices=vertices, offset=numpy.zeros(2), annotations=annotations)
return boundary.layer, poly
def _mrefs_to_grefs(refs: dict[str | None, list[Ref]]) -> list[klamath.library.Reference]:
@@ -399,11 +416,15 @@ def _mrefs_to_grefs(refs: dict[str | None, list[Ref]]) -> list[klamath.library.R
return grefs
def _properties_to_annotations(properties: dict[int, bytes]) -> annotations_t:
def _properties_to_annotations(properties: Mapping[int, bytes]) -> annotations_t:
if not properties:
return None
return {str(k): [v.decode()] for k, v in properties.items()}
def _annotations_to_properties(annotations: annotations_t, max_len: int = 126) -> dict[int, bytes]:
def _annotations_to_properties(annotations: annotations_t, max_len: int = 126) -> Mapping[int, bytes]:
if annotations is None:
return RO_EMPTY_DICT
cum_len = 0
props = {}
for key, vals in annotations.items():
@@ -411,8 +432,8 @@ def _annotations_to_properties(annotations: annotations_t, max_len: int = 126) -
i = int(key)
except ValueError as err:
raise PatternError(f'Annotation key {key} is not convertible to an integer') from err
if not (0 < i < 126):
raise PatternError(f'Annotation key {key} converts to {i} (must be in the range [1,125])')
if not (0 < i <= 126):
raise PatternError(f'Annotation key {key} converts to {i} (must be in the range [1,126])')
val_strings = ' '.join(str(val) for val in vals)
b = val_strings.encode()
@@ -446,7 +467,7 @@ def _shapes_to_elements(
extension: tuple[int, int]
if shape.cap == Path.Cap.SquareCustom and shape.cap_extensions is not None:
extension = tuple(shape.cap_extensions) # type: ignore
extension = tuple(rint_cast(shape.cap_extensions))
else:
extension = (0, 0)
@@ -459,6 +480,20 @@
properties=properties,
)
elements.append(path)
elif isinstance(shape, RectCollection):
for rect in shape.rects:
xy_closed = numpy.empty((5, 2), dtype=numpy.int32)
xy_closed[0] = rint_cast((rect[0], rect[1]))
xy_closed[1] = rint_cast((rect[0], rect[3]))
xy_closed[2] = rint_cast((rect[2], rect[3]))
xy_closed[3] = rint_cast((rect[2], rect[1]))
xy_closed[4] = xy_closed[0]
boundary = klamath.elements.Boundary(
layer=(layer, data_type),
xy=xy_closed,
properties=properties,
)
elements.append(boundary)
elif isinstance(shape, Polygon):
polygon = shape
xy_closed = numpy.empty((polygon.vertices.shape[0] + 1, 2), dtype=numpy.int32)
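The new `RectCollection` branch above emits each `(x0, y0, x1, y1)` rect as a closed five-point GDS boundary (first vertex repeated at the end). A standalone sketch of that construction:

```python
import numpy

def rect_to_closed_xy(rect):
    # rect is (x0, y0, x1, y1); GDS boundaries repeat the first vertex
    # at the end to close the loop.
    x0, y0, x1, y1 = rect
    return numpy.array([
        [x0, y0],
        [x0, y1],
        [x1, y1],
        [x1, y0],
        [x0, y0],   # closing vertex
    ], dtype=numpy.int32)

xy = rect_to_closed_xy((0, 0, 10, 5))
assert xy.shape == (5, 2)
assert (xy[0] == xy[-1]).all()
```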
@@ -610,7 +645,12 @@ def load_libraryfile(
stream = mmap.mmap(base_stream.fileno(), 0, access=mmap.ACCESS_READ) # type: ignore
else:
stream = path.open(mode='rb') # noqa: SIM115
return load_library(stream, full_load=full_load, postprocess=postprocess)
try:
return load_library(stream, full_load=full_load, postprocess=postprocess)
finally:
if full_load:
stream.close()
def check_valid_names(
@@ -625,6 +665,7 @@ def check_valid_names(
max_length: Max allowed length
"""
names = tuple(names)
allowed_chars = set(string.ascii_letters + string.digits + '_?$')
bad_chars = [
@@ -641,7 +682,7 @@
logger.error('Names contain invalid characters:\n' + pformat(bad_chars))
if bad_lengths:
logger.error(f'Names too long (>{max_length}:\n' + pformat(bad_chars))
logger.error(f'Names too long (>{max_length}):\n' + pformat(bad_lengths))
if bad_chars or bad_lengths:
raise LibraryError('Library contains invalid names, see log above')
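`check_valid_names()` enforces the GDS structure-name alphabet (letters, digits, `_?$`) and a maximum length. A standalone sketch of the same two checks (function name hypothetical):

```python
import string

# GDS structure names may use letters, digits, and '_?$'
ALLOWED = set(string.ascii_letters + string.digits + '_?$')

def invalid_names(names, max_length=32):
    """Return (names with bad characters, names that are too long)."""
    bad_chars = [n for n in names if not set(n).issubset(ALLOWED)]
    bad_lengths = [n for n in names if len(n) > max_length]
    return bad_chars, bad_lengths

bad_chars, bad_lengths = invalid_names(['TOP', 'cell-1', 'x' * 40])
assert bad_chars == ['cell-1']     # '-' is not in the allowed set
assert bad_lengths == ['x' * 40]   # exceeds max_length
```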

masque/file/gdsii_arrow.py (new file, 910 lines)

@@ -0,0 +1,910 @@
# ruff: noqa: ARG001, F401
"""
GDSII file format readers and writers using the `TODO` library.
Note that GDSII references follow the same convention as `masque`,
with this order of operations:
1. Mirroring
2. Rotation
3. Scaling
4. Offset and array expansion (no mirroring/rotation/scaling applied to offsets)
Scaling, rotation, and mirroring apply to individual instances, not grid
vectors or offsets.
Notes:
* absolute positioning is not supported
* PLEX is not supported
* ELFLAGS are not supported
* GDS does not support library- or structure-level annotations
* GDS creation/modification/access times are set to 1900-01-01 for reproducibility.
* Gzip modification time is set to 0 (start of current epoch, usually 1970-01-01)
TODO writing
TODO warn on boxes, nodes
"""
from typing import IO, cast, Any
from collections.abc import Iterable, Mapping, Callable
from importlib.machinery import EXTENSION_SUFFIXES
import importlib.util
import io
import mmap
import logging
import os
import pathlib
import gzip
import string
import sys
from pprint import pformat
import numpy
from numpy.typing import ArrayLike, NDArray
from numpy.testing import assert_equal
import pyarrow
from pyarrow.cffi import ffi
from .utils import is_gzipped, tmpfile
from .. import Pattern, Ref, PatternError, LibraryError, Label, Shape
from ..shapes import Polygon, Path, PolyCollection, RectCollection
from ..repetition import Grid
from ..utils import layer_t, annotations_t
from ..library import LazyLibrary, Library, ILibrary, ILibraryView
logger = logging.getLogger(__name__)
ffi.cdef(
"""
void read_path(char* path, struct ArrowArray* array, struct ArrowSchema* schema);
void scan_bytes(uint8_t* data, size_t size, struct ArrowArray* array, struct ArrowSchema* schema);
void read_cells_bytes(
uint8_t* data,
size_t size,
uint64_t* ranges,
size_t range_count,
struct ArrowArray* array,
struct ArrowSchema* schema
);
"""
)
clib: Any | None = None
ZERO_OFFSET = numpy.zeros(2)
path_cap_map = {
0: Path.Cap.Flush,
1: Path.Cap.Circle,
2: Path.Cap.Square,
4: Path.Cap.SquareCustom,
}
def rint_cast(val: ArrayLike) -> NDArray[numpy.int32]:
return numpy.rint(val).astype(numpy.int32)
def _packed_layer_u32_to_pairs(values: NDArray[numpy.unsignedinteger[Any]]) -> NDArray[numpy.int16]:
layer = (values >> numpy.uint32(16)).astype(numpy.uint16).view(numpy.int16)
dtype = (values & numpy.uint32(0xffff)).astype(numpy.uint16).view(numpy.int16)
return numpy.stack((layer, dtype), axis=-1)
def _packed_counts_u32_to_pairs(values: NDArray[numpy.unsignedinteger[Any]]) -> NDArray[numpy.int64]:
a_count = (values >> numpy.uint32(16)).astype(numpy.uint16).astype(numpy.int64)
b_count = (values & numpy.uint32(0xffff)).astype(numpy.uint16).astype(numpy.int64)
return numpy.stack((a_count, b_count), axis=-1)
def _packed_xy_u64_to_pairs(values: NDArray[numpy.unsignedinteger[Any]]) -> NDArray[numpy.int32]:
xx = (values >> numpy.uint64(32)).astype(numpy.uint32).view(numpy.int32)
yy = (values & numpy.uint64(0xffff_ffff)).astype(numpy.uint32).view(numpy.int32)
return numpy.stack((xx, yy), axis=-1)
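The `_packed_*` helpers above split two fixed-width fields out of each packed integer (e.g. layer in the high 16 bits, datatype in the low 16 bits of a u32). A round-trip sketch (`pack_layer_u32` is a hypothetical inverse, added here for illustration):

```python
import numpy

def pack_layer_u32(layer, dtype):
    # Hypothetical inverse of _packed_layer_u32_to_pairs:
    # layer in the high 16 bits, datatype in the low 16 bits.
    return (numpy.uint32(layer) << numpy.uint32(16)) | numpy.uint32(dtype)

def unpack_layer_u32(values):
    # Mirrors _packed_layer_u32_to_pairs above
    layer = (values >> numpy.uint32(16)).astype(numpy.uint16).view(numpy.int16)
    dtype = (values & numpy.uint32(0xffff)).astype(numpy.uint16).view(numpy.int16)
    return numpy.stack((layer, dtype), axis=-1)

packed = numpy.array([pack_layer_u32(5, 2), pack_layer_u32(100, 0)], dtype=numpy.uint32)
pairs = unpack_layer_u32(packed)
assert pairs.tolist() == [[5, 2], [100, 0]]
```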
def _local_library_filename() -> str:
if sys.platform.startswith('linux'):
return 'libklamath_rs_ext.so'
if sys.platform == 'darwin':
return 'libklamath_rs_ext.dylib'
if sys.platform == 'win32':
return 'klamath_rs_ext.dll'
raise OSError(f'Unsupported platform for klamath_rs_ext: {sys.platform!r}')
def _installed_library_candidates() -> list[pathlib.Path]:
candidates: list[pathlib.Path] = []
try:
spec = importlib.util.find_spec('klamath_rs_ext.klamath_rs_ext')
except ModuleNotFoundError:
spec = None
if spec is not None and spec.origin is not None:
candidates.append(pathlib.Path(spec.origin))
try:
pkg_spec = importlib.util.find_spec('klamath_rs_ext')
except ModuleNotFoundError:
pkg_spec = None
if pkg_spec is not None and pkg_spec.submodule_search_locations is not None:
for location in pkg_spec.submodule_search_locations:
pkg_dir = pathlib.Path(location)
for suffix in EXTENSION_SUFFIXES:
candidates.extend(sorted(pkg_dir.glob(f'klamath_rs_ext*{suffix}')))
return candidates
def _repo_library_candidates() -> list[pathlib.Path]:
repo_root = pathlib.Path(__file__).resolve().parents[2]
library_name = _local_library_filename()
return [
repo_root / 'klamath-rs' / 'target' / 'release' / library_name,
repo_root / 'klamath-rs' / 'target' / 'debug' / library_name,
]
def find_klamath_rs_library() -> pathlib.Path | None:
env_path = os.environ.get('KLAMATH_RS_EXT_LIB')
if env_path:
candidate = pathlib.Path(env_path).expanduser()
if candidate.exists():
return candidate.resolve()
seen: set[pathlib.Path] = set()
for candidate in _installed_library_candidates() + _repo_library_candidates():
resolved = candidate.expanduser()
if resolved in seen:
continue
seen.add(resolved)
if resolved.exists():
return resolved.resolve()
return None
def is_available() -> bool:
return find_klamath_rs_library() is not None
def _get_clib() -> Any:
global clib
if clib is None:
lib_path = find_klamath_rs_library()
if lib_path is None:
raise ImportError(
'Could not locate klamath_rs_ext shared library. '
'Build klamath-rs with `cargo build --release --manifest-path klamath-rs/Cargo.toml` '
'or set KLAMATH_RS_EXT_LIB to the built library path.'
)
clib = ffi.dlopen(str(lib_path))
return clib
def _read_annotations(
prop_offs: NDArray[numpy.integer[Any]],
prop_key: NDArray[numpy.integer[Any]],
prop_val: list[str],
ee: int,
) -> annotations_t:
prop_ii, prop_ff = prop_offs[ee], prop_offs[ee + 1]
if prop_ii >= prop_ff:
return None
return {str(prop_key[off]): [prop_val[off]] for off in range(prop_ii, prop_ff)}
def _read_to_arrow(
filename: str | pathlib.Path,
*args,
**kwargs,
) -> pyarrow.Array:
path = pathlib.Path(filename).expanduser().resolve()
ptr_array = ffi.new('struct ArrowArray[]', 1)
ptr_schema = ffi.new('struct ArrowSchema[]', 1)
_get_clib().read_path(str(path).encode(), ptr_array, ptr_schema)
return _import_arrow_array(ptr_array, ptr_schema)
def _import_arrow_array(ptr_array: Any, ptr_schema: Any) -> pyarrow.Array:
iptr_schema = int(ffi.cast('uintptr_t', ptr_schema))
iptr_array = int(ffi.cast('uintptr_t', ptr_array))
return pyarrow.Array._import_from_c(iptr_array, iptr_schema)
def _scan_buffer_to_arrow(buffer: bytes | mmap.mmap | memoryview) -> pyarrow.Array:
ptr_array = ffi.new('struct ArrowArray[]', 1)
ptr_schema = ffi.new('struct ArrowSchema[]', 1)
buf_view = memoryview(buffer)
cbuf = ffi.from_buffer('uint8_t[]', buf_view)
_get_clib().scan_bytes(cbuf, len(buf_view), ptr_array, ptr_schema)
return _import_arrow_array(ptr_array, ptr_schema)
def _read_selected_cells_to_arrow(
buffer: bytes | mmap.mmap | memoryview,
ranges: NDArray[numpy.uint64],
) -> pyarrow.Array:
ptr_array = ffi.new('struct ArrowArray[]', 1)
ptr_schema = ffi.new('struct ArrowSchema[]', 1)
buf_view = memoryview(buffer)
cbuf = ffi.from_buffer('uint8_t[]', buf_view)
flat_ranges = numpy.require(ranges, dtype=numpy.uint64, requirements=('C_CONTIGUOUS', 'ALIGNED'))
cranges = ffi.from_buffer('uint64_t[]', flat_ranges)
_get_clib().read_cells_bytes(cbuf, len(buf_view), cranges, int(flat_ranges.shape[0]), ptr_array, ptr_schema)
return _import_arrow_array(ptr_array, ptr_schema)
def readfile(
filename: str | pathlib.Path,
*args,
**kwargs,
) -> tuple[Library, dict[str, Any]]:
"""
Wrapper for `read()` that takes a filename or path instead of a stream.
Will automatically decompress gzipped files.
Args:
filename: Filename to read from.
*args: passed to `read()`
**kwargs: passed to `read()`
For callers that can consume Arrow directly, prefer `readfile_arrow()`
to skip Python `Pattern` construction entirely.
"""
arrow_arr = _read_to_arrow(filename)
assert len(arrow_arr) == 1
results = read_arrow(arrow_arr[0])
return results
def readfile_arrow(
filename: str | pathlib.Path,
) -> tuple[pyarrow.StructScalar, dict[str, Any]]:
"""
Read a GDSII file into the native Arrow representation without converting
it into `masque.Library` / `Pattern` objects.
This is the lowest-overhead public read path exposed by this module.
Args:
filename: Filename to read.
Returns:
- Arrow struct scalar for the library payload
- dict of GDSII library info
"""
arrow_arr = _read_to_arrow(filename)
assert len(arrow_arr) == 1
libarr = arrow_arr[0]
return libarr, _read_header(libarr)
def read_arrow(
libarr: pyarrow.Array,
raw_mode: bool = True,
) -> tuple[Library, dict[str, Any]]:
"""
# TODO check GDSII file for cycles!
Translate a pre-parsed Arrow representation of a GDSII library into a dict of Pattern objects.
GDSII structures are translated into Pattern objects; boundaries are translated into polygons,
and srefs and arefs are translated into Ref objects.
Additional library info is returned in a dict, containing:
'name': name of the library
'meters_per_unit': number of meters per database unit (all values are in database units)
'logical_units_per_unit': number of "logical" units displayed by layout tools (typically microns)
per database unit
Args:
libarr: Arrow struct array holding the library payload (see `readfile_arrow()`).
raw_mode: If True, constructs shapes in raw mode, bypassing most data validation. Default `True`.
Returns:
- dict of pattern_name:Patterns generated from GDSII structures
- dict of GDSII library info
"""
library_info = _read_header(libarr)
layer_names_np = _packed_layer_u32_to_pairs(libarr['layers'].values.to_numpy())
layer_tups = [(int(pair[0]), int(pair[1])) for pair in layer_names_np]
cell_ids = libarr['cells'].values.field('id').to_numpy()
cell_names = libarr['cell_names'].as_py()
def get_geom(libarr: pyarrow.Array, geom_type: str) -> dict[str, Any]:
el = libarr['cells'].values.field(geom_type)
elem = dict(
offsets = el.offsets.to_numpy(),
xy_arr = el.values.field('xy').values.to_numpy().reshape((-1, 2)),
xy_off = el.values.field('xy').offsets.to_numpy() // 2,
layer_inds = el.values.field('layer').to_numpy(),
prop_off = el.values.field('properties').offsets.to_numpy(),
prop_key = el.values.field('properties').values.field('key').to_numpy(),
prop_val = el.values.field('properties').values.field('value').to_pylist(),
)
return elem
def get_boundary_batches(libarr: pyarrow.Array) -> dict[str, Any]:
batches = libarr['cells'].values.field('boundary_batches')
return dict(
offsets = batches.offsets.to_numpy(),
layer_inds = batches.values.field('layer').to_numpy(),
vert_arr = batches.values.field('vertices').values.to_numpy().reshape((-1, 2)),
vert_off = batches.values.field('vertices').offsets.to_numpy() // 2,
poly_off = batches.values.field('vertex_offsets').offsets.to_numpy(),
poly_offsets = batches.values.field('vertex_offsets').values.to_numpy(),
)
def get_rect_batches(libarr: pyarrow.Array) -> dict[str, Any]:
batches = libarr['cells'].values.field('rect_batches')
return dict(
offsets = batches.offsets.to_numpy(),
layer_inds = batches.values.field('layer').to_numpy(),
rect_arr = batches.values.field('rects').values.to_numpy().reshape((-1, 4)),
rect_off = batches.values.field('rects').offsets.to_numpy() // 4,
)
def get_boundary_props(libarr: pyarrow.Array) -> dict[str, Any]:
boundaries = libarr['cells'].values.field('boundary_props')
return dict(
offsets = boundaries.offsets.to_numpy(),
layer_inds = boundaries.values.field('layer').to_numpy(),
vert_arr = boundaries.values.field('vertices').values.to_numpy().reshape((-1, 2)),
vert_off = boundaries.values.field('vertices').offsets.to_numpy() // 2,
prop_off = boundaries.values.field('properties').offsets.to_numpy(),
prop_key = boundaries.values.field('properties').values.field('key').to_numpy(),
prop_val = boundaries.values.field('properties').values.field('value').to_pylist(),
)
def get_refs(libarr: pyarrow.Array, geom_type: str, has_repetition: bool) -> dict[str, Any]:
refs = libarr['cells'].values.field(geom_type)
values = refs.values
elem = dict(
offsets = refs.offsets.to_numpy(),
targets = values.field('target').to_numpy(),
xy = _packed_xy_u64_to_pairs(values.field('xy').to_numpy()),
invert_y = values.field('invert_y').to_numpy(zero_copy_only=False),
angle_rad = values.field('angle_rad').to_numpy(),
scale = values.field('scale').to_numpy(),
)
if has_repetition:
elem.update(dict(
xy0 = _packed_xy_u64_to_pairs(values.field('xy0').to_numpy()),
xy1 = _packed_xy_u64_to_pairs(values.field('xy1').to_numpy()),
counts = _packed_counts_u32_to_pairs(values.field('counts').to_numpy()),
))
return elem
def get_ref_props(libarr: pyarrow.Array, geom_type: str, has_repetition: bool) -> dict[str, Any]:
refs = libarr['cells'].values.field(geom_type)
values = refs.values
elem = dict(
offsets = refs.offsets.to_numpy(),
targets = values.field('target').to_numpy(),
xy = _packed_xy_u64_to_pairs(values.field('xy').to_numpy()),
invert_y = values.field('invert_y').to_numpy(zero_copy_only=False),
angle_rad = values.field('angle_rad').to_numpy(),
scale = values.field('scale').to_numpy(),
prop_off = values.field('properties').offsets.to_numpy(),
prop_key = values.field('properties').values.field('key').to_numpy(),
prop_val = values.field('properties').values.field('value').to_pylist(),
)
if has_repetition:
elem.update(dict(
xy0 = _packed_xy_u64_to_pairs(values.field('xy0').to_numpy()),
xy1 = _packed_xy_u64_to_pairs(values.field('xy1').to_numpy()),
counts = _packed_counts_u32_to_pairs(values.field('counts').to_numpy()),
))
return elem
txt = libarr['cells'].values.field('texts')
texts = dict(
offsets = txt.offsets.to_numpy(),
layer_inds = txt.values.field('layer').to_numpy(),
xy = _packed_xy_u64_to_pairs(txt.values.field('xy').to_numpy()),
string = txt.values.field('string').to_pylist(),
prop_off = txt.values.field('properties').offsets.to_numpy(),
prop_key = txt.values.field('properties').values.field('key').to_numpy(),
prop_val = txt.values.field('properties').values.field('value').to_pylist(),
)
elements = dict(
srefs = get_refs(libarr, 'srefs', has_repetition=False),
arefs = get_refs(libarr, 'arefs', has_repetition=True),
sref_props = get_ref_props(libarr, 'sref_props', has_repetition=False),
aref_props = get_ref_props(libarr, 'aref_props', has_repetition=True),
rect_batches = get_rect_batches(libarr),
boundary_batches = get_boundary_batches(libarr),
boundary_props = get_boundary_props(libarr),
paths = get_geom(libarr, 'paths'),
texts = texts,
)
paths = libarr['cells'].values.field('paths')
elements['paths'].update(dict(
width = paths.values.field('width').fill_null(0).to_numpy(),
path_type = paths.values.field('path_type').fill_null(0).to_numpy(),
extensions = numpy.stack((
paths.values.field('extension_start').fill_null(0).to_numpy(),
paths.values.field('extension_end').fill_null(0).to_numpy(),
), axis=-1),
))
global_args = dict(
cell_names = cell_names,
layer_tups = layer_tups,
raw_mode = raw_mode,
)
mlib = Library()
for cc in range(len(libarr['cells'])):
name = cell_names[int(cell_ids[cc])]
pat = Pattern()
_rect_batches_to_rectcollections(pat, global_args, elements['rect_batches'], cc)
_boundary_batches_to_polygons(pat, global_args, elements['boundary_batches'], cc)
_boundary_props_to_polygons(pat, global_args, elements['boundary_props'], cc)
_gpaths_to_mpaths(pat, global_args, elements['paths'], cc)
_srefs_to_mrefs(pat, global_args, elements['srefs'], cc)
_arefs_to_mrefs(pat, global_args, elements['arefs'], cc)
_sref_props_to_mrefs(pat, global_args, elements['sref_props'], cc)
_aref_props_to_mrefs(pat, global_args, elements['aref_props'], cc)
_texts_to_labels(pat, global_args, elements['texts'], cc)
mlib[name] = pat
return mlib, library_info
def _read_header(libarr: pyarrow.Array) -> dict[str, Any]:
"""
Read the file header and create the library_info dict.
"""
library_info = dict(
name = libarr['lib_name'].as_py(),
meters_per_unit = libarr['meters_per_db_unit'].as_py(),
logical_units_per_unit = libarr['user_units_per_db_unit'].as_py(),
)
return library_info
def _srefs_to_mrefs(
pat: Pattern,
global_args: dict[str, Any],
elem: dict[str, Any],
cc: int,
) -> None:
cell_names = global_args['cell_names']
elem_off = elem['offsets']
elem_count = elem_off[cc + 1] - elem_off[cc]
if elem_count == 0:
return
start = elem_off[cc]
stop = elem_off[cc + 1]
elem_targets = elem['targets'][start:stop]
elem_xy = elem['xy'][start:stop]
elem_invert_y = elem['invert_y'][start:stop]
elem_angle_rad = elem['angle_rad'][start:stop]
elem_scale = elem['scale'][start:stop]
raw_mode = global_args['raw_mode']
_append_plain_refs_sorted(
pat=pat,
cell_names=cell_names,
elem_targets=elem_targets,
elem_xy=elem_xy,
elem_invert_y=elem_invert_y,
elem_angle_rad=elem_angle_rad,
elem_scale=elem_scale,
raw_mode=raw_mode,
)
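The Arrow list-offset convention used throughout these readers (a flat values buffer plus an `offsets` array, where cell `cc` owns `values[offsets[cc]:offsets[cc + 1]]`) can be illustrated with plain numpy; the arrays below are made-up sample data, not the real payload:

```python
import numpy

# Arrow list arrays store one flat values buffer plus an offsets array;
# the element range belonging to cell cc is values[offsets[cc]:offsets[cc + 1]].
offsets = numpy.array([0, 2, 5, 5])      # three cells: 2 refs, 3 refs, 0 refs
targets = numpy.array([9, 9, 4, 4, 9])   # flat buffer shared by all cells

per_cell = [
    targets[offsets[cc]:offsets[cc + 1]].tolist()
    for cc in range(len(offsets) - 1)
]
# per_cell == [[9, 9], [4, 4, 9], []]
```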
def _append_plain_refs_sorted(
*,
pat: Pattern,
cell_names: list[str],
elem_targets: NDArray[numpy.integer[Any]],
elem_xy: NDArray[numpy.integer[Any]],
elem_invert_y: NDArray[numpy.bool_ | numpy.bool],
elem_angle_rad: NDArray[numpy.floating[Any]],
elem_scale: NDArray[numpy.floating[Any]],
raw_mode: bool,
) -> None:
elem_count = len(elem_targets)
if elem_count == 0:
return
make_ref = Ref._from_raw if raw_mode else Ref
target_start = 0
while target_start < elem_count:
target_id = int(elem_targets[target_start])
target_stop = target_start + 1
while target_stop < elem_count and elem_targets[target_stop] == target_id:
target_stop += 1
append_refs = pat.refs[cell_names[target_id]].extend
append_refs(
make_ref(
offset=elem_xy[ee],
mirrored=elem_invert_y[ee],
rotation=elem_angle_rad[ee],
scale=elem_scale[ee],
repetition=None,
annotations=None,
)
for ee in range(target_start, target_stop)
)
target_start = target_stop
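The loop above batches consecutive refs that share a target so that each run becomes a single `extend` call. The grouping logic alone can be sketched as follows (`group_consecutive` is a hypothetical helper written for illustration, not part of this module):

```python
import numpy

def group_consecutive(targets: numpy.ndarray) -> list[tuple[int, int, int]]:
    """Return (target_id, start, stop) runs of consecutive equal targets."""
    runs = []
    start = 0
    n = len(targets)
    while start < n:
        target_id = int(targets[start])
        stop = start + 1
        while stop < n and targets[stop] == target_id:
            stop += 1
        runs.append((target_id, start, stop))
        start = stop
    return runs

runs = group_consecutive(numpy.array([3, 3, 3, 7, 7, 3]))
# Each run corresponds to one pat.refs[...].extend(...) call above;
# note the final 3 forms its own run because only *consecutive* refs batch.
```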
def _arefs_to_mrefs(
pat: Pattern,
global_args: dict[str, Any],
elem: dict[str, Any],
cc: int,
) -> None:
cell_names = global_args['cell_names']
elem_off = elem['offsets']
elem_count = elem_off[cc + 1] - elem_off[cc]
if elem_count == 0:
return
start = elem_off[cc]
stop = elem_off[cc + 1]
elem_targets = elem['targets'][start:stop]
elem_xy = elem['xy'][start:stop]
elem_invert_y = elem['invert_y'][start:stop]
elem_angle_rad = elem['angle_rad'][start:stop]
elem_scale = elem['scale'][start:stop]
elem_xy0 = elem['xy0'][start:stop]
elem_xy1 = elem['xy1'][start:stop]
elem_counts = elem['counts'][start:stop]
raw_mode = global_args['raw_mode']
make_ref = Ref._from_raw if raw_mode else Ref
make_grid = Grid._from_raw if raw_mode else Grid
if len(elem_targets) == 0:
return
target = None
append_ref: Callable[[Ref], Any] | None = None
for ee in range(len(elem_targets)):
target_id = int(elem_targets[ee])
if target != target_id:
target = target_id
append_ref = pat.refs[cell_names[target_id]].append
assert append_ref is not None
a_count, b_count = elem_counts[ee]
append_ref(make_ref(
offset=elem_xy[ee],
mirrored=elem_invert_y[ee],
rotation=elem_angle_rad[ee],
scale=elem_scale[ee],
repetition=make_grid(a_vector=elem_xy0[ee], b_vector=elem_xy1[ee], a_count=a_count, b_count=b_count),
annotations=None,
))
def _sref_props_to_mrefs(
pat: Pattern,
global_args: dict[str, Any],
elem: dict[str, Any],
cc: int,
) -> None:
cell_names = global_args['cell_names']
elem_off = elem['offsets']
prop_key = elem['prop_key']
prop_val = elem['prop_val']
elem_count = elem_off[cc + 1] - elem_off[cc]
if elem_count == 0:
return
elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1)
prop_offs = elem['prop_off'][elem_slc]
elem_targets = elem['targets'][elem_off[cc]:elem_off[cc + 1]]
elem_xy = elem['xy'][elem_off[cc]:elem_off[cc + 1]]
elem_invert_y = elem['invert_y'][elem_off[cc]:elem_off[cc + 1]]
elem_angle_rad = elem['angle_rad'][elem_off[cc]:elem_off[cc + 1]]
elem_scale = elem['scale'][elem_off[cc]:elem_off[cc + 1]]
raw_mode = global_args['raw_mode']
make_ref = Ref._from_raw if raw_mode else Ref
for ee in range(elem_count):
annotations = _read_annotations(prop_offs, prop_key, prop_val, ee)
ref = make_ref(
offset=elem_xy[ee],
mirrored=elem_invert_y[ee],
rotation=elem_angle_rad[ee],
scale=elem_scale[ee],
repetition=None,
annotations=annotations,
)
pat.refs[cell_names[int(elem_targets[ee])]].append(ref)
def _aref_props_to_mrefs(
pat: Pattern,
global_args: dict[str, Any],
elem: dict[str, Any],
cc: int,
) -> None:
cell_names = global_args['cell_names']
elem_off = elem['offsets']
prop_key = elem['prop_key']
prop_val = elem['prop_val']
elem_count = elem_off[cc + 1] - elem_off[cc]
if elem_count == 0:
return
elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1)
prop_offs = elem['prop_off'][elem_slc]
elem_targets = elem['targets'][elem_off[cc]:elem_off[cc + 1]]
elem_xy = elem['xy'][elem_off[cc]:elem_off[cc + 1]]
elem_invert_y = elem['invert_y'][elem_off[cc]:elem_off[cc + 1]]
elem_angle_rad = elem['angle_rad'][elem_off[cc]:elem_off[cc + 1]]
elem_scale = elem['scale'][elem_off[cc]:elem_off[cc + 1]]
elem_xy0 = elem['xy0'][elem_off[cc]:elem_off[cc + 1]]
elem_xy1 = elem['xy1'][elem_off[cc]:elem_off[cc + 1]]
elem_counts = elem['counts'][elem_off[cc]:elem_off[cc + 1]]
raw_mode = global_args['raw_mode']
make_ref = Ref._from_raw if raw_mode else Ref
make_grid = Grid._from_raw if raw_mode else Grid
for ee in range(elem_count):
a_count, b_count = elem_counts[ee]
annotations = _read_annotations(prop_offs, prop_key, prop_val, ee)
ref = make_ref(
offset=elem_xy[ee],
mirrored=elem_invert_y[ee],
rotation=elem_angle_rad[ee],
scale=elem_scale[ee],
repetition=make_grid(a_vector=elem_xy0[ee], b_vector=elem_xy1[ee], a_count=a_count, b_count=b_count),
annotations=annotations,
)
pat.refs[cell_names[int(elem_targets[ee])]].append(ref)
def _texts_to_labels(
pat: Pattern,
global_args: dict[str, Any],
elem: dict[str, Any],
cc: int,
) -> None:
elem_off = elem['offsets'] # which elements belong to each cell
xy = elem['xy']
layer_tups = global_args['layer_tups']
layer_inds = elem['layer_inds']
prop_key = elem['prop_key']
prop_val = elem['prop_val']
elem_count = elem_off[cc + 1] - elem_off[cc]
elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1) # +1 to capture ending location for last elem
prop_offs = elem['prop_off'][elem_slc] # which props belong to each element
elem_xy = xy[elem_slc][:elem_count]
elem_layer_inds = layer_inds[elem_slc][:elem_count]
elem_strings = elem['string'][elem_slc][:elem_count]
raw_mode = global_args['raw_mode']
for ee in range(elem_count):
layer = layer_tups[int(elem_layer_inds[ee])]
offset = elem_xy[ee]
string = elem_strings[ee]
annotations = _read_annotations(prop_offs, prop_key, prop_val, ee)
if raw_mode:
mlabel = Label._from_raw(string=string, offset=offset, annotations=annotations)
else:
mlabel = Label(string=string, offset=offset, annotations=annotations)
pat.labels[layer].append(mlabel)
def _gpaths_to_mpaths(
pat: Pattern,
global_args: dict[str, Any],
elem: dict[str, Any],
cc: int,
) -> None:
elem_off = elem['offsets'] # which elements belong to each cell
xy_val = elem['xy_arr']
layer_tups = global_args['layer_tups']
layer_inds = elem['layer_inds']
prop_key = elem['prop_key']
prop_val = elem['prop_val']
elem_count = elem_off[cc + 1] - elem_off[cc]
elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1) # +1 to capture ending location for last elem
xy_offs = elem['xy_off'][elem_slc] # which xy coords belong to each element
prop_offs = elem['prop_off'][elem_slc] # which props belong to each element
elem_layer_inds = layer_inds[elem_slc][:elem_count]
elem_widths = elem['width'][elem_slc][:elem_count]
elem_path_types = elem['path_type'][elem_slc][:elem_count]
elem_extensions = elem['extensions'][elem_slc][:elem_count]
raw_mode = global_args['raw_mode']
for ee in range(elem_count):
layer = layer_tups[int(elem_layer_inds[ee])]
vertices = xy_val[xy_offs[ee]:xy_offs[ee + 1]]
width = elem_widths[ee]
cap_int = elem_path_types[ee]
cap = path_cap_map[cap_int]
if cap_int == 4:
cap_extensions = elem_extensions[ee]
else:
cap_extensions = None
annotations = _read_annotations(prop_offs, prop_key, prop_val, ee)
if raw_mode:
path = Path._from_raw(
vertices=vertices,
width=width,
cap=cap,
cap_extensions=cap_extensions,
annotations=annotations,
)
else:
path = Path(
vertices=vertices,
width=width,
cap=cap,
cap_extensions=cap_extensions,
offset=ZERO_OFFSET,
annotations=annotations,
)
pat.shapes[layer].append(path)
def _boundary_batches_to_polygons(
pat: Pattern,
global_args: dict[str, Any],
elem: dict[str, Any],
cc: int,
) -> None:
elem_off = elem['offsets'] # which elements belong to each cell
vert_arr = elem['vert_arr']
vert_off = elem['vert_off']
layer_inds = elem['layer_inds']
layer_tups = global_args['layer_tups']
poly_off = elem['poly_off']
poly_offsets = elem['poly_offsets']
batch_count = elem_off[cc + 1] - elem_off[cc]
if batch_count == 0:
return
elem_slc = slice(elem_off[cc], elem_off[cc] + batch_count + 1) # +1 to capture ending location for last elem
elem_vert_off = vert_off[elem_slc]
elem_poly_off = poly_off[elem_slc]
elem_layer_inds = layer_inds[elem_slc][:batch_count]
raw_mode = global_args['raw_mode']
for bb in range(batch_count):
layer = layer_tups[int(elem_layer_inds[bb])]
vertices = vert_arr[elem_vert_off[bb]:elem_vert_off[bb + 1]]
vertex_offsets = poly_offsets[elem_poly_off[bb]:elem_poly_off[bb + 1]]
if vertex_offsets.size == 1:
if raw_mode:
poly = Polygon._from_raw(vertices=vertices, annotations=None)
else:
poly = Polygon(vertices=vertices, offset=ZERO_OFFSET, annotations=None)
pat.shapes[layer].append(poly)
else:
if raw_mode:
polys = PolyCollection._from_raw(vertex_lists=vertices, vertex_offsets=vertex_offsets, annotations=None)
else:
polys = PolyCollection(vertex_lists=vertices, vertex_offsets=vertex_offsets, offset=ZERO_OFFSET, annotations=None)
pat.shapes[layer].append(polys)
def _rect_batches_to_rectcollections(
pat: Pattern,
global_args: dict[str, Any],
elem: dict[str, Any],
cc: int,
) -> None:
elem_off = elem['offsets']
rect_arr = elem['rect_arr']
rect_off = elem['rect_off']
layer_inds = elem['layer_inds']
layer_tups = global_args['layer_tups']
batch_count = elem_off[cc + 1] - elem_off[cc]
if batch_count == 0:
return
elem_slc = slice(elem_off[cc], elem_off[cc] + batch_count + 1)
elem_rect_off = rect_off[elem_slc]
elem_layer_inds = layer_inds[elem_slc][:batch_count]
raw_mode = global_args['raw_mode']
for bb in range(batch_count):
layer = layer_tups[int(elem_layer_inds[bb])]
rects = rect_arr[elem_rect_off[bb]:elem_rect_off[bb + 1]]
if raw_mode:
rect_collection = RectCollection._from_raw(rects=rects, annotations=None)
else:
rect_collection = RectCollection(rects=rects, offset=ZERO_OFFSET, annotations=None)
pat.shapes[layer].append(rect_collection)
def _boundary_props_to_polygons(
pat: Pattern,
global_args: dict[str, Any],
elem: dict[str, Any],
cc: int,
) -> None:
elem_off = elem['offsets']
vert_arr = elem['vert_arr']
vert_off = elem['vert_off']
layer_inds = elem['layer_inds']
layer_tups = global_args['layer_tups']
prop_key = elem['prop_key']
prop_val = elem['prop_val']
elem_count = elem_off[cc + 1] - elem_off[cc]
if elem_count == 0:
return
elem_slc = slice(elem_off[cc], elem_off[cc] + elem_count + 1)
elem_vert_off = vert_off[elem_slc]
prop_offs = elem['prop_off'][elem_slc]
elem_layer_inds = layer_inds[elem_slc][:elem_count]
raw_mode = global_args['raw_mode']
for ee in range(elem_count):
layer = layer_tups[int(elem_layer_inds[ee])]
vertices = vert_arr[elem_vert_off[ee]:elem_vert_off[ee + 1]]
annotations = _read_annotations(prop_offs, prop_key, prop_val, ee)
if raw_mode:
poly = Polygon._from_raw(vertices=vertices, annotations=annotations)
else:
poly = Polygon(vertices=vertices, offset=ZERO_OFFSET, annotations=annotations)
pat.shapes[layer].append(poly)
#def _properties_to_annotations(properties: pyarrow.Array) -> annotations_t:
# return {prop['key'].as_py(): prop['value'].as_py() for prop in properties}
def check_valid_names(
names: Iterable[str],
max_length: int = 32,
) -> None:
"""
Check all provided names to see if they're valid GDSII cell names.
Args:
names: Collection of names to check
max_length: Max allowed length
"""
allowed_chars = set(string.ascii_letters + string.digits + '_?$')
bad_chars = [
name for name in names
if not set(name).issubset(allowed_chars)
]
bad_lengths = [
name for name in names
if len(name) > max_length
]
if bad_chars:
logger.error('Names contain invalid characters:\n' + pformat(bad_chars))
if bad_lengths:
logger.error(f'Names too long (>{max_length}):\n' + pformat(bad_lengths))
if bad_chars or bad_lengths:
raise LibraryError('Library contains invalid names, see log above')
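For illustration, the same charset and length rules can be applied to a single name; `is_valid_gds_name` is a hypothetical standalone helper mirroring the checks above, not part of this module:

```python
import string

ALLOWED_CHARS = set(string.ascii_letters + string.digits + '_?$')

def is_valid_gds_name(name: str, max_length: int = 32) -> bool:
    # Same two checks as check_valid_names(): allowed charset, then length.
    return set(name).issubset(ALLOWED_CHARS) and len(name) <= max_length

ok = is_valid_gds_name('TOP_CELL$1')    # valid: letters, digits, '$', '_'
bad = is_valid_gds_name('bad name')     # invalid: contains a space
```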


@@ -0,0 +1,960 @@
"""
Lazy GDSII readers and writers backed by native Arrow scan/materialize paths.
This module is intentionally separate from `gdsii_arrow` so the eager read path
keeps its current behavior and performance profile.
"""
from __future__ import annotations
from dataclasses import dataclass
from typing import IO, Any, cast
from collections import defaultdict
from collections.abc import Callable, Iterator, Mapping, Sequence
import copy
import gzip
import logging
import mmap
import pathlib
import numpy
from numpy.typing import NDArray
import pyarrow
import klamath
from . import gdsii, gdsii_arrow
from .utils import is_gzipped, tmpfile
from ..error import LibraryError
from ..library import ILibrary, ILibraryView, Library, LibraryView, dangling_mode_t
from ..pattern import Pattern, map_targets
from ..utils import apply_transforms
logger = logging.getLogger(__name__)
@dataclass(frozen=True)
class _StructRange:
start: int
end: int
@dataclass
class _SourceBuffer:
path: pathlib.Path
data: bytes | mmap.mmap
handle: IO[bytes] | None = None
def raw_slice(self, start: int, end: int) -> bytes:
return self.data[start:end]
@dataclass
class _ScanRefs:
offsets: NDArray[numpy.integer[Any]]
targets: NDArray[numpy.integer[Any]]
xy: NDArray[numpy.int32]
xy0: NDArray[numpy.int32]
xy1: NDArray[numpy.int32]
counts: NDArray[numpy.int64]
invert_y: NDArray[numpy.bool_ | numpy.bool]
angle_rad: NDArray[numpy.floating[Any]]
scale: NDArray[numpy.floating[Any]]
@dataclass(frozen=True)
class _CellScan:
cell_id: int
struct_range: _StructRange
ref_start: int
ref_stop: int
children: set[str]
@dataclass
class _ScanPayload:
libarr: pyarrow.StructScalar
library_info: dict[str, Any]
cell_names: list[str]
cell_order: list[str]
cells: dict[str, _CellScan]
refs: _ScanRefs
@dataclass
class _SourceLayer:
library: ILibraryView
source_to_visible: dict[str, str]
visible_to_source: dict[str, str]
child_graph: dict[str, set[str]]
order: list[str]
@dataclass(frozen=True)
class _SourceEntry:
layer_index: int
source_name: str
def is_available() -> bool:
return gdsii_arrow.is_available()
def _read_header(libarr: pyarrow.StructScalar) -> dict[str, Any]:
return gdsii_arrow._read_header(libarr)
def _open_source_buffer(path: pathlib.Path) -> _SourceBuffer:
if is_gzipped(path):
with gzip.open(path, mode='rb') as stream:
data = stream.read()
return _SourceBuffer(path=path, data=data)
handle = path.open(mode='rb', buffering=0)
mapped = mmap.mmap(handle.fileno(), 0, access=mmap.ACCESS_READ)
return _SourceBuffer(path=path, data=mapped, handle=handle)
def _extract_scan_payload(libarr: pyarrow.StructScalar) -> _ScanPayload:
library_info = _read_header(libarr)
cell_names = libarr['cell_names'].as_py()
cells = libarr['cells']
cell_values = cells.values
cell_ids = cell_values.field('id').to_numpy()
struct_starts = cell_values.field('struct_start_offset').to_numpy()
struct_ends = cell_values.field('struct_end_offset').to_numpy()
refs = cell_values.field('refs')
ref_values = refs.values
ref_offsets = refs.offsets.to_numpy()
targets = ref_values.field('target').to_numpy()
xy = gdsii_arrow._packed_xy_u64_to_pairs(ref_values.field('xy').to_numpy())
xy0 = gdsii_arrow._packed_xy_u64_to_pairs(ref_values.field('xy0').to_numpy())
xy1 = gdsii_arrow._packed_xy_u64_to_pairs(ref_values.field('xy1').to_numpy())
counts = gdsii_arrow._packed_counts_u32_to_pairs(ref_values.field('counts').to_numpy())
invert_y = ref_values.field('invert_y').to_numpy(zero_copy_only=False)
angle_rad = ref_values.field('angle_rad').to_numpy()
scale = ref_values.field('scale').to_numpy()
ref_payload = _ScanRefs(
offsets=ref_offsets,
targets=targets,
xy=xy,
xy0=xy0,
xy1=xy1,
counts=counts,
invert_y=invert_y,
angle_rad=angle_rad,
scale=scale,
)
cell_order = [cell_names[int(cell_id)] for cell_id in cell_ids]
cell_scan: dict[str, _CellScan] = {}
for cc, name in enumerate(cell_order):
ref_start = int(ref_offsets[cc])
ref_stop = int(ref_offsets[cc + 1])
children = {
cell_names[int(target)]
for target in targets[ref_start:ref_stop]
}
cell_scan[name] = _CellScan(
cell_id=int(cell_ids[cc]),
struct_range=_StructRange(int(struct_starts[cc]), int(struct_ends[cc])),
ref_start=ref_start,
ref_stop=ref_stop,
children=children,
)
return _ScanPayload(
libarr=libarr,
library_info=library_info,
cell_names=cell_names,
cell_order=cell_order,
cells=cell_scan,
refs=ref_payload,
)
def _pattern_children(pat: Pattern) -> set[str]:
return {child for child, refs in pat.refs.items() if child is not None and refs}
def _remap_pattern_targets(pat: Pattern, remap: Callable[[str | None], str | None]) -> Pattern:
if not pat.refs:
return pat
pat.refs = map_targets(pat.refs, remap)
return pat
def _coerce_library_view(source: Mapping[str, Pattern] | ILibraryView) -> ILibraryView:
if isinstance(source, ILibraryView):
return source
return LibraryView(source)
def _source_order(source: ILibraryView) -> list[str]:
if isinstance(source, ArrowLibrary):
return list(source.source_order())
return list(source.keys())
def _make_ref_rows(
xy: NDArray[numpy.integer[Any]],
angle_rad: NDArray[numpy.floating[Any]],
invert_y: NDArray[numpy.bool_ | numpy.bool],
scale: NDArray[numpy.floating[Any]],
) -> NDArray[numpy.float64]:
rows = numpy.empty((len(xy), 5), dtype=float)
rows[:, :2] = xy
rows[:, 2] = angle_rad
rows[:, 3] = invert_y.astype(float)
rows[:, 4] = scale
return rows
def _expand_aref_row(
xy: NDArray[numpy.integer[Any]],
xy0: NDArray[numpy.integer[Any]],
xy1: NDArray[numpy.integer[Any]],
counts: NDArray[numpy.integer[Any]],
angle_rad: float,
invert_y: bool,
scale: float,
) -> NDArray[numpy.float64]:
a_count = int(counts[0])
b_count = int(counts[1])
aa, bb = numpy.meshgrid(numpy.arange(a_count), numpy.arange(b_count), indexing='ij')
displacements = aa.reshape(-1, 1) * xy0[None, :] + bb.reshape(-1, 1) * xy1[None, :]
rows = numpy.empty((displacements.shape[0], 5), dtype=float)
rows[:, :2] = xy + displacements
rows[:, 2] = angle_rad
rows[:, 3] = float(invert_y)
rows[:, 4] = scale
return rows
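A quick numeric check of the grid expansion: the sketch below repeats the `_expand_aref_row` logic in self-contained form (keyword names match the function above) and expands a 2×3 array placement:

```python
import numpy

def expand_aref_row(xy, xy0, xy1, counts, angle_rad, invert_y, scale):
    # One row per grid instance; columns are (x, y, rotation, mirror-flag, scale).
    a_count, b_count = int(counts[0]), int(counts[1])
    aa, bb = numpy.meshgrid(numpy.arange(a_count), numpy.arange(b_count), indexing='ij')
    disp = (aa.reshape(-1, 1) * numpy.asarray(xy0)[None, :]
            + bb.reshape(-1, 1) * numpy.asarray(xy1)[None, :])
    rows = numpy.empty((disp.shape[0], 5), dtype=float)
    rows[:, :2] = numpy.asarray(xy) + disp
    rows[:, 2] = angle_rad
    rows[:, 3] = float(invert_y)
    rows[:, 4] = scale
    return rows

rows = expand_aref_row(xy=[10, 20], xy0=[5, 0], xy1=[0, 7],
                       counts=[2, 3], angle_rad=0.0, invert_y=False, scale=1.0)
# 2 * 3 = 6 instances; first at (10, 20), last at (15, 34).
```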
class ArrowLibrary(ILibraryView):
"""
Read-only library backed by the native lazy Arrow scan schema.
Materializing a cell via `__getitem__` caches a real `Pattern` for that cell.
Cached cells are treated as edited for future writes from this module.
"""
path: pathlib.Path
library_info: dict[str, Any]
def __init__(
self,
*,
path: pathlib.Path,
payload: _ScanPayload,
source: _SourceBuffer,
) -> None:
self.path = path
self.library_info = payload.library_info
self._payload = payload
self._source = source
self._cache: dict[str, Pattern] = {}
@classmethod
def from_file(cls, filename: str | pathlib.Path) -> ArrowLibrary:
path = pathlib.Path(filename).expanduser().resolve()
source = _open_source_buffer(path)
scan_arr = gdsii_arrow._scan_buffer_to_arrow(source.data)
assert len(scan_arr) == 1
payload = _extract_scan_payload(scan_arr[0])
return cls(path=path, payload=payload, source=source)
def __getitem__(self, key: str) -> Pattern:
return self._materialize_pattern(key, persist=True)
def __iter__(self) -> Iterator[str]:
return iter(self._payload.cell_order)
def __len__(self) -> int:
return len(self._payload.cell_order)
def __contains__(self, key: object) -> bool:
return key in self._payload.cells
def source_order(self) -> tuple[str, ...]:
return tuple(self._payload.cell_order)
def raw_struct_bytes(self, name: str) -> bytes:
struct_range = self._payload.cells[name].struct_range
return self._source.raw_slice(struct_range.start, struct_range.end)
def materialize_many(
self,
names: Sequence[str],
*,
persist: bool = True,
) -> LibraryView:
mats = self._materialize_patterns(names, persist=persist)
return LibraryView(mats)
def _materialize_patterns(
self,
names: Sequence[str],
*,
persist: bool,
) -> dict[str, Pattern]:
ordered_names = list(dict.fromkeys(names))
missing = [name for name in ordered_names if name not in self._payload.cells]
if missing:
raise KeyError(missing[0])
materialized: dict[str, Pattern] = {}
uncached = [name for name in ordered_names if name not in self._cache]
if uncached:
ranges = numpy.asarray(
[
[
self._payload.cells[name].struct_range.start,
self._payload.cells[name].struct_range.end,
]
for name in uncached
],
dtype=numpy.uint64,
)
arrow_arr = gdsii_arrow._read_selected_cells_to_arrow(self._source.data, ranges)
assert len(arrow_arr) == 1
selected_lib, _info = gdsii_arrow.read_arrow(arrow_arr[0])
for name in uncached:
pat = selected_lib[name]
materialized[name] = pat
if persist:
self._cache[name] = pat
for name in ordered_names:
if name in self._cache:
materialized[name] = self._cache[name]
return materialized
def _materialize_pattern(self, name: str, *, persist: bool) -> Pattern:
return self._materialize_patterns((name,), persist=persist)[name]
def _raw_children(self, name: str) -> set[str]:
return set(self._payload.cells[name].children)
def _collect_raw_transforms(self, cell: _CellScan, target_id: int) -> list[NDArray[numpy.float64]]:
refs = self._payload.refs
start = cell.ref_start
stop = cell.ref_stop
if stop <= start:
return []
targets = refs.targets[start:stop]
mask = targets == target_id
if not mask.any():
return []
rows: list[NDArray[numpy.float64]] = []
counts = refs.counts[start:stop]
unit_mask = mask & (counts[:, 0] == 1) & (counts[:, 1] == 1)
if unit_mask.any():
rows.append(_make_ref_rows(
refs.xy[start:stop][unit_mask],
refs.angle_rad[start:stop][unit_mask],
refs.invert_y[start:stop][unit_mask],
refs.scale[start:stop][unit_mask],
))
aref_indices = numpy.nonzero(mask & ~unit_mask)[0]
for idx in aref_indices:
abs_idx = start + int(idx)
rows.append(_expand_aref_row(
xy=refs.xy[abs_idx],
xy0=refs.xy0[abs_idx],
xy1=refs.xy1[abs_idx],
counts=refs.counts[abs_idx],
angle_rad=float(refs.angle_rad[abs_idx]),
invert_y=bool(refs.invert_y[abs_idx]),
scale=float(refs.scale[abs_idx]),
))
return rows
def child_graph(
self,
dangling: dangling_mode_t = 'error',
) -> dict[str, set[str]]:
graph: dict[str, set[str]] = {}
for name in self._payload.cell_order:
if name in self._cache:
graph[name] = _pattern_children(self._cache[name])
else:
graph[name] = self._raw_children(name)
existing = set(graph)
dangling_refs = set().union(*(children - existing for children in graph.values()))
if dangling == 'error':
if dangling_refs:
raise self._dangling_refs_error(cast('set[str]', dangling_refs), 'building child graph')
return graph
if dangling == 'ignore':
return {name: {child for child in children if child in existing} for name, children in graph.items()}
for child in dangling_refs:
graph.setdefault(cast('str', child), set())
return graph
def parent_graph(
self,
dangling: dangling_mode_t = 'error',
) -> dict[str, set[str]]:
child_graph = self.child_graph(dangling='include' if dangling == 'include' else 'ignore')
existing = set(self.keys())
igraph: dict[str, set[str]] = {name: set() for name in child_graph}
for parent, children in child_graph.items():
for child in children:
if child in existing or dangling == 'include':
igraph.setdefault(child, set()).add(parent)
if dangling == 'error':
raw = self.child_graph(dangling='include')
dangling_refs = set().union(*(children - existing for children in raw.values()))
if dangling_refs:
raise self._dangling_refs_error(cast('set[str]', dangling_refs), 'building parent graph')
return igraph
def subtree(
self,
tops: str | Sequence[str],
) -> ILibraryView:
if isinstance(tops, str):
tops = (tops,)
keep = cast('set[str]', self.referenced_patterns(tops) - {None})
keep |= set(tops)
return self.materialize_many(tuple(keep), persist=True)
def tops(self) -> list[str]:
graph = self.child_graph(dangling='ignore')
names = set(graph)
not_toplevel: set[str] = set()
for children in graph.values():
not_toplevel |= children
return list(names - not_toplevel)
def find_refs_local(
self,
name: str,
parent_graph: dict[str, set[str]] | None = None,
dangling: dangling_mode_t = 'error',
) -> dict[str, list[NDArray[numpy.float64]]]:
instances: dict[str, list[NDArray[numpy.float64]]] = defaultdict(list)
if parent_graph is None:
graph_mode = 'ignore' if dangling == 'ignore' else 'include'
parent_graph = self.parent_graph(dangling=graph_mode)
if name not in self:
if name not in parent_graph:
return instances
if dangling == 'error':
raise self._dangling_refs_error({name}, f'finding local refs for {name!r}')
if dangling == 'ignore':
return instances
cell_scan = self._payload.cells.get(name)
for parent in parent_graph.get(name, set()):
if parent in self._cache:
for ref in self._cache[parent].refs.get(name, []):
instances[parent].append(ref.as_transforms())
continue
if cell_scan is None or parent not in self._payload.cells:
continue
rows = self._collect_raw_transforms(self._payload.cells[parent], cell_scan.cell_id)
if rows:
instances[parent].extend(rows)
return instances
def find_refs_global(
self,
name: str,
order: list[str] | None = None,
parent_graph: dict[str, set[str]] | None = None,
dangling: dangling_mode_t = 'error',
) -> dict[tuple[str, ...], NDArray[numpy.float64]]:
graph_mode = 'ignore' if dangling == 'ignore' else 'include'
if order is None:
order = self.child_order(dangling=graph_mode)
if parent_graph is None:
parent_graph = self.parent_graph(dangling=graph_mode)
if name not in self:
if name not in parent_graph:
return {}
if dangling == 'error':
raise self._dangling_refs_error({name}, f'finding global refs for {name!r}')
if dangling == 'ignore':
return {}
self_keys = set(self.keys())
transforms: dict[str, list[tuple[tuple[str, ...], NDArray[numpy.float64]]]]
transforms = defaultdict(list)
for parent, vals in self.find_refs_local(name, parent_graph=parent_graph, dangling=dangling).items():
transforms[parent] = [((name,), numpy.concatenate(vals))]
for next_name in order:
if next_name not in transforms:
continue
if not parent_graph.get(next_name, set()) & self_keys:
continue
outers = self.find_refs_local(next_name, parent_graph=parent_graph, dangling=dangling)
inners = transforms.pop(next_name)
for parent, outer in outers.items():
outer_tf = numpy.concatenate(outer)
for path, inner in inners:
combined = apply_transforms(outer_tf, inner)
transforms[parent].append(((next_name,) + path, combined))
result = {}
for parent, targets in transforms.items():
for path, instances in targets:
full_path = (parent,) + path
result[full_path] = instances
return result
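`ArrowLibrary` materializes a cell on first `__getitem__` and then serves the cached `Pattern` on later lookups. That caching pattern, in isolation, looks like the toy stand-in below (`LazyDict` is illustrative only, not part of this module):

```python
class LazyDict:
    """Toy stand-in for cache-on-access: values are built on first
    lookup and reused afterwards (hypothetical, for illustration)."""
    def __init__(self, builders):
        self._builders = builders    # name -> zero-arg factory
        self._cache = {}
        self.build_count = 0

    def __getitem__(self, key):
        if key not in self._cache:
            self.build_count += 1
            self._cache[key] = self._builders[key]()
        return self._cache[key]

lib = LazyDict({'TOP': lambda: 'materialized TOP'})
first = lib['TOP']
second = lib['TOP']
# build_count stays at 1: the second access is served from the cache.
```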
class OverlayLibrary(ILibrary):
"""
Mutable overlay over one or more source libraries.
Source-backed cells remain lazy until accessed through `__getitem__`, at
which point that visible cell is promoted into an overlay-owned materialized
`Pattern`.
"""
def __init__(self) -> None:
self._layers: list[_SourceLayer] = []
self._entries: dict[str, Pattern | _SourceEntry] = {}
self._order: list[str] = []
self._target_remap: dict[str, str] = {}
def __iter__(self) -> Iterator[str]:
return (name for name in self._order if name in self._entries)
def __len__(self) -> int:
return len(self._entries)
def __contains__(self, key: object) -> bool:
return key in self._entries
def __getitem__(self, key: str) -> Pattern:
return self._materialize_pattern(key, persist=True)
def __setitem__(
self,
key: str,
value: Pattern | Callable[[], Pattern],
) -> None:
if key in self._entries:
raise LibraryError(f'"{key}" already exists in the library. Overwriting is not allowed!')
pattern = value() if callable(value) else value
self._entries[key] = pattern
if key not in self._order:
self._order.append(key)
def __delitem__(self, key: str) -> None:
if key not in self._entries:
raise KeyError(key)
del self._entries[key]
def _merge(self, key_self: str, other: Mapping[str, Pattern], key_other: str) -> None:
self[key_self] = copy.deepcopy(other[key_other])
def add_source(
self,
source: Mapping[str, Pattern] | ILibraryView,
*,
rename_theirs: Callable[[ILibraryView, str], str] | None = None,
) -> dict[str, str]:
view = _coerce_library_view(source)
source_order = _source_order(view)
child_graph = view.child_graph(dangling='include')
source_to_visible: dict[str, str] = {}
visible_to_source: dict[str, str] = {}
rename_map: dict[str, str] = {}
for name in source_order:
visible = name
if visible in self._entries or visible in visible_to_source:
if rename_theirs is None:
raise LibraryError(f'Conflicting name while adding source: {name!r}')
visible = rename_theirs(self, name)
if visible in self._entries or visible in visible_to_source:
raise LibraryError(f'Unresolved duplicate key encountered while adding source: {name!r} -> {visible!r}')
rename_map[name] = visible
source_to_visible[name] = visible
visible_to_source[visible] = name
layer = _SourceLayer(
library=view,
source_to_visible=source_to_visible,
visible_to_source=visible_to_source,
child_graph=child_graph,
order=[source_to_visible[name] for name in source_order],
)
layer_index = len(self._layers)
self._layers.append(layer)
for source_name, visible_name in source_to_visible.items():
self._entries[visible_name] = _SourceEntry(layer_index=layer_index, source_name=source_name)
if visible_name not in self._order:
self._order.append(visible_name)
return rename_map
def rename(
self,
old_name: str,
new_name: str,
move_references: bool = False,
) -> OverlayLibrary:
if old_name not in self._entries:
raise LibraryError(f'"{old_name}" does not exist in the library.')
if old_name == new_name:
return self
if new_name in self._entries:
raise LibraryError(f'"{new_name}" already exists in the library.')
entry = self._entries.pop(old_name)
self._entries[new_name] = entry
if isinstance(entry, _SourceEntry):
layer = self._layers[entry.layer_index]
layer.source_to_visible[entry.source_name] = new_name
del layer.visible_to_source[old_name]
layer.visible_to_source[new_name] = entry.source_name
idx = self._order.index(old_name)
self._order[idx] = new_name
if move_references:
self.move_references(old_name, new_name)
return self
def _resolve_target(self, target: str) -> str:
seen: set[str] = set()
current = target
while current in self._target_remap:
if current in seen:
raise LibraryError(f'Cycle encountered while resolving target remap for {target!r}')
seen.add(current)
current = self._target_remap[current]
return current
def _set_target_remap(self, old_target: str, new_target: str) -> None:
resolved_new = self._resolve_target(new_target)
if resolved_new == old_target:
raise LibraryError(f'Ref target remap would create a cycle: {old_target!r} -> {new_target!r}')
self._target_remap[old_target] = resolved_new
for key in list(self._target_remap):
self._target_remap[key] = self._resolve_target(self._target_remap[key])
def move_references(self, old_target: str, new_target: str) -> OverlayLibrary:
if old_target == new_target:
return self
self._set_target_remap(old_target, new_target)
for entry in list(self._entries.values()):
if isinstance(entry, Pattern) and old_target in entry.refs:
entry.refs[new_target].extend(entry.refs[old_target])
del entry.refs[old_target]
return self
def _effective_target(self, layer: _SourceLayer, target: str) -> str:
visible = layer.source_to_visible.get(target, target)
return self._resolve_target(visible)
def _materialize_pattern(self, name: str, *, persist: bool) -> Pattern:
if name not in self._entries:
raise KeyError(name)
entry = self._entries[name]
if isinstance(entry, Pattern):
return entry
layer = self._layers[entry.layer_index]
source_pat = layer.library[entry.source_name].deepcopy()
def remap(target: str | None) -> str | None:
    return None if target is None else self._effective_target(layer, target)
pat = _remap_pattern_targets(source_pat, remap)
if persist:
self._entries[name] = pat
return pat
def child_graph(
self,
dangling: dangling_mode_t = 'error',
) -> dict[str, set[str]]:
graph: dict[str, set[str]] = {}
for name in self._order:
if name not in self._entries:
continue
entry = self._entries[name]
if isinstance(entry, Pattern):
graph[name] = _pattern_children(entry)
continue
layer = self._layers[entry.layer_index]
children = {self._effective_target(layer, child) for child in layer.child_graph.get(entry.source_name, set())}
graph[name] = children
existing = set(graph)
dangling_refs = set().union(*(children - existing for children in graph.values()))
if dangling == 'error':
if dangling_refs:
raise self._dangling_refs_error(cast('set[str]', dangling_refs), 'building child graph')
return graph
if dangling == 'ignore':
return {name: {child for child in children if child in existing} for name, children in graph.items()}
for child in dangling_refs:
graph.setdefault(cast('str', child), set())
return graph
def parent_graph(
self,
dangling: dangling_mode_t = 'error',
) -> dict[str, set[str]]:
child_graph = self.child_graph(dangling='include' if dangling == 'include' else 'ignore')
existing = set(self.keys())
igraph: dict[str, set[str]] = {name: set() for name in child_graph}
for parent, children in child_graph.items():
for child in children:
if child in existing or dangling == 'include':
igraph.setdefault(child, set()).add(parent)
if dangling == 'error':
raw = self.child_graph(dangling='include')
dangling_refs = set().union(*(children - existing for children in raw.values()))
if dangling_refs:
raise self._dangling_refs_error(cast('set[str]', dangling_refs), 'building parent graph')
return igraph
def subtree(
self,
tops: str | Sequence[str],
) -> ILibraryView:
if isinstance(tops, str):
tops = (tops,)
keep = cast('set[str]', self.referenced_patterns(tops) - {None})
keep |= set(tops)
return LibraryView({name: self[name] for name in keep})
def find_refs_local(
self,
name: str,
parent_graph: dict[str, set[str]] | None = None,
dangling: dangling_mode_t = 'error',
) -> dict[str, list[NDArray[numpy.float64]]]:
instances: dict[str, list[NDArray[numpy.float64]]] = defaultdict(list)
if parent_graph is None:
graph_mode = 'ignore' if dangling == 'ignore' else 'include'
parent_graph = self.parent_graph(dangling=graph_mode)
if name not in self:
if name not in parent_graph:
return instances
if dangling == 'error':
raise self._dangling_refs_error({name}, f'finding local refs for {name!r}')
if dangling == 'ignore':
return instances
for parent in parent_graph.get(name, set()):
pat = self._materialize_pattern(parent, persist=False)
for ref in pat.refs.get(name, []):
instances[parent].append(ref.as_transforms())
return instances
def find_refs_global(
self,
name: str,
order: list[str] | None = None,
parent_graph: dict[str, set[str]] | None = None,
dangling: dangling_mode_t = 'error',
) -> dict[tuple[str, ...], NDArray[numpy.float64]]:
graph_mode = 'ignore' if dangling == 'ignore' else 'include'
if order is None:
order = self.child_order(dangling=graph_mode)
if parent_graph is None:
parent_graph = self.parent_graph(dangling=graph_mode)
if name not in self:
if name not in parent_graph:
return {}
if dangling == 'error':
raise self._dangling_refs_error({name}, f'finding global refs for {name!r}')
if dangling == 'ignore':
return {}
self_keys = set(self.keys())
transforms: dict[str, list[tuple[tuple[str, ...], NDArray[numpy.float64]]]]
transforms = defaultdict(list)
for parent, vals in self.find_refs_local(name, parent_graph=parent_graph, dangling=dangling).items():
transforms[parent] = [((name,), numpy.concatenate(vals))]
for next_name in order:
if next_name not in transforms:
continue
if not parent_graph.get(next_name, set()) & self_keys:
continue
outers = self.find_refs_local(next_name, parent_graph=parent_graph, dangling=dangling)
inners = transforms.pop(next_name)
for parent, outer in outers.items():
outer_tf = numpy.concatenate(outer)
for path, inner in inners:
combined = apply_transforms(outer_tf, inner)
transforms[parent].append(((next_name,) + path, combined))
result = {}
for parent, targets in transforms.items():
for path, instances in targets:
result[(parent,) + path] = instances
return result
def source_order(self) -> tuple[str, ...]:
return tuple(name for name in self._order if name in self._entries)
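The target-remap machinery above (`_resolve_target` / `_set_target_remap`) follows chains of renames with cycle detection, then path-compresses the map so every key points directly at its final target. A standalone sketch of the same rule (illustrative names only, not part of the library API):

```python
# Standalone sketch of chained target remapping with cycle detection,
# mirroring OverlayLibrary._resolve_target / _set_target_remap.
def resolve(remap: dict[str, str], target: str) -> str:
    seen: set[str] = set()
    current = target
    while current in remap:
        if current in seen:
            raise ValueError(f'remap cycle while resolving {target!r}')
        seen.add(current)
        current = remap[current]
    return current

def set_remap(remap: dict[str, str], old: str, new: str) -> None:
    resolved = resolve(remap, new)
    if resolved == old:
        raise ValueError(f'would create cycle: {old!r} -> {new!r}')
    remap[old] = resolved
    # Path-compress so every key points directly at its final target
    for key in list(remap):
        remap[key] = resolve(remap, remap[key])

remap: dict[str, str] = {}
set_remap(remap, 'A', 'B')               # A -> B
set_remap(remap, 'B', 'C')               # B -> C; A is compressed to C
assert resolve(remap, 'A') == 'C'
assert remap == {'A': 'C', 'B': 'C'}
```

Because of the compression step, repeated renames never build up long chains, and the cycle check in `set_remap` rejects a remap whose resolved destination is the key being remapped.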
def readfile(
filename: str | pathlib.Path,
) -> tuple[ArrowLibrary, dict[str, Any]]:
lib = ArrowLibrary.from_file(filename)
return lib, lib.library_info
def load_libraryfile(
filename: str | pathlib.Path,
) -> tuple[ArrowLibrary, dict[str, Any]]:
return readfile(filename)
def _get_write_info(
library: Mapping[str, Pattern] | ILibraryView,
*,
meters_per_unit: float | None,
logical_units_per_unit: float | None,
library_name: str | None,
) -> tuple[float, float, str]:
if meters_per_unit is not None and logical_units_per_unit is not None and library_name is not None:
return meters_per_unit, logical_units_per_unit, library_name
infos: list[dict[str, Any]] = []
if isinstance(library, ArrowLibrary):
infos.append(library.library_info)
elif isinstance(library, OverlayLibrary):
for layer in library._layers:
if isinstance(layer.library, ArrowLibrary):
infos.append(layer.library.library_info)
if infos:
unit_pairs = {(info['meters_per_unit'], info['logical_units_per_unit']) for info in infos}
if len(unit_pairs) > 1:
raise LibraryError('Merged lazy GDS sources must have identical units before writing')
info = infos[0]
meters = info['meters_per_unit'] if meters_per_unit is None else meters_per_unit
logical = info['logical_units_per_unit'] if logical_units_per_unit is None else logical_units_per_unit
name = info['name'] if library_name is None else library_name
return meters, logical, name
if meters_per_unit is None or logical_units_per_unit is None or library_name is None:
raise LibraryError('meters_per_unit, logical_units_per_unit, and library_name are required for non-GDS-backed lazy writes')
return meters_per_unit, logical_units_per_unit, library_name
def _can_copy_arrow_cell(library: ArrowLibrary, name: str) -> bool:
return name not in library._cache
def _can_copy_overlay_cell(library: OverlayLibrary, name: str, entry: _SourceEntry) -> bool:
layer = library._layers[entry.layer_index]
if not isinstance(layer.library, ArrowLibrary):
return False
if name != entry.source_name:
return False
children = layer.child_graph.get(entry.source_name, set())
return all(library._effective_target(layer, child) == child for child in children)
def _write_pattern_struct(stream: IO[bytes], name: str, pat: Pattern) -> None:
elements: list[klamath.elements.Element] = []
elements += gdsii._shapes_to_elements(pat.shapes)
elements += gdsii._labels_to_texts(pat.labels)
elements += gdsii._mrefs_to_grefs(pat.refs)
klamath.library.write_struct(stream, name=name.encode('ASCII'), elements=elements)
def write(
library: Mapping[str, Pattern] | ILibraryView,
stream: IO[bytes],
*,
meters_per_unit: float | None = None,
logical_units_per_unit: float | None = None,
library_name: str | None = None,
) -> None:
meters_per_unit, logical_units_per_unit, library_name = _get_write_info(
library,
meters_per_unit=meters_per_unit,
logical_units_per_unit=logical_units_per_unit,
library_name=library_name,
)
header = klamath.library.FileHeader(
name=library_name.encode('ASCII'),
user_units_per_db_unit=logical_units_per_unit,
meters_per_db_unit=meters_per_unit,
)
header.write(stream)
if isinstance(library, ArrowLibrary):
for name in library.source_order():
if _can_copy_arrow_cell(library, name):
stream.write(library.raw_struct_bytes(name))
else:
_write_pattern_struct(stream, name, library._materialize_pattern(name, persist=False))
klamath.records.ENDLIB.write(stream, None)
return
if isinstance(library, OverlayLibrary):
for name in library.source_order():
entry = library._entries[name]
if isinstance(entry, _SourceEntry) and _can_copy_overlay_cell(library, name, entry):
layer = library._layers[entry.layer_index]
assert isinstance(layer.library, ArrowLibrary)
stream.write(layer.library.raw_struct_bytes(entry.source_name))
else:
_write_pattern_struct(stream, name, library._materialize_pattern(name, persist=False))
klamath.records.ENDLIB.write(stream, None)
return
gdsii.write(cast('Mapping[str, Pattern]', library), stream, meters_per_unit, logical_units_per_unit, library_name)
def writefile(
library: Mapping[str, Pattern] | ILibraryView,
filename: str | pathlib.Path,
*,
meters_per_unit: float | None = None,
logical_units_per_unit: float | None = None,
library_name: str | None = None,
) -> None:
path = pathlib.Path(filename)
with tmpfile(path) as base_stream:
streams: tuple[Any, ...] = (base_stream,)
if path.suffix == '.gz':
stream = cast('IO[bytes]', gzip.GzipFile(filename='', mtime=0, fileobj=base_stream, mode='wb', compresslevel=6))
streams = (stream,) + streams
else:
stream = base_stream
try:
write(
library,
stream,
meters_per_unit=meters_per_unit,
logical_units_per_unit=logical_units_per_unit,
library_name=library_name,
)
finally:
for ss in streams:
ss.close()
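The write path above streams a source cell's original GDS bytes verbatim whenever it safely can, and only falls back to materializing and re-serializing the `Pattern` otherwise. A standalone sketch of the copy-eligibility rule from `_can_copy_overlay_cell`, simplified to plain dicts (illustrative only):

```python
# Standalone sketch of the "raw byte copy vs. re-serialize" decision made by
# write() for overlay-backed cells: a source cell's bytes can only be copied
# verbatim if the cell itself was not renamed and none of its children
# resolve to a different visible name.
def can_copy(name: str, source_name: str, children: set[str],
             effective: dict[str, str]) -> bool:
    if name != source_name:
        return False  # the struct header would carry the wrong name
    return all(effective.get(child, child) == child for child in children)

# child still resolves to itself -> safe to stream the original bytes
assert can_copy('top', 'top', {'sub'}, {}) is True
# a child was remapped -> refs inside the struct must be rewritten
assert can_copy('top', 'top', {'sub'}, {'sub': 'sub_2'}) is False
# renamed cell -> must re-serialize under the new name
assert can_copy('top_2', 'top', set(), {}) is False
```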

633
masque/file/gdsii_perf.py Normal file

@ -0,0 +1,633 @@
"""
Synthetic GDS fixture generation for reader/writer performance testing.
The presets here are intentionally hierarchical and deterministic. They aim to
approximate a pair of real-world layout families discussed during GDS reader and
writer work:
* `many_cells`: tens of thousands of cells, moderate reference count, very heavy
box usage after flattening, and moderate polygon density.
* `many_instances`: a much smaller cell library with very high reference count,
similar box density, and far fewer polygons.
Fixtures are written by streaming structures through `klamath` directly so large
benchmark files can be produced without first materializing an equally large
`masque.Library` in Python.
"""
from __future__ import annotations
from dataclasses import asdict, dataclass
from pathlib import Path
from typing import Any
import argparse
import json
import math
import numpy
import klamath
from klamath import elements
EMPTY_PROPERTIES: dict[int, bytes] = {}
METERS_PER_DB_UNIT = 1e-9
USER_UNITS_PER_DB_UNIT = 1e-3
TOTAL_LAYERS = 200
@dataclass(frozen=True)
class FixturePreset:
name: str
total_layers: int
box_layers: int
heavy_box_layers: int
polygon_layers: int
box_cells: int
poly_cells: int
box_wrappers: int
poly_wrappers: int
box_clusters: int
poly_clusters: int
box_cluster_refs: int
poly_cluster_refs: int
top_direct_box_refs: int
top_direct_poly_refs: int
heavy_boxes_per_cell: int
regular_boxes_per_cell: int
polygons_per_cell: int
path_stride: int
text_stride: int
box_cluster_array: tuple[int, int]
top_box_array: tuple[int, int]
poly_cluster_array: tuple[int, int]
top_poly_array: tuple[int, int]
rare_annotation_stride: int
PRESETS: dict[str, FixturePreset] = {
'many_cells': FixturePreset(
name='many_cells',
total_layers=TOTAL_LAYERS,
box_layers=20,
heavy_box_layers=3,
polygon_layers=20,
box_cells=17_000,
poly_cells=6_000,
box_wrappers=18_000,
poly_wrappers=6_000,
box_clusters=2_000,
poly_clusters=999,
box_cluster_refs=24,
poly_cluster_refs=16,
top_direct_box_refs=21_000,
top_direct_poly_refs=7_000,
heavy_boxes_per_cell=6,
regular_boxes_per_cell=2,
polygons_per_cell=50,
path_stride=2,
text_stride=3,
box_cluster_array=(24, 16),
top_box_array=(8, 8),
poly_cluster_array=(4, 2),
top_poly_array=(3, 2),
rare_annotation_stride=1_250,
),
'many_instances': FixturePreset(
name='many_instances',
total_layers=TOTAL_LAYERS,
box_layers=25,
heavy_box_layers=3,
polygon_layers=10,
box_cells=2_500,
poly_cells=500,
box_wrappers=1_000,
poly_wrappers=500,
box_clusters=1_000,
poly_clusters=499,
box_cluster_refs=1_200,
poly_cluster_refs=400,
top_direct_box_refs=102_001,
top_direct_poly_refs=0,
heavy_boxes_per_cell=40,
regular_boxes_per_cell=16,
polygons_per_cell=60,
path_stride=1,
text_stride=2,
box_cluster_array=(1, 1),
top_box_array=(1, 1),
poly_cluster_array=(1, 1),
top_poly_array=(1, 1),
rare_annotation_stride=250,
),
}
@dataclass(frozen=True)
class FixtureManifest:
preset: str
scale: float
gds_path: str
library_name: str
cells: int
refs: int
layers: int
box_layers: int
heavy_box_layers: list[list[int]]
polygon_layers: list[list[int]]
hierarchical_boxes_per_heavy_layer: int
hierarchical_boxes_per_regular_layer: int
hierarchical_polygons_total: int
hierarchical_paths_total: int
hierarchical_texts_total: int
flattened_box_placements: int
flattened_poly_placements: int
estimated_flat_boxes_per_heavy_layer: int
estimated_flat_polygons_per_active_polygon_layer: int
def _scaled_count(value: int, scale: float, minimum: int = 0) -> int:
if value == 0:
return 0
scaled = int(math.ceil(value * scale))
return max(minimum, scaled)
def _scaled_preset(preset: FixturePreset, scale: float) -> FixturePreset:
if scale <= 0:
raise ValueError(f'scale must be positive, got {scale!r}')
return FixturePreset(
name=preset.name,
total_layers=preset.total_layers,
box_layers=min(preset.box_layers, preset.total_layers),
heavy_box_layers=min(preset.heavy_box_layers, preset.box_layers),
polygon_layers=min(preset.polygon_layers, preset.total_layers),
box_cells=_scaled_count(preset.box_cells, scale, minimum=1),
poly_cells=_scaled_count(preset.poly_cells, scale, minimum=1),
box_wrappers=_scaled_count(preset.box_wrappers, scale),
poly_wrappers=_scaled_count(preset.poly_wrappers, scale),
box_clusters=_scaled_count(preset.box_clusters, scale, minimum=1),
poly_clusters=_scaled_count(preset.poly_clusters, scale, minimum=1),
box_cluster_refs=_scaled_count(preset.box_cluster_refs, scale, minimum=1),
poly_cluster_refs=_scaled_count(preset.poly_cluster_refs, scale, minimum=1),
top_direct_box_refs=_scaled_count(preset.top_direct_box_refs, scale),
top_direct_poly_refs=_scaled_count(preset.top_direct_poly_refs, scale),
heavy_boxes_per_cell=max(1, preset.heavy_boxes_per_cell),
regular_boxes_per_cell=max(1, preset.regular_boxes_per_cell),
polygons_per_cell=max(1, preset.polygons_per_cell),
path_stride=max(1, preset.path_stride),
text_stride=max(1, preset.text_stride),
box_cluster_array=preset.box_cluster_array,
top_box_array=preset.top_box_array,
poly_cluster_array=preset.poly_cluster_array,
top_poly_array=preset.top_poly_array,
rare_annotation_stride=max(1, _scaled_count(preset.rare_annotation_stride, scale, minimum=1)),
)
def _rect_xy(xmin: int, ymin: int, xmax: int, ymax: int) -> numpy.ndarray[Any, numpy.dtype[numpy.int32]]:
return numpy.array(
[[xmin, ymin], [xmin, ymax], [xmax, ymax], [xmax, ymin], [xmin, ymin]],
dtype=numpy.int32,
)
def _poly_xy(points: list[tuple[int, int]]) -> numpy.ndarray[Any, numpy.dtype[numpy.int32]]:
closed = points + [points[0]]
return numpy.array(closed, dtype=numpy.int32)
def _sref(
target: str,
xy: tuple[int, int],
properties: dict[int, bytes] | None = None,
) -> elements.Reference:
return klamath.library.Reference(
struct_name=target.encode('ASCII'),
invert_y=False,
mag=1.0,
angle_deg=0.0,
xy=numpy.array([xy], dtype=numpy.int32),
colrow=None,
properties=EMPTY_PROPERTIES if properties is None else properties,
)
def _aref(
target: str,
origin: tuple[int, int],
counts: tuple[int, int],
step: tuple[int, int],
properties: dict[int, bytes] | None = None,
) -> elements.Reference:
cols, rows = counts
dx, dy = step
xy = numpy.array(
[
origin,
(origin[0] + cols * dx, origin[1]),
(origin[0], origin[1] + rows * dy),
],
dtype=numpy.int32,
)
return klamath.library.Reference(
struct_name=target.encode('ASCII'),
invert_y=False,
mag=1.0,
angle_deg=0.0,
xy=xy,
colrow=(cols, rows),
properties=EMPTY_PROPERTIES if properties is None else properties,
)
def _annotation(index: int) -> dict[int, bytes]:
return {1: f'perf-{index}'.encode('ASCII')}
def _make_box_cell(name: str, index: int, cfg: FixturePreset) -> list[elements.Element]:
cell_elements: list[elements.Element] = []
xbase = (index % 17) * 600
ybase = (index // 17) * 180
for layer in range(cfg.heavy_box_layers):
for box_idx in range(cfg.heavy_boxes_per_cell):
x0 = xbase + box_idx * 22
y0 = ybase + layer * 40
width = 10 + ((index + box_idx + layer) % 7) * 6
height = 10 + ((index * 3 + box_idx + layer) % 5) * 8
properties = _annotation(index) if index % cfg.rare_annotation_stride == 0 and box_idx == 0 and layer == 0 else EMPTY_PROPERTIES
cell_elements.append(elements.Boundary(
layer=(layer, 0),
xy=_rect_xy(x0, y0, x0 + width, y0 + height),
properties=properties,
))
for layer in range(cfg.heavy_box_layers, cfg.box_layers):
for box_idx in range(cfg.regular_boxes_per_cell):
x0 = xbase + box_idx * 38
y0 = ybase + (layer - cfg.heavy_box_layers) * 28 + 400
width = 18 + ((index + layer + box_idx) % 9) * 4
height = 12 + ((index + 2 * layer + box_idx) % 6) * 5
cell_elements.append(elements.Boundary(
layer=(layer, 0),
xy=_rect_xy(x0, y0, x0 + width, y0 + height),
properties=EMPTY_PROPERTIES,
))
return cell_elements
def _make_poly_cell(name: str, index: int, cfg: FixturePreset) -> list[elements.Element]:
cell_elements: list[elements.Element] = []
xbase = (index % 19) * 900
ybase = (index // 19) * 260
for poly_idx in range(cfg.polygons_per_cell):
layer = poly_idx % cfg.polygon_layers
dx = xbase + (poly_idx % 5) * 120
dy = ybase + (poly_idx // 5) * 80
size = 18 + ((index + poly_idx + layer) % 11) * 7
points = [
(dx, dy),
(dx + size, dy + size // 5),
(dx + size + size // 3, dy + size),
(dx + size // 2, dy + size + size // 2),
(dx - size // 4, dy + size // 2),
]
properties = _annotation(index) if poly_idx == 0 and index % cfg.rare_annotation_stride == 0 else EMPTY_PROPERTIES
cell_elements.append(elements.Boundary(
layer=(layer, 0),
xy=_poly_xy(points),
properties=properties,
))
if index % cfg.path_stride == 0:
layer = index % cfg.polygon_layers
cell_elements.append(elements.Path(
layer=(layer, 1),
path_type=2,
width=12 + (index % 5) * 4,
extension=(0, 0),
xy=numpy.array(
[
[xbase, ybase + 900],
[xbase + 240, ybase + 930],
[xbase + 420, ybase + 960],
],
dtype=numpy.int32,
),
properties=EMPTY_PROPERTIES,
))
if index % cfg.text_stride == 0:
layer = index % cfg.polygon_layers
properties = _annotation(index) if index % cfg.rare_annotation_stride == 0 else EMPTY_PROPERTIES
cell_elements.append(elements.Text(
layer=(layer, 2),
presentation=0,
path_type=0,
width=0,
invert_y=False,
mag=1.0,
angle_deg=0.0,
xy=numpy.array([[xbase + 64, ybase + 1536]], dtype=numpy.int32),
string=f'T{index:05d}'.encode('ASCII'),
properties=properties,
))
return cell_elements
def _write_struct(stream: Any, name: str, cell_elements: list[elements.Element]) -> None:
klamath.library.write_struct(stream, name=name.encode('ASCII'), elements=cell_elements)
def _box_name(index: int) -> str:
return f'box_{index:05d}'
def _poly_name(index: int) -> str:
return f'poly_{index:05d}'
def _box_wrapper_name(index: int) -> str:
return f'box_wrap_{index:05d}'
def _poly_wrapper_name(index: int) -> str:
return f'poly_wrap_{index:05d}'
def _box_cluster_name(index: int) -> str:
return f'box_cluster_{index:05d}'
def _poly_cluster_name(index: int) -> str:
return f'poly_cluster_{index:05d}'
def _write_box_cells(stream: Any, cfg: FixturePreset) -> None:
for idx in range(cfg.box_cells):
_write_struct(stream, _box_name(idx), _make_box_cell(_box_name(idx), idx, cfg))
def _write_poly_cells(stream: Any, cfg: FixturePreset) -> None:
for idx in range(cfg.poly_cells):
_write_struct(stream, _poly_name(idx), _make_poly_cell(_poly_name(idx), idx, cfg))
def _write_wrappers(stream: Any, cfg: FixturePreset) -> None:
for idx in range(cfg.box_wrappers):
target = _box_name(idx % cfg.box_cells)
origin = ((idx % 97) * 2_000, (idx // 97) * 2_000)
_write_struct(stream, _box_wrapper_name(idx), [_sref(target, origin)])
for idx in range(cfg.poly_wrappers):
target = _poly_name(idx % cfg.poly_cells)
origin = ((idx % 61) * 3_200, (idx // 61) * 3_200)
_write_struct(stream, _poly_wrapper_name(idx), [_sref(target, origin)])
def _write_box_clusters(stream: Any, cfg: FixturePreset) -> None:
array_refs = min(cfg.box_cluster_refs, max(1, (3 * cfg.box_cluster_refs) // 4))
for idx in range(cfg.box_clusters):
cell_elements: list[elements.Element] = []
for ref_idx in range(cfg.box_cluster_refs):
target = _box_name((idx * cfg.box_cluster_refs + ref_idx) % cfg.box_cells)
origin = (
(ref_idx % 6) * 48_000,
(ref_idx // 6) * 48_000,
)
if ref_idx < array_refs:
cell_elements.append(_aref(target, origin, cfg.box_cluster_array, (720, 900)))
else:
cell_elements.append(_sref(target, origin))
_write_struct(stream, _box_cluster_name(idx), cell_elements)
def _write_poly_clusters(stream: Any, cfg: FixturePreset) -> None:
array_refs = min(cfg.poly_cluster_refs, cfg.poly_cluster_refs // 2)
for idx in range(cfg.poly_clusters):
cell_elements: list[elements.Element] = []
for ref_idx in range(cfg.poly_cluster_refs):
target = _poly_name((idx * cfg.poly_cluster_refs + ref_idx) % cfg.poly_cells)
origin = (
(ref_idx % 10) * 96_000,
(ref_idx // 10) * 96_000,
)
if ref_idx < array_refs:
cell_elements.append(_aref(target, origin, cfg.poly_cluster_array, (12_000, 8_500)))
else:
cell_elements.append(_sref(target, origin))
_write_struct(stream, _poly_cluster_name(idx), cell_elements)
def _top_box_refs(cfg: FixturePreset) -> list[elements.Reference]:
refs: list[elements.Reference] = []
for idx in range(cfg.box_wrappers):
refs.append(_sref(
_box_wrapper_name(idx),
((idx % 240) * 240_000, (idx // 240) * 240_000),
))
for idx in range(cfg.box_clusters):
refs.append(_sref(
_box_cluster_name(idx),
((idx % 100) * 800_000, (idx // 100) * 800_000 + 14_000_000),
))
for idx in range(cfg.top_direct_box_refs):
target = _box_name(idx % cfg.box_cells)
origin = (
(idx % 150) * 160_000,
(idx // 150) * 160_000 + 26_000_000,
)
if cfg.top_box_array == (1, 1):
refs.append(_sref(target, origin))
else:
refs.append(_aref(target, origin, cfg.top_box_array, (1_100, 1_350)))
return refs
def _top_poly_refs(cfg: FixturePreset) -> list[elements.Reference]:
refs: list[elements.Reference] = []
for idx in range(cfg.poly_wrappers):
refs.append(_sref(
_poly_wrapper_name(idx),
((idx % 180) * 360_000, (idx // 180) * 360_000 + 44_000_000),
))
for idx in range(cfg.poly_clusters):
refs.append(_sref(
_poly_cluster_name(idx),
((idx % 70) * 1_100_000, (idx // 70) * 1_100_000 + 58_000_000),
))
for idx in range(cfg.top_direct_poly_refs):
target = _poly_name(idx % cfg.poly_cells)
origin = (
(idx % 110) * 420_000,
(idx // 110) * 420_000 + 72_000_000,
)
if cfg.top_poly_array == (1, 1):
refs.append(_sref(target, origin))
else:
refs.append(_aref(target, origin, cfg.top_poly_array, (16_000, 14_000)))
return refs
def _write_top(stream: Any, cfg: FixturePreset) -> None:
cell_elements: list[elements.Element] = []
cell_elements.extend(_top_box_refs(cfg))
cell_elements.extend(_top_poly_refs(cfg))
_write_struct(stream, 'TOP', cell_elements)
def _poly_paths_total(cfg: FixturePreset) -> int:
return (cfg.poly_cells - 1) // cfg.path_stride + 1
def _poly_texts_total(cfg: FixturePreset) -> int:
return (cfg.poly_cells - 1) // cfg.text_stride + 1
def _ref_instances_per_box_cluster(cfg: FixturePreset) -> int:
array_refs = min(cfg.box_cluster_refs, max(1, (3 * cfg.box_cluster_refs) // 4))
array_mult = cfg.box_cluster_array[0] * cfg.box_cluster_array[1]
return array_refs * array_mult + (cfg.box_cluster_refs - array_refs)
def _ref_instances_per_poly_cluster(cfg: FixturePreset) -> int:
array_refs = min(cfg.poly_cluster_refs, cfg.poly_cluster_refs // 2)
array_mult = cfg.poly_cluster_array[0] * cfg.poly_cluster_array[1]
return array_refs * array_mult + (cfg.poly_cluster_refs - array_refs)
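A standalone check of the per-cluster placement arithmetic above, assuming the same split: the first `array_refs` references are AREFs that each expand to `cols * rows` placements, and the remainder are single SREFs (one placement each):

```python
# Illustrative re-statement of _ref_instances_per_*_cluster:
# n_array AREFs expand to cols*rows placements each; the rest are SREFs.
def instances_per_cluster(n_refs: int, n_array: int, array: tuple[int, int]) -> int:
    cols, rows = array
    return n_array * cols * rows + (n_refs - n_array)

# many_cells box clusters: 24 refs, 18 of them 24x16 arrays, 6 singles
assert instances_per_cluster(24, 18, (24, 16)) == 18 * 384 + 6
# degenerate 1x1 arrays behave like plain SREFs
assert instances_per_cluster(10, 4, (1, 1)) == 10
```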
def fixture_manifest(path: str | Path, preset: str, scale: float = 1.0) -> FixtureManifest:
base = PRESETS[preset]
cfg = _scaled_preset(base, scale)
flattened_box_placements = (
cfg.box_wrappers
+ cfg.box_clusters * _ref_instances_per_box_cluster(cfg)
+ cfg.top_direct_box_refs * cfg.top_box_array[0] * cfg.top_box_array[1]
)
flattened_poly_placements = (
cfg.poly_wrappers
+ cfg.poly_clusters * _ref_instances_per_poly_cluster(cfg)
+ cfg.top_direct_poly_refs * cfg.top_poly_array[0] * cfg.top_poly_array[1]
)
polygon_layers = max(1, cfg.polygon_layers)
polys_per_layer = (cfg.poly_cells * cfg.polygons_per_cell) // polygon_layers
return FixtureManifest(
preset=cfg.name,
scale=scale,
gds_path=str(Path(path)),
library_name=f'masque-perf-{cfg.name}',
cells=cfg.box_cells + cfg.poly_cells + cfg.box_wrappers + cfg.poly_wrappers + cfg.box_clusters + cfg.poly_clusters + 1,
refs=(
cfg.box_wrappers
+ cfg.poly_wrappers
+ cfg.box_clusters * cfg.box_cluster_refs
+ cfg.poly_clusters * cfg.poly_cluster_refs
+ cfg.box_wrappers + cfg.poly_wrappers + cfg.box_clusters + cfg.poly_clusters
+ cfg.top_direct_box_refs + cfg.top_direct_poly_refs
),
layers=cfg.total_layers,
box_layers=cfg.box_layers,
heavy_box_layers=[[layer, 0] for layer in range(cfg.heavy_box_layers)],
polygon_layers=[[layer, 0] for layer in range(cfg.polygon_layers)],
hierarchical_boxes_per_heavy_layer=cfg.box_cells * cfg.heavy_boxes_per_cell,
hierarchical_boxes_per_regular_layer=cfg.box_cells * cfg.regular_boxes_per_cell,
hierarchical_polygons_total=cfg.poly_cells * cfg.polygons_per_cell,
hierarchical_paths_total=_poly_paths_total(cfg),
hierarchical_texts_total=_poly_texts_total(cfg),
flattened_box_placements=flattened_box_placements,
flattened_poly_placements=flattened_poly_placements,
estimated_flat_boxes_per_heavy_layer=flattened_box_placements * cfg.heavy_boxes_per_cell,
estimated_flat_polygons_per_active_polygon_layer=flattened_poly_placements * polys_per_layer // cfg.poly_cells if cfg.poly_cells else 0,
)
def write_fixture(
path: str | Path,
*,
preset: str,
scale: float = 1.0,
write_manifest: bool = True,
) -> FixtureManifest:
if preset not in PRESETS:
known = ', '.join(sorted(PRESETS))
raise KeyError(f'unknown preset {preset!r}; expected one of: {known}')
manifest = fixture_manifest(path, preset, scale)
cfg = _scaled_preset(PRESETS[preset], scale)
output = Path(path)
output.parent.mkdir(parents=True, exist_ok=True)
with output.open('wb') as stream:
header = klamath.library.FileHeader(
name=manifest.library_name.encode('ASCII'),
user_units_per_db_unit=USER_UNITS_PER_DB_UNIT,
meters_per_db_unit=METERS_PER_DB_UNIT,
)
header.write(stream)
_write_box_cells(stream, cfg)
_write_poly_cells(stream, cfg)
_write_wrappers(stream, cfg)
_write_box_clusters(stream, cfg)
_write_poly_clusters(stream, cfg)
_write_top(stream, cfg)
klamath.records.ENDLIB.write(stream, None)
if write_manifest:
manifest_path = output.with_suffix(output.suffix + '.json')
manifest_path.write_text(json.dumps(asdict(manifest), indent=2, sort_keys=True) + '\n')
return manifest
def build_arg_parser() -> argparse.ArgumentParser:
parser = argparse.ArgumentParser(description='Generate synthetic GDS fixtures for GDS reader/writer performance work.')
parser.add_argument(
'preset',
nargs='?',
default='many_cells',
choices=sorted(PRESETS),
help='Fixture family to generate.',
)
parser.add_argument(
'output',
nargs='?',
help='Output .gds path. Defaults to build/gds_perf/<preset>.gds',
)
parser.add_argument(
'--scale',
type=float,
default=1.0,
help='Scale the preset counts down or up while keeping the same shape mix. Default: 1.0',
)
parser.add_argument(
'--no-manifest',
action='store_true',
help='Do not write the sidecar JSON manifest.',
)
return parser
def main(argv: list[str] | None = None) -> int:
parser = build_arg_parser()
args = parser.parse_args(argv)
output = Path(args.output) if args.output is not None else Path('build/gds_perf') / f'{args.preset}.gds'
manifest = write_fixture(output, preset=args.preset, scale=args.scale, write_manifest=not args.no_manifest)
print(json.dumps(asdict(manifest), indent=2, sort_keys=True))
return 0
if __name__ == '__main__':
raise SystemExit(main())
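The `--scale` option works through `_scaled_count` above: every preset count is scaled by a ceiling multiply, zero counts stay zero, and nonzero counts are floored at `minimum` so a tiny scale still produces a valid hierarchy. A standalone restatement of that rule:

```python
import math

# Illustrative re-statement of _scaled_count: ceil-scale nonzero counts,
# preserve zeros, and never drop a nonzero count below `minimum`.
def scaled_count(value: int, scale: float, minimum: int = 0) -> int:
    if value == 0:
        return 0
    return max(minimum, int(math.ceil(value * scale)))

assert scaled_count(17_000, 0.1) == 1_700
assert scaled_count(999, 0.001, minimum=1) == 1   # ceil(0.999) = 1
assert scaled_count(0, 10.0) == 0                 # zero is preserved
assert scaled_count(3, 0.1, minimum=1) == 1
```

This is why, for example, `box_cells` is scaled with `minimum=1` (the fixture always needs at least one leaf cell) while `top_direct_poly_refs` may legitimately scale to zero.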


@ -120,10 +120,10 @@ def build(
layer, data_type = _mlayer2oas(layer_num)
lib.layers += [
fatrec.LayerName(
-nstring=name,
-layer_interval=(layer, layer),
-type_interval=(data_type, data_type),
-is_textlayer=tt,
+nstring = name,
+layer_interval = (layer, layer),
+type_interval = (data_type, data_type),
+is_textlayer = tt,
)
for tt in (True, False)]
@ -182,8 +182,8 @@ def writefile(
Args:
library: A {name: Pattern} mapping of patterns to write.
filename: Filename to save to.
-*args: passed to `oasis.write`
-**kwargs: passed to `oasis.write`
+*args: passed to `oasis.build()`
+**kwargs: passed to `oasis.build()`
"""
path = pathlib.Path(filename)
@ -213,9 +213,9 @@ def readfile(
Will automatically decompress gzipped files.
Args:
-filename: Filename to save to.
-*args: passed to `oasis.read`
-**kwargs: passed to `oasis.read`
+filename: Filename to load from.
+*args: passed to `oasis.read()`
+**kwargs: passed to `oasis.read()`
"""
path = pathlib.Path(filename)
if is_gzipped(path):
@ -286,11 +286,11 @@ def read(
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
pat.polygon(
vertices=vertices,
layer=element.get_layer_tuple(),
offset=element.get_xy(),
annotations=annotations,
repetition=repetition,
vertices = vertices,
layer = element.get_layer_tuple(),
offset = element.get_xy(),
annotations = annotations,
repetition = repetition,
)
elif isinstance(element, fatrec.Path):
vertices = numpy.cumsum(numpy.vstack(((0, 0), element.get_point_list())), axis=0)
@ -310,13 +310,13 @@ def read(
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
pat.path(
vertices=vertices,
layer=element.get_layer_tuple(),
offset=element.get_xy(),
repetition=repetition,
annotations=annotations,
width=element.get_half_width() * 2,
cap=cap,
vertices = vertices,
layer = element.get_layer_tuple(),
offset = element.get_xy(),
repetition = repetition,
annotations = annotations,
width = element.get_half_width() * 2,
cap = cap,
**path_args,
)
@ -325,11 +325,11 @@ def read(
height = element.get_height()
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
pat.polygon(
layer=element.get_layer_tuple(),
offset=element.get_xy(),
repetition=repetition,
vertices=numpy.array(((0, 0), (1, 0), (1, 1), (0, 1))) * (width, height),
annotations=annotations,
layer = element.get_layer_tuple(),
offset = element.get_xy(),
repetition = repetition,
vertices = numpy.array(((0, 0), (1, 0), (1, 1), (0, 1))) * (width, height),
annotations = annotations,
)
elif isinstance(element, fatrec.Trapezoid):
@ -440,11 +440,11 @@ def read(
else:
string = str_or_ref.string
pat.label(
layer=element.get_layer_tuple(),
offset=element.get_xy(),
repetition=repetition,
annotations=annotations,
string=string,
layer = element.get_layer_tuple(),
offset = element.get_xy(),
repetition = repetition,
annotations = annotations,
string = string,
)
else:
@ -549,33 +549,35 @@ def _shapes_to_elements(
offset = rint_cast(shape.offset + rep_offset)
radius = rint_cast(shape.radius)
circle = fatrec.Circle(
layer=layer,
datatype=datatype,
radius=cast('int', radius),
x=offset[0],
y=offset[1],
properties=properties,
repetition=repetition,
layer = layer,
datatype = datatype,
radius = cast('int', radius),
x = offset[0],
y = offset[1],
properties = properties,
repetition = repetition,
)
elements.append(circle)
elif isinstance(shape, Path):
xy = rint_cast(shape.offset + shape.vertices[0] + rep_offset)
deltas = rint_cast(numpy.diff(shape.vertices, axis=0))
half_width = rint_cast(shape.width / 2)
path_type = next(k for k, v in path_cap_map.items() if v == shape.cap) # reverse lookup
path_type = next((k for k, v in path_cap_map.items() if v == shape.cap), None) # reverse lookup
if path_type is None:
raise PatternError(f'OASIS writer does not support path cap {shape.cap}')
extension_start = (path_type, shape.cap_extensions[0] if shape.cap_extensions is not None else None)
extension_end = (path_type, shape.cap_extensions[1] if shape.cap_extensions is not None else None)
path = fatrec.Path(
layer=layer,
datatype=datatype,
point_list=cast('Sequence[Sequence[int]]', deltas),
half_width=cast('int', half_width),
x=xy[0],
y=xy[1],
extension_start=extension_start, # TODO implement multiple cap types?
extension_end=extension_end,
properties=properties,
repetition=repetition,
layer = layer,
datatype = datatype,
point_list = cast('Sequence[Sequence[int]]', deltas),
half_width = cast('int', half_width),
x = xy[0],
y = xy[1],
extension_start = extension_start, # TODO implement multiple cap types?
extension_end = extension_end,
properties = properties,
repetition = repetition,
)
elements.append(path)
else:
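The change above replaces a bare `next(...)` (which raises an opaque `StopIteration` on an unmapped cap) with a defaulted lookup plus an explicit error. The idiom in isolation, with a made-up stand-in for `path_cap_map`:

```python
# Hypothetical stand-in for path_cap_map: {oasis_path_type: cap_style}.
path_cap_map = {0: 'flush', 1: 'square', 2: 'circle'}

def cap_to_path_type(cap):
    # Reverse lookup; next() with a None default lets us raise a
    # meaningful error instead of leaking StopIteration to the caller.
    path_type = next((k for k, v in path_cap_map.items() if v == cap), None)
    if path_type is None:
        raise ValueError(f'unsupported path cap {cap!r}')
    return path_type
```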
@ -583,13 +585,13 @@ def _shapes_to_elements(
xy = rint_cast(polygon.offset + polygon.vertices[0] + rep_offset)
points = rint_cast(numpy.diff(polygon.vertices, axis=0))
elements.append(fatrec.Polygon(
layer=layer,
datatype=datatype,
x=xy[0],
y=xy[1],
point_list=cast('list[list[int]]', points),
properties=properties,
repetition=repetition,
layer = layer,
datatype = datatype,
x = xy[0],
y = xy[1],
point_list = cast('list[list[int]]', points),
properties = properties,
repetition = repetition,
))
return elements
@ -606,13 +608,13 @@ def _labels_to_texts(
xy = rint_cast(label.offset + rep_offset)
properties = annotations_to_properties(label.annotations)
texts.append(fatrec.Text(
layer=layer,
datatype=datatype,
x=xy[0],
y=xy[1],
string=label.string,
properties=properties,
repetition=repetition,
layer = layer,
datatype = datatype,
x = xy[0],
y = xy[1],
string = label.string,
properties = properties,
repetition = repetition,
))
return texts
@ -622,10 +624,12 @@ def repetition_fata2masq(
) -> Repetition | None:
mrep: Repetition | None
if isinstance(rep, fatamorgana.GridRepetition):
mrep = Grid(a_vector=rep.a_vector,
b_vector=rep.b_vector,
a_count=rep.a_count,
b_count=rep.b_count)
mrep = Grid(
a_vector = rep.a_vector,
b_vector = rep.b_vector,
a_count = rep.a_count,
b_count = rep.b_count,
)
elif isinstance(rep, fatamorgana.ArbitraryRepetition):
displacements = numpy.cumsum(numpy.column_stack((
rep.x_displacements,
@ -647,21 +651,26 @@ def repetition_masq2fata(
frep: fatamorgana.GridRepetition | fatamorgana.ArbitraryRepetition | None
if isinstance(rep, Grid):
a_vector = rint_cast(rep.a_vector)
b_vector = rint_cast(rep.b_vector) if rep.b_vector is not None else None
a_count = rint_cast(rep.a_count)
b_count = rint_cast(rep.b_count) if rep.b_count is not None else None
a_count = int(rep.a_count)
if rep.b_count > 1:
b_vector = rint_cast(rep.b_vector)
b_count = int(rep.b_count)
else:
b_vector = None
b_count = None
frep = fatamorgana.GridRepetition(
a_vector=cast('list[int]', a_vector),
b_vector=cast('list[int] | None', b_vector),
a_count=cast('int', a_count),
b_count=cast('int | None', b_count),
a_vector = a_vector,
b_vector = b_vector,
a_count = a_count,
b_count = b_count,
)
offset = (0, 0)
elif isinstance(rep, Arbitrary):
diffs = numpy.diff(rep.displacements, axis=0)
diff_ints = rint_cast(diffs)
frep = fatamorgana.ArbitraryRepetition(diff_ints[:, 0], diff_ints[:, 1]) # type: ignore
offset = rep.displacements[0, :]
offset = tuple(rep.displacements[0, :])
else:
assert rep is None
frep = None
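`repetition_masq2fata` stores an `Arbitrary` repetition as a first-point offset plus per-step deltas (`numpy.diff`), and `repetition_fata2masq` reconstructs it with a running sum (`numpy.cumsum`). A plain-Python sketch checking that the two directions are inverses:

```python
from itertools import accumulate

displacements = [(2, 1), (5, 2), (6, 6)]

# Write side: keep the first point as an offset and store only per-step
# deltas, as fatamorgana's ArbitraryRepetition does.
offset = displacements[0]
deltas = [(x1 - x0, y1 - y0)
          for (x0, y0), (x1, y1) in zip(displacements, displacements[1:])]

def add(p, d):
    return (p[0] + d[0], p[1] + d[1])

# Read side: running sum over (0, 0) plus the deltas, shifted by the offset.
rebuilt = [add(offset, p) for p in accumulate([(0, 0)] + deltas, add)]
```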
@ -671,6 +680,8 @@ def repetition_masq2fata(
def annotations_to_properties(annotations: annotations_t) -> list[fatrec.Property]:
#TODO determine is_standard based on key?
if annotations is None:
return []
properties = []
for key, values in annotations.items():
vals = [AString(v) if isinstance(v, str) else v
@ -705,13 +716,9 @@ def properties_to_annotations(
string = repr(value)
logger.warning(f'Converting property value for key ({key}) to string ({string})')
values.append(string)
annotations[key] = values
annotations.setdefault(key, []).extend(values)
return annotations
properties = [fatrec.Property(key, vals, is_standard=False)
for key, vals in annotations.items()]
return properties
def check_valid_names(
names: Iterable[str],
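The switch from `annotations[key] = values` to `annotations.setdefault(key, []).extend(values)` matters when a property key repeats: the old code silently kept only the last occurrence. The difference in isolation:

```python
# Properties as (key, values) pairs; the same key may legitimately repeat.
properties = [('owner', ['alice']), ('note', ['a']), ('owner', ['bob'])]

last_wins = {}
merged = {}
for key, values in properties:
    last_wins[key] = values                    # old behaviour: drops 'alice'
    merged.setdefault(key, []).extend(values)  # new behaviour: keeps both
```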

View file

@ -2,7 +2,7 @@
SVG file format readers and writers
"""
from collections.abc import Mapping
import warnings
import logging
import numpy
from numpy.typing import ArrayLike
@ -10,6 +10,43 @@ import svgwrite # type: ignore
from .utils import mangle_name
from .. import Pattern
from ..utils import rotation_matrix_2d
logger = logging.getLogger(__name__)
def _ref_to_svg_transform(ref) -> str:
linear = rotation_matrix_2d(ref.rotation) * ref.scale
if ref.mirrored:
linear = linear @ numpy.diag((1.0, -1.0))
a = linear[0, 0]
b = linear[1, 0]
c = linear[0, 1]
d = linear[1, 1]
e = ref.offset[0]
f = ref.offset[1]
return f'matrix({a:g} {b:g} {c:g} {d:g} {e:g} {f:g})'
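SVG's `matrix(a b c d e f)` maps a point as `(x, y) -> (a*x + c*y + e, b*x + d*y + f)`, i.e. `a, b` are the image of (1, 0) and `c, d` the image of (0, 1). A svgwrite-free sketch of the same construction as `_ref_to_svg_transform` (rotate, scale, optional mirror-across-x), checked by applying the map directly:

```python
import math

def ref_matrix(rotation, scale, mirrored, offset):
    # linear = R(rotation) * scale; mirroring negates the second column
    # (the y input), matching linear @ diag(1, -1).
    cos, sin = math.cos(rotation), math.sin(rotation)
    a, b = cos * scale, sin * scale    # image of (1, 0)
    c, d = -sin * scale, cos * scale   # image of (0, 1)
    if mirrored:
        c, d = -c, -d
    e, f = offset
    return (a, b, c, d, e, f)

a, b, c, d, e, f = ref_matrix(math.pi / 2, 2.0, False, (10.0, 0.0))
x, y = 1.0, 0.0
mapped = (a * x + c * y + e, b * x + d * y + f)  # rotate 90°, scale 2, shift x+10
```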
def _make_svg_ids(names: Mapping[str, Pattern]) -> dict[str, str]:
svg_ids: dict[str, str] = {}
seen_ids: set[str] = set()
for name in names:
base_id = mangle_name(name)
svg_id = base_id
suffix = 1
while svg_id in seen_ids:
suffix += 1
svg_id = f'{base_id}_{suffix}'
seen_ids.add(svg_id)
svg_ids[name] = svg_id
return svg_ids
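`_make_svg_ids` guards against two distinct pattern names mangling to the same SVG `id`. The dedup loop with a hypothetical collision-prone mangler:

```python
def make_unique_ids(names, mangle):
    svg_ids = {}
    seen = set()
    for name in names:
        base = mangle(name)
        candidate, suffix = base, 1
        while candidate in seen:
            # Collision: append _2, _3, ... until the id is unused.
            suffix += 1
            candidate = f'{base}_{suffix}'
        seen.add(candidate)
        svg_ids[name] = candidate
    return svg_ids

# Hypothetical mangler that maps both '!' and '?' to '_', forcing a collision.
ids = make_unique_ids(['top!', 'top?'], lambda s: s.replace('!', '_').replace('?', '_'))
```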
def _detached_library(library: Mapping[str, Pattern]) -> dict[str, Pattern]:
return {name: pat.deepcopy() for name, pat in library.items()}
def writefile(
@ -17,15 +54,15 @@ def writefile(
top: str,
filename: str,
custom_attributes: bool = False,
annotate_ports: bool = False,
) -> None:
"""
Write a Pattern to an SVG file, by first calling .polygonize() on it
Write a Pattern to an SVG file, by first calling .polygonize() on a detached
materialized copy
to change the shapes into polygons, and then writing patterns as SVG
groups (<g>, inside <defs>), polygons as paths (<path>), and refs
as <use> elements.
Note that this function modifies the Pattern.
If `custom_attributes` is `True`, a non-standard `pattern_layer` attribute
is written to the relevant elements.
@ -37,20 +74,24 @@ def writefile(
prior to calling this function.
Args:
pattern: Pattern to write to file. Modified by this function.
library: Mapping of pattern names to patterns.
top: Name of the top-level pattern to render.
filename: Filename to write to.
custom_attributes: Whether to write non-standard `pattern_layer` attribute to the
SVG elements.
annotate_ports: If True, draw an arrow for each port (similar to
`Pattern.visualize(..., ports=True)`).
"""
pattern = library[top]
detached = _detached_library(library)
pattern = detached[top]
# Polygonize pattern
pattern.polygonize()
bounds = pattern.get_bounds(library=library)
bounds = pattern.get_bounds(library=detached)
if bounds is None:
bounds_min, bounds_max = numpy.array([[-1, -1], [1, 1]])
warnings.warn('Pattern had no bounds (empty?); setting arbitrary viewbox', stacklevel=1)
logger.warning('Pattern had no bounds (empty?); setting arbitrary viewbox', stacklevel=1)
else:
bounds_min, bounds_max = bounds
@ -60,10 +101,11 @@ def writefile(
# Create file
svg = svgwrite.Drawing(filename, profile='full', viewBox=viewbox_string,
debug=(not custom_attributes))
svg_ids = _make_svg_ids(detached)
# Now create a group for each pattern and add in any Boundary and Use elements
for name, pat in library.items():
svg_group = svg.g(id=mangle_name(name), fill='blue', stroke='red')
for name, pat in detached.items():
svg_group = svg.g(id=svg_ids[name], fill='blue', stroke='red')
for layer, shapes in pat.shapes.items():
for shape in shapes:
@ -76,16 +118,37 @@ def writefile(
svg_group.add(path)
if annotate_ports:
# Draw arrows for the ports, pointing into the device (per port definition)
for port_name, port in pat.ports.items():
if port.rotation is not None:
p1 = port.offset
angle = port.rotation
size = 1.0 # arrow size
p2 = p1 + size * numpy.array([numpy.cos(angle), numpy.sin(angle)])
# head
head_angle = 0.5
h1 = p1 + 0.7 * size * numpy.array([numpy.cos(angle + head_angle), numpy.sin(angle + head_angle)])
h2 = p1 + 0.7 * size * numpy.array([numpy.cos(angle - head_angle), numpy.sin(angle - head_angle)])
line = svg.line(start=p1, end=p2, stroke='green', stroke_width=0.2)
head = svg.polyline(points=[h1, p1, h2], fill='none', stroke='green', stroke_width=0.2)
svg_group.add(line)
svg_group.add(head)
svg_group.add(svg.text(port_name, insert=p2, font_size=0.5, fill='green'))
for target, refs in pat.refs.items():
if target is None:
continue
for ref in refs:
transform = f'scale({ref.scale:g}) rotate({ref.rotation:g}) translate({ref.offset[0]:g},{ref.offset[1]:g})'
use = svg.use(href='#' + mangle_name(target), transform=transform)
transform = _ref_to_svg_transform(ref)
use = svg.use(href='#' + svg_ids[target], transform=transform)
svg_group.add(use)
svg.defs.add(svg_group)
svg.add(svg.use(href='#' + mangle_name(top)))
svg.add(svg.use(href='#' + svg_ids[top]))
svg.save()
@ -100,24 +163,24 @@ def writefile_inverted(
box and drawing the polygons with reverse vertex order inside it, all within
one `<path>` element.
Note that this function modifies the Pattern.
If you want the pattern polygonized with non-default arguments, call `pattern.polygonize()`
prior to calling this function.
Args:
pattern: Pattern to write to file. Modified by this function.
library: Mapping of pattern names to patterns.
top: Name of the top-level pattern to render.
filename: Filename to write to.
"""
pattern = library[top]
detached = _detached_library(library)
pattern = detached[top]
# Polygonize and flatten pattern
pattern.polygonize().flatten(library)
pattern.polygonize().flatten(detached)
bounds = pattern.get_bounds(library=library)
bounds = pattern.get_bounds(library=detached)
if bounds is None:
bounds_min, bounds_max = numpy.array([[-1, -1], [1, 1]])
warnings.warn('Pattern had no bounds (empty?); setting arbitrary viewbox', stacklevel=1)
logger.warning('Pattern had no bounds (empty?); setting arbitrary viewbox', stacklevel=1)
else:
bounds_min, bounds_max = bounds

View file

@ -33,6 +33,12 @@ def preflight(
Run a standard set of useful operations and checks, usually done immediately prior
to writing to a file (or immediately after reading).
Note that this helper is not copy-isolating. When `sort=True`, it constructs a new
`Library` wrapper around the same `Pattern` objects after sorting them in place, so
later mutating preflight steps such as `prune_empty_patterns` and
`wrap_repeated_shapes` may still mutate caller-owned patterns. Callers that need
isolation should deep-copy the library before calling `preflight()`.
Args:
sort: Whether to sort the patterns based on their names, and optionally sort the pattern contents.
Default True. Useful for reproducible builds.
@ -75,7 +81,8 @@ def preflight(
raise PatternError('Non-numeric layers found:' + pformat(named_layers))
if prune_empty_patterns:
pruned = lib.prune_empty()
prune_dangling = 'error' if allow_dangling_refs is False else 'ignore'
pruned = lib.prune_empty(dangling=prune_dangling)
if pruned:
logger.info(f'Preflight pruned {len(pruned)} empty patterns')
logger.debug('Pruned: ' + pformat(pruned))
@ -144,7 +151,11 @@ def tmpfile(path: str | pathlib.Path) -> Iterator[IO[bytes]]:
path = pathlib.Path(path)
suffixes = ''.join(path.suffixes)
with tempfile.NamedTemporaryFile(suffix=suffixes, delete=False) as tmp_stream:
yield tmp_stream
try:
yield tmp_stream
except Exception:
pathlib.Path(tmp_stream.name).unlink(missing_ok=True)
raise
try:
shutil.move(tmp_stream.name, path)
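The change makes `tmpfile` clean up its temporary file when the body raises, instead of leaking it; only a completed write gets moved into place. A self-contained sketch of the same write-to-temp-then-move pattern (simplified relative to the real helper):

```python
import pathlib
import shutil
import tempfile
from contextlib import contextmanager

@contextmanager
def atomic_write(path):
    path = pathlib.Path(path)
    suffixes = ''.join(path.suffixes)
    with tempfile.NamedTemporaryFile(suffix=suffixes, delete=False) as tmp_stream:
        try:
            yield tmp_stream
        except Exception:
            # Failed mid-write: remove the temp file rather than leaking it.
            pathlib.Path(tmp_stream.name).unlink(missing_ok=True)
            raise
    # Success: move the finished file into place.
    shutil.move(tmp_stream.name, path)

with tempfile.TemporaryDirectory() as tmp:
    target = pathlib.Path(tmp) / 'out.gds.gz'
    with atomic_write(target) as stream:
        stream.write(b'data')
    contents = target.read_bytes()
```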

View file

@ -7,12 +7,12 @@ from numpy.typing import ArrayLike, NDArray
from .repetition import Repetition
from .utils import rotation_matrix_2d, annotations_t, annotations_eq, annotations_lt, rep2key
from .traits import PositionableImpl, Copyable, Pivotable, RepeatableImpl, Bounded
from .traits import PositionableImpl, Copyable, Pivotable, RepeatableImpl, Bounded, Flippable
from .traits import AnnotatableImpl
@functools.total_ordering
class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotable, Copyable):
class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotable, Copyable, Flippable):
"""
A text annotation with a position (but no size; it is not drawn)
"""
@ -53,17 +53,36 @@ class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotabl
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
@classmethod
def _from_raw(
cls,
string: str,
*,
offset: NDArray[numpy.float64],
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
) -> Self:
new = cls.__new__(cls)
new._string = string
new._offset = offset
new._repetition = repetition
new._annotations = annotations
return new
def __copy__(self) -> Self:
return type(self)(
string=self.string,
offset=self.offset.copy(),
repetition=self.repetition,
annotations=copy.copy(self.annotations),
)
def __deepcopy__(self, memo: dict | None = None) -> Self:
memo = {} if memo is None else memo
new = copy.copy(self)
new._offset = self._offset.copy()
new._repetition = copy.deepcopy(self._repetition, memo)
new._annotations = copy.deepcopy(self._annotations, memo)
return new
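`_from_raw` skips `__init__` (and its validation/copying) by allocating with `cls.__new__` and assigning the private fields directly, which is what makes it cheap on hot read paths. The idiom on a toy class (names here are illustrative, not masque's):

```python
class Point:
    def __init__(self, x, y):
        # Normal path: coerce/validate inputs.
        self._x = float(x)
        self._y = float(y)

    @classmethod
    def _from_raw(cls, x, y):
        # Fast path: the caller guarantees the invariants, so skip __init__.
        new = cls.__new__(cls)
        new._x = x
        new._y = y
        return new

p = Point._from_raw(1.0, 2.0)
```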
def __lt__(self, other: 'Label') -> bool:
@ -76,6 +95,8 @@ class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotabl
return annotations_lt(self.annotations, other.annotations)
def __eq__(self, other: Any) -> bool:
if type(self) is not type(other):
return False
return (
self.string == other.string
and numpy.array_equal(self.offset, other.offset)
@ -96,10 +117,34 @@ class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotabl
"""
pivot = numpy.asarray(pivot, dtype=float)
self.translate(-pivot)
if self.repetition is not None:
self.repetition.rotate(rotation)
self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset)
self.translate(+pivot)
return self
def flip_across(self, axis: int | None = None, *, x: float | None = None, y: float | None = None) -> Self:
"""
Extrinsic transformation: Flip the label across a line in the pattern's
coordinate system. This affects both the label's offset and its
repetition grid.
Args:
axis: Axis to mirror across. 0: x-axis (flip y), 1: y-axis (flip x).
x: Vertical line x=val to mirror across.
y: Horizontal line y=val to mirror across.
Returns:
self
"""
axis, pivot = self._check_flip_args(axis=axis, x=x, y=y)
self.translate(-pivot)
if self.repetition is not None:
self.repetition.mirror(axis)
self.offset[1 - axis] *= -1
self.translate(+pivot)
return self
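`flip_across` decomposes a mirror across the line `x=val` or `y=val` into translate(-pivot), negate the coordinate perpendicular to the mirror axis, translate(+pivot). A numeric sketch of the same steps on a bare point:

```python
def flip_point(point, axis, val=0.0):
    # axis=0: mirror across the horizontal line y=val (y is negated);
    # axis=1: mirror across the vertical line x=val (x is negated).
    pivot = (0.0, val) if axis == 0 else (val, 0.0)
    p = [point[0] - pivot[0], point[1] - pivot[1]]
    p[1 - axis] *= -1  # same index arithmetic as Label.flip_across
    return (p[0] + pivot[0], p[1] + pivot[1])

flipped = flip_point((1.0, 2.0), axis=1, val=3.0)  # mirror across x=3
```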
def get_bounds_single(self) -> NDArray[numpy.float64]:
"""
Return the bounds of the label.

View file

@ -22,7 +22,7 @@ import copy
from pprint import pformat
from collections import defaultdict
from abc import ABCMeta, abstractmethod
from graphlib import TopologicalSorter
from graphlib import TopologicalSorter, CycleError
import numpy
from numpy.typing import ArrayLike, NDArray
@ -59,6 +59,9 @@ TreeView: TypeAlias = Mapping[str, 'Pattern']
Tree: TypeAlias = MutableMapping[str, 'Pattern']
""" A mutable name-to-`Pattern` mapping which is expected to have only one top-level cell """
dangling_mode_t: TypeAlias = Literal['error', 'ignore', 'include']
""" How helpers should handle refs whose targets are not present in the library. """
SINGLE_USE_PREFIX = '_'
"""
@ -141,7 +144,6 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
Args:
tops: Name(s) of the pattern(s) to check.
Default is all patterns in the library.
skip: Memo, set patterns which have already been traversed.
Returns:
Set of all referenced pattern names
@ -178,6 +180,8 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
if isinstance(tops, str):
tops = (tops,)
tops = set(tops)
skip |= tops # don't re-visit tops
# Get referenced patterns for all tops
targets = set()
@ -187,9 +191,9 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
# Perform recursive lookups, but only once for each name
for target in targets - skip:
assert target is not None
skip.add(target)
if target in self:
targets |= self.referenced_patterns(target, skip=skip)
skip.add(target)
return targets
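The fix above seeds `skip` with the tops (so they aren't re-visited) and marks each target as visited *before* recursing (so cycles terminate). The traversal shape on a plain name-to-children mapping, as a sketch:

```python
def referenced(graph, tops, skip=None):
    # Collect every name reachable from `tops`, visiting each node once.
    if skip is None:
        skip = set()
    skip |= tops  # don't re-visit the starting points themselves
    targets = set()
    for top in tops:
        targets |= graph.get(top, set())
    for target in sorted(targets - skip):
        skip.add(target)  # mark before recursing so cycles terminate
        if target in graph:
            targets |= referenced(graph, {target}, skip=skip)
    return targets
```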
@ -264,6 +268,7 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
self,
tops: str | Sequence[str],
flatten_ports: bool = False,
dangling_ok: bool = False,
) -> dict[str, 'Pattern']:
"""
Returns copies of all `tops` patterns with all refs
@ -273,9 +278,12 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
For an in-place variant, see `Pattern.flatten`.
Args:
tops: The pattern(s) to flattern.
tops: The pattern(s) to flatten.
flatten_ports: If `True`, keep ports from any referenced
patterns; otherwise discard them.
dangling_ok: If `True`, no error will be thrown if any
ref points to a name which is not present in the library.
Default False.
Returns:
{name: flat_pattern} mapping for all flattened patterns.
@ -288,26 +296,37 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
def flatten_single(name: str) -> None:
flattened[name] = None
pat = self[name].deepcopy()
refs_by_target = tuple((target, tuple(refs)) for target, refs in pat.refs.items())
for target in pat.refs:
for target, refs in refs_by_target:
if target is None:
continue
if dangling_ok and target not in self:
continue
if target not in flattened:
flatten_single(target)
target_pat = flattened[target]
if target_pat is None:
raise PatternError(f'Circular reference in {name} to {target}')
if target_pat.is_empty(): # avoid some extra allocations
ports_only = flatten_ports and bool(target_pat.ports)
if target_pat.is_empty() and not ports_only: # avoid some extra allocations
continue
for ref in pat.refs[target]:
for ref in refs:
if flatten_ports and ref.repetition is not None and target_pat.ports:
raise PatternError(
f'Cannot flatten ports from repeated ref to {target!r}; '
'flatten with flatten_ports=False or expand/rename the ports manually first.'
)
p = ref.as_pattern(pattern=target_pat)
if not flatten_ports:
p.ports.clear()
pat.append(p)
pat.refs.clear()
for target in set(pat.refs.keys()) & set(self.keys()):
del pat.refs[target]
flattened[name] = pat
for top in tops:
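`flatten_single` stores `None` in `flattened` as an "in progress" sentinel before recursing; encountering that sentinel for a needed target means the recursion re-entered itself, i.e. a circular reference. The sentinel idiom on a bare graph (a sketch, not the masque code):

```python
def flatten_order(graph, top):
    state = {}   # False: in progress (sentinel), True: finished
    order = []

    def visit(name):
        state[name] = False  # sentinel: currently being flattened
        for child in sorted(graph.get(name, set())):
            if child not in state:
                visit(child)
            elif state[child] is False:
                # Re-entered a pattern that is still being flattened.
                raise ValueError(f'Circular reference in {name} to {child}')
        state[name] = True
        order.append(name)

    visit(top)
    return order  # children before parents
```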
@ -405,6 +424,21 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
"""
return self[self.top()]
@staticmethod
def _dangling_refs_error(dangling: set[str], context: str) -> LibraryError:
dangling_list = sorted(dangling)
return LibraryError(f'Dangling refs found while {context}: ' + pformat(dangling_list))
def _raw_child_graph(self) -> tuple[dict[str, set[str]], set[str]]:
existing = set(self.keys())
graph: dict[str, set[str]] = {}
dangling: set[str] = set()
for name, pat in self.items():
children = {child for child, refs in pat.refs.items() if child is not None and refs}
graph[name] = children
dangling |= children - existing
return graph, dangling
def dfs(
self,
pattern: 'Pattern',
@ -459,9 +493,11 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
memo = {}
if transform is None or transform is True:
transform = numpy.zeros(4)
transform = numpy.array([0, 0, 0, 0, 1], dtype=float)
elif transform is not False:
transform = numpy.asarray(transform, dtype=float)
if transform.size == 4:
transform = numpy.append(transform, 1.0)
original_pattern = pattern
@ -504,50 +540,99 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
raise LibraryError('visit_* functions returned a new `Pattern` object'
' but no top-level name was provided in `hierarchy`')
del cast('ILibrary', self)[name]
cast('ILibrary', self)[name] = pattern
return self
def child_graph(self) -> dict[str, set[str | None]]:
def child_graph(
self,
dangling: dangling_mode_t = 'error',
) -> dict[str, set[str]]:
"""
Return a mapping from pattern name to a set of all child patterns
(patterns it references).
Only non-empty ref lists with non-`None` targets are treated as graph edges.
Args:
dangling: How refs to missing targets are handled. `'error'` raises,
`'ignore'` drops those edges, and `'include'` exposes them as
synthetic leaf nodes.
Returns:
Mapping from pattern name to a set of all pattern names it references.
"""
graph = {name: set(pat.refs.keys()) for name, pat in self.items()}
graph, dangling_refs = self._raw_child_graph()
if dangling == 'error':
if dangling_refs:
raise self._dangling_refs_error(dangling_refs, 'building child graph')
return graph
if dangling == 'ignore':
existing = set(graph)
return {name: {child for child in children if child in existing} for name, children in graph.items()}
for target in dangling_refs:
graph.setdefault(target, set())
return graph
def parent_graph(self) -> dict[str, set[str]]:
def parent_graph(
self,
dangling: dangling_mode_t = 'error',
) -> dict[str, set[str]]:
"""
Return a mapping from pattern name to a set of all parent patterns
(patterns which reference it).
Args:
dangling: How refs to missing targets are handled. `'error'` raises,
`'ignore'` drops those targets, and `'include'` adds them as
synthetic keys whose values are their existing parents.
Returns:
Mapping from pattern name to a set of all patterns which reference it.
"""
igraph: dict[str, set[str]] = {name: set() for name in self}
for name, pat in self.items():
for child, reflist in pat.refs.items():
if reflist and child is not None:
igraph[child].add(name)
child_graph, dangling_refs = self._raw_child_graph()
if dangling == 'error' and dangling_refs:
raise self._dangling_refs_error(dangling_refs, 'building parent graph')
existing = set(child_graph)
igraph: dict[str, set[str]] = {name: set() for name in existing}
for parent, children in child_graph.items():
for child in children:
if child in existing:
igraph[child].add(parent)
elif dangling == 'include':
igraph.setdefault(child, set()).add(parent)
return igraph
def child_order(self) -> list[str]:
def child_order(
self,
dangling: dangling_mode_t = 'error',
) -> list[str]:
"""
Return a topologically sorted list of all contained pattern names.
Return a topologically sorted list of graph node names.
Child (referenced) patterns will appear before their parents.
Args:
dangling: Passed to `child_graph()`.
Returns:
Topologically sorted list of pattern names.
"""
return cast('list[str]', list(TopologicalSorter(self.child_graph()).static_order()))
try:
return cast('list[str]', list(TopologicalSorter(self.child_graph(dangling=dangling)).static_order()))
except CycleError as exc:
cycle = exc.args[1] if len(exc.args) > 1 else None
if cycle is None:
raise LibraryError('Cycle found while building child order') from exc
raise LibraryError(f'Cycle found while building child order: {cycle}') from exc
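`graphlib.TopologicalSorter` treats each mapping value as that node's predecessors, so passing the child graph directly yields children before their parents; on a cycle, `static_order()` raises `CycleError` whose `args[1]` carries the offending node list. A standalone sketch of the same try/except:

```python
from graphlib import TopologicalSorter, CycleError

def child_order(graph):
    # graph[node] lists node's children, which must sort before it.
    try:
        return list(TopologicalSorter(graph).static_order())
    except CycleError as exc:
        cycle = exc.args[1] if len(exc.args) > 1 else None
        raise ValueError(f'Cycle found while building child order: {cycle}') from exc

order = child_order({'top': {'mid'}, 'mid': {'leaf'}, 'leaf': set()})
```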
def find_refs_local(
self,
name: str,
parent_graph: dict[str, set[str]] | None = None,
dangling: dangling_mode_t = 'error',
) -> dict[str, list[NDArray[numpy.float64]]]:
"""
Find the location and orientation of all refs pointing to `name`.
@ -560,6 +645,8 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
The provided graph may be for a superset of `self` (i.e. it may
contain additional patterns which are not present in self; they
will be ignored).
dangling: How refs to missing targets are handled if `parent_graph`
is not provided. `'include'` also allows querying missing names.
Returns:
Mapping of {parent_name: transform_list}, where transform_list
@ -568,8 +655,18 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
"""
instances = defaultdict(list)
if parent_graph is None:
parent_graph = self.parent_graph()
for parent in parent_graph[name]:
graph_mode = 'ignore' if dangling == 'ignore' else 'include'
parent_graph = self.parent_graph(dangling=graph_mode)
if name not in self:
if name not in parent_graph:
return instances
if dangling == 'error':
raise self._dangling_refs_error({name}, f'finding local refs for {name!r}')
if dangling == 'ignore':
return instances
for parent in parent_graph.get(name, set()):
if parent not in self: # parent_graph may be for a superset of self
continue
for ref in self[parent].refs[name]:
@ -582,6 +679,7 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
name: str,
order: list[str] | None = None,
parent_graph: dict[str, set[str]] | None = None,
dangling: dangling_mode_t = 'error',
) -> dict[tuple[str, ...], NDArray[numpy.float64]]:
"""
Find the absolute (top-level) location and orientation of all refs (including
@ -598,18 +696,28 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
The provided graph may be for a superset of `self` (i.e. it may
contain additional patterns which are not present in self; they
will be ignored).
dangling: How refs to missing targets are handled if `order` or
`parent_graph` are not provided. `'include'` also allows
querying missing names.
Returns:
Mapping of `{hierarchy: transform_list}`, where `hierarchy` is a tuple of the form
`(toplevel_pattern, lvl1_pattern, ..., name)` and `transform_list` is an Nx4 ndarray
with rows `(x_offset, y_offset, rotation_ccw_rad, mirror_across_x)`.
"""
if name not in self:
return {}
graph_mode = 'ignore' if dangling == 'ignore' else 'include'
if order is None:
order = self.child_order()
order = self.child_order(dangling=graph_mode)
if parent_graph is None:
parent_graph = self.parent_graph()
parent_graph = self.parent_graph(dangling=graph_mode)
if name not in self:
if name not in parent_graph:
return {}
if dangling == 'error':
raise self._dangling_refs_error({name}, f'finding global refs for {name!r}')
if dangling == 'ignore':
return {}
self_keys = set(self.keys())
@ -618,16 +726,16 @@ class ILibraryView(Mapping[str, 'Pattern'], metaclass=ABCMeta):
NDArray[numpy.float64]
]]]
transforms = defaultdict(list)
for parent, vals in self.find_refs_local(name, parent_graph=parent_graph).items():
for parent, vals in self.find_refs_local(name, parent_graph=parent_graph, dangling=dangling).items():
transforms[parent] = [((name,), numpy.concatenate(vals))]
for next_name in order:
if next_name not in transforms:
continue
if not parent_graph[next_name] & self_keys:
if not parent_graph.get(next_name, set()) & self_keys:
continue
outers = self.find_refs_local(next_name, parent_graph=parent_graph)
outers = self.find_refs_local(next_name, parent_graph=parent_graph, dangling=dangling)
inners = transforms.pop(next_name)
for parent, outer in outers.items():
for path, inner in inners:
@ -675,6 +783,33 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
def _merge(self, key_self: str, other: Mapping[str, 'Pattern'], key_other: str) -> None:
pass
def resolve(
self,
other: 'Abstract | str | Pattern | TreeView',
append: bool = False,
) -> 'Abstract | Pattern':
"""
Resolve another device (name, Abstract, Pattern, or TreeView) into an Abstract or Pattern.
If it is a TreeView, it is first added into this library.
Args:
other: The device to resolve.
append: If True and `other` is an `Abstract`, returns the full `Pattern` from the library.
Returns:
An `Abstract` or `Pattern` object.
"""
from .pattern import Pattern #noqa: PLC0415
if not isinstance(other, (str, Abstract, Pattern)):
# We got a TreeView; add it into self and grab its topcell as an Abstract
other = self << other
if isinstance(other, str):
other = self.abstract(other)
if append and isinstance(other, Abstract):
other = self[other.name]
return other
def rename(
self,
old_name: str,
@ -693,6 +828,11 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
Returns:
self
"""
if old_name not in self:
raise LibraryError(f'"{old_name}" does not exist in the library.')
if old_name == new_name:
return self
self[new_name] = self[old_name]
del self[old_name]
if move_references:
@ -717,6 +857,9 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
Returns:
self
"""
if old_target == new_target:
return self
for pattern in self.values():
if old_target in pattern.refs:
pattern.refs[new_target].extend(pattern.refs[old_target])
@ -756,7 +899,7 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
Returns:
(name, pattern) tuple
"""
from .pattern import Pattern
from .pattern import Pattern #noqa: PLC0415
pat = Pattern()
self[name] = pat
return name, pat
@ -790,18 +933,23 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
(default).
Returns:
A mapping of `{old_name: new_name}` for all `old_name`s in `other`. Unchanged
names map to themselves.
A mapping of `{old_name: new_name}` for all names in `other` which were
renamed while being added. Unchanged names are omitted.
Raises:
`LibraryError` if a duplicate name is encountered even after applying `rename_theirs()`.
"""
from .pattern import map_targets
from .pattern import map_targets #noqa: PLC0415
duplicates = set(self.keys()) & set(other.keys())
if not duplicates:
for key in other:
self._merge(key, other, key)
if mutate_other:
temp = other
else:
temp = Library(copy.deepcopy(dict(other)))
for key in temp:
self._merge(key, temp, key)
return {}
if mutate_other:
@ -902,7 +1050,7 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
# This currently simplifies globally (same shape in different patterns is
# merged into the same ref target).
from .pattern import Pattern
from .pattern import Pattern #noqa: PLC0415
if exclude_types is None:
exclude_types = ()
@ -911,6 +1059,18 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
def label2name(label: tuple) -> str: # noqa: ARG001
return self.get_name(SINGLE_USE_PREFIX + 'shape')
used_names = set(self.keys())
def reserve_target_name(label: tuple) -> str:
base_name = label2name(label)
name = base_name
ii = sum(1 for nn in used_names if nn.startswith(base_name)) if base_name in used_names else 0
while name in used_names or name == '':
name = base_name + b64suffix(ii)
ii += 1
used_names.add(name)
return name
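The name-reservation logic above can be exercised in isolation. This is a minimal standalone sketch (not the masque implementation): `b64suffix` here is a hypothetical stand-in for the library's suffix helper, and the counter is seeded with the number of existing names sharing the prefix so repeated reservations don't rescan suffixes from zero.

```python
def b64suffix(ii: int) -> str:
    # Hypothetical stand-in for masque's suffix helper; the real encoding differs.
    return f'${ii}'

def reserve_target_name(base_name: str, used_names: set[str]) -> str:
    # Seed the counter past likely collisions when the base name is taken.
    name = base_name
    ii = sum(1 for nn in used_names if nn.startswith(base_name)) if base_name in used_names else 0
    while name in used_names or name == '':
        name = base_name + b64suffix(ii)
        ii += 1
    used_names.add(name)
    return name

used = {'shape', 'shape$0'}
print(reserve_target_name('shape', used))  # 'shape$2'
print(reserve_target_name('other', used))  # 'other'
```

Seeding from the prefix count makes reserving many similarly-named targets roughly linear instead of quadratic in the number of collisions.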
shape_counts: MutableMapping[tuple, int] = defaultdict(int)
shape_funcs = {}
@ -927,6 +1087,7 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
shape_counts[label] += 1
shape_pats = {}
target_names = {}
for label, count in shape_counts.items():
if count < threshold:
continue
@ -935,6 +1096,7 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
shape_pat = Pattern()
shape_pat.shapes[label[-1]] += [shape_func()]
shape_pats[label] = shape_pat
target_names[label] = reserve_target_name(label)
# ## Second pass ##
for pat in tuple(self.values()):
@ -959,14 +1121,14 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
# For repeated shapes, create a `Pattern` holding a normalized shape object,
# and add `pat.refs` entries for each occurrence in pat. Also, note down that
# we should delete the `pat.shapes` entries for which we made `Ref`s.
shapes_to_remove = []
for label, shape_entries in shape_table.items():
layer = label[-1]
target = label2name(label)
target = target_names[label]
shapes_to_remove = []
for ii, values in shape_entries:
offset, scale, rotation, mirror_x = values
pat.ref(target=target, offset=offset, scale=scale,
rotation=rotation, mirrored=(mirror_x, False))
rotation=rotation, mirrored=mirror_x)
shapes_to_remove.append(ii)
# Remove any shapes for which we have created refs.
@ -974,7 +1136,7 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
del pat.shapes[layer][ii]
for ll, pp in shape_pats.items():
self[label2name(ll)] = pp
self[target_names[ll]] = pp
return self
@ -995,7 +1157,7 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
Returns:
self
"""
from .pattern import Pattern
from .pattern import Pattern #noqa: PLC0415
if name_func is None:
def name_func(_pat: Pattern, _shape: Shape | Label) -> str:
@ -1029,6 +1191,25 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
return self
def resolve_repeated_refs(self, name: str | None = None) -> Self:
"""
Expand all repeated references into multiple individual references.
Alters the library in-place.
Args:
name: If specified, only resolve repeated refs in this pattern.
Otherwise, resolve in all patterns.
Returns:
self
"""
if name is not None:
self[name].resolve_repeated_refs()
else:
for pat in self.values():
pat.resolve_repeated_refs()
return self
def subtree(
self,
tops: str | Sequence[str],
@ -1058,17 +1239,19 @@ class ILibrary(ILibraryView, MutableMapping[str, 'Pattern'], metaclass=ABCMeta):
def prune_empty(
self,
repeat: bool = True,
dangling: dangling_mode_t = 'error',
) -> set[str]:
"""
Delete any empty patterns (i.e. where `Pattern.is_empty` returns `True`).
Args:
repeat: Also recursively delete any patterns which only contain(ed) empty patterns.
dangling: Passed to `parent_graph()`.
Returns:
A set containing the names of all deleted patterns
"""
parent_graph = self.parent_graph()
parent_graph = self.parent_graph(dangling=dangling)
empty = {name for name, pat in self.items() if pat.is_empty()}
trimmed = set()
while empty:
@ -1198,7 +1381,7 @@ class Library(ILibrary):
Returns:
The newly created `Library` and the newly created `Pattern`
"""
from .pattern import Pattern
from .pattern import Pattern #noqa: PLC0415
tree = cls()
pat = Pattern()
tree[name] = pat
@ -1214,12 +1397,12 @@ class LazyLibrary(ILibrary):
"""
mapping: dict[str, Callable[[], 'Pattern']]
cache: dict[str, 'Pattern']
_lookups_in_progress: set[str]
_lookups_in_progress: list[str]
def __init__(self) -> None:
self.mapping = {}
self.cache = {}
self._lookups_in_progress = set()
self._lookups_in_progress = []
def __setitem__(
self,
@ -1250,16 +1433,20 @@ class LazyLibrary(ILibrary):
return self.cache[key]
if key in self._lookups_in_progress:
chain = ' -> '.join(self._lookups_in_progress + [key])
raise LibraryError(
f'Detected multiple simultaneous lookups of "{key}".\n'
f'Detected circular reference or recursive lookup of "{key}".\n'
f'Lookup chain: {chain}\n'
'This may be caused by an invalid (cyclical) reference, or buggy code.\n'
'If you are lazy-loading a file, try a non-lazy load and check for reference cycles.' # TODO give advice on finding cycles
'If you are lazy-loading a file, try a non-lazy load and check for reference cycles.'
)
self._lookups_in_progress.add(key)
func = self.mapping[key]
pat = func()
self._lookups_in_progress.remove(key)
self._lookups_in_progress.append(key)
try:
func = self.mapping[key]
pat = func()
finally:
self._lookups_in_progress.pop()
self.cache[key] = pat
return pat
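The switch from a set to a list for `_lookups_in_progress`, plus the `try/finally`, is what lets the error report the full lookup chain and still unwind cleanly when a loader raises. A minimal sketch of the same guard, with arbitrary objects standing in for patterns:

```python
class LibraryError(Exception):
    pass

class LazyMap:
    """Sketch of the lazy-lookup cycle guard; loaders are zero-argument callables."""
    def __init__(self) -> None:
        self.mapping = {}
        self.cache = {}
        # A list (not a set) so the error message can show the chain in order.
        self._lookups_in_progress: list[str] = []

    def __getitem__(self, key: str):
        if key in self.cache:
            return self.cache[key]
        if key in self._lookups_in_progress:
            chain = ' -> '.join(self._lookups_in_progress + [key])
            raise LibraryError(f'Circular lookup of "{key}". Chain: {chain}')
        self._lookups_in_progress.append(key)
        try:
            pat = self.mapping[key]()
        finally:
            self._lookups_in_progress.pop()  # unwind even when the loader raises
        self.cache[key] = pat
        return pat

lib = LazyMap()
lib.mapping['a'] = lambda: ('A', lib['b'])
lib.mapping['b'] = lambda: ('B', lib['a'])  # cycle: b looks a back up
try:
    lib['a']
except LibraryError as err:
    print(err)  # Circular lookup of "a". Chain: a -> b -> a
```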
@ -1302,6 +1489,11 @@ class LazyLibrary(ILibrary):
Returns:
self
"""
if old_name not in self.mapping:
raise LibraryError(f'"{old_name}" does not exist in the library.')
if old_name == new_name:
return self
self[new_name] = self.mapping[old_name] # copy over function
if old_name in self.cache:
self.cache[new_name] = self.cache[old_name]
@ -1323,6 +1515,9 @@ class LazyLibrary(ILibrary):
Returns:
self
"""
if old_target == new_target:
return self
self.precache()
for pattern in self.cache.values():
if old_target in pattern.refs:

View file

@ -26,6 +26,7 @@ from .traits import AnnotatableImpl, Scalable, Mirrorable, Rotatable, Positionab
from .ports import Port, PortList
logger = logging.getLogger(__name__)
@ -37,8 +38,8 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
or provide equivalent functions.
`Pattern` also stores a dict of `Port`s, which can be used to "snap" together points.
See `Pattern.plug()` and `Pattern.place()`, as well as the helper classes
`builder.Builder`, `builder.Pather`, `builder.RenderPather`, and `ports.PortsList`.
See `Pattern.plug()` and `Pattern.place()`, as well as `builder.Pather`
and `ports.PortsList`.
For convenience, ports can be read out using square brackets:
- `pattern['A'] == Port((0, 0), 0)`
@ -171,7 +172,8 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
return s
def __copy__(self) -> 'Pattern':
logger.warning('Making a shallow copy of a Pattern... old shapes are re-referenced!')
logger.warning('Making a shallow copy of a Pattern... old shapes/refs/labels are re-referenced! '
'Consider using .deepcopy() if this was not intended.')
new = Pattern(
annotations=copy.deepcopy(self.annotations),
ports=copy.deepcopy(self.ports),
@ -198,7 +200,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
def __lt__(self, other: 'Pattern') -> bool:
self_nonempty_targets = [target for target, reflist in self.refs.items() if reflist]
other_nonempty_targets = [target for target, reflist in self.refs.items() if reflist]
other_nonempty_targets = [target for target, reflist in other.refs.items() if reflist]
self_tgtkeys = tuple(sorted((target is None, target) for target in self_nonempty_targets))
other_tgtkeys = tuple(sorted((target is None, target) for target in other_nonempty_targets))
@ -212,7 +214,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
return refs_ours < refs_theirs
self_nonempty_layers = [ll for ll, elems in self.shapes.items() if elems]
other_nonempty_layers = [ll for ll, elems in self.shapes.items() if elems]
other_nonempty_layers = [ll for ll, elems in other.shapes.items() if elems]
self_layerkeys = tuple(sorted(layer2key(ll) for ll in self_nonempty_layers))
other_layerkeys = tuple(sorted(layer2key(ll) for ll in other_nonempty_layers))
@ -221,21 +223,21 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
for _, _, layer in self_layerkeys:
shapes_ours = tuple(sorted(self.shapes[layer]))
shapes_theirs = tuple(sorted(self.shapes[layer]))
shapes_theirs = tuple(sorted(other.shapes[layer]))
if shapes_ours != shapes_theirs:
return shapes_ours < shapes_theirs
self_nonempty_txtlayers = [ll for ll, elems in self.labels.items() if elems]
other_nonempty_txtlayers = [ll for ll, elems in self.labels.items() if elems]
other_nonempty_txtlayers = [ll for ll, elems in other.labels.items() if elems]
self_txtlayerkeys = tuple(sorted(layer2key(ll) for ll in self_nonempty_txtlayers))
other_txtlayerkeys = tuple(sorted(layer2key(ll) for ll in other_nonempty_txtlayers))
if self_txtlayerkeys != other_txtlayerkeys:
return self_txtlayerkeys < other_txtlayerkeys
for _, _, layer in self_layerkeys:
for _, _, layer in self_txtlayerkeys:
labels_ours = tuple(sorted(self.labels[layer]))
labels_theirs = tuple(sorted(self.labels[layer]))
labels_theirs = tuple(sorted(other.labels[layer]))
if labels_ours != labels_theirs:
return labels_ours < labels_theirs
@ -252,7 +254,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
return False
self_nonempty_targets = [target for target, reflist in self.refs.items() if reflist]
other_nonempty_targets = [target for target, reflist in self.refs.items() if reflist]
other_nonempty_targets = [target for target, reflist in other.refs.items() if reflist]
self_tgtkeys = tuple(sorted((target is None, target) for target in self_nonempty_targets))
other_tgtkeys = tuple(sorted((target is None, target) for target in other_nonempty_targets))
@ -266,7 +268,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
return False
self_nonempty_layers = [ll for ll, elems in self.shapes.items() if elems]
other_nonempty_layers = [ll for ll, elems in self.shapes.items() if elems]
other_nonempty_layers = [ll for ll, elems in other.shapes.items() if elems]
self_layerkeys = tuple(sorted(layer2key(ll) for ll in self_nonempty_layers))
other_layerkeys = tuple(sorted(layer2key(ll) for ll in other_nonempty_layers))
@ -275,21 +277,21 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
for _, _, layer in self_layerkeys:
shapes_ours = tuple(sorted(self.shapes[layer]))
shapes_theirs = tuple(sorted(self.shapes[layer]))
shapes_theirs = tuple(sorted(other.shapes[layer]))
if shapes_ours != shapes_theirs:
return False
self_nonempty_txtlayers = [ll for ll, elems in self.labels.items() if elems]
other_nonempty_txtlayers = [ll for ll, elems in self.labels.items() if elems]
other_nonempty_txtlayers = [ll for ll, elems in other.labels.items() if elems]
self_txtlayerkeys = tuple(sorted(layer2key(ll) for ll in self_nonempty_txtlayers))
other_txtlayerkeys = tuple(sorted(layer2key(ll) for ll in other_nonempty_txtlayers))
if self_txtlayerkeys != other_txtlayerkeys:
return False
for _, _, layer in self_layerkeys:
for _, _, layer in self_txtlayerkeys:
labels_ours = tuple(sorted(self.labels[layer]))
labels_theirs = tuple(sorted(self.labels[layer]))
labels_theirs = tuple(sorted(other.labels[layer]))
if labels_ours != labels_theirs:
return False
@ -332,7 +334,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
))
self.ports = dict(sorted(self.ports.items()))
self.annotations = dict(sorted(self.annotations.items()))
self.annotations = dict(sorted(self.annotations.items())) if self.annotations is not None else None
return self
@ -347,6 +349,16 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
Returns:
self
"""
annotation_conflicts: set[str] = set()
if other_pattern.annotations is not None and self.annotations is not None:
annotation_conflicts = set(self.annotations.keys()) & set(other_pattern.annotations.keys())
if annotation_conflicts:
raise PatternError(f'Annotation keys overlap: {annotation_conflicts}')
port_conflicts = set(self.ports.keys()) & set(other_pattern.ports.keys())
if port_conflicts:
raise PatternError(f'Port names overlap: {port_conflicts}')
for target, rseq in other_pattern.refs.items():
self.refs[target].extend(rseq)
for layer, sseq in other_pattern.shapes.items():
@ -354,14 +366,10 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
for layer, lseq in other_pattern.labels.items():
self.labels[layer].extend(lseq)
annotation_conflicts = set(self.annotations.keys()) & set(other_pattern.annotations.keys())
if annotation_conflicts:
raise PatternError(f'Annotation keys overlap: {annotation_conflicts}')
self.annotations.update(other_pattern.annotations)
port_conflicts = set(self.ports.keys()) & set(other_pattern.ports.keys())
if port_conflicts:
raise PatternError(f'Port names overlap: {port_conflicts}')
if other_pattern.annotations is not None:
if self.annotations is None:
self.annotations = {}
self.annotations.update(other_pattern.annotations)
self.ports.update(other_pattern.ports)
return self
@ -415,7 +423,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
elif default_keep:
pat.refs = copy.copy(self.refs)
if annotations is not None:
if annotations is not None and self.annotations is not None:
pat.annotations = {k: v for k, v in self.annotations.items() if annotations(k, v)}
elif default_keep:
pat.annotations = copy.copy(self.annotations)
@ -496,6 +504,61 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
]
return polys
def layer_as_polygons(
self,
layer: layer_t,
flatten: bool = True,
library: Mapping[str, 'Pattern'] | None = None,
) -> list[Polygon]:
"""
Collect all geometry effectively on a given layer as a list of polygons.
If `flatten=True`, it recursively gathers shapes on `layer` from all `self.refs`.
`Repetition` objects are expanded, and non-polygon shapes are converted
to `Polygon` approximations.
Args:
layer: The layer to collect geometry from.
flatten: If `True`, include geometry from referenced patterns.
library: Required if `flatten=True` to resolve references.
Returns:
A list of `Polygon` objects.
"""
if flatten and self.has_refs() and library is None:
raise PatternError("Must provide a library to layer_as_polygons() when flatten=True")
polys: list[Polygon] = []
# Local shapes
for shape in self.shapes.get(layer, []):
for p in shape.to_polygons():
# expand repetitions
if p.repetition is not None:
for offset in p.repetition.displacements:
polys.append(p.deepcopy().translate(offset).set_repetition(None))
else:
polys.append(p.deepcopy())
if flatten and self.has_refs():
assert library is not None
for target, refs in self.refs.items():
if target is None:
continue
target_pat = library[target]
for ref in refs:
# Get polygons from target pattern on the same layer
ref_polys = target_pat.layer_as_polygons(layer, flatten=True, library=library)
# Apply ref transformations
for p in ref_polys:
p_pat = ref.as_pattern(Pattern(shapes={layer: [p]}))
# as_pattern expands repetition of the ref itself
# but we need to pull the polygons back out
for p_transformed in p_pat.shapes[layer]:
polys.append(cast('Polygon', p_transformed))
return polys
def referenced_patterns(self) -> set[str | None]:
"""
Get all pattern names referenced by this pattern. Non-recursive.
@ -581,7 +644,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
bounds = numpy.vstack((numpy.min(corners, axis=0),
numpy.max(corners, axis=0))) * ref.scale + [ref.offset]
if ref.repetition is not None:
bounds += ref.repetition.get_bounds()
bounds += ref.repetition.get_bounds_nonempty()
else:
# Non-manhattan rotation, have to figure out bounds by rotating the pattern
@ -632,6 +695,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
"""
for entry in chain(chain_elements(self.shapes, self.labels, self.refs), self.ports.values()):
cast('Positionable', entry).translate(offset)
self._log_bulk_update(f"translate({offset!r})")
return self
def scale_elements(self, c: float) -> Self:
@ -685,7 +749,9 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self:
"""
Rotate the Pattern around the a location.
Extrinsic transformation: Rotate the Pattern around a location in the
container's coordinate system. This affects all elements' offsets and
their repetition grids.
Args:
pivot: (x, y) location to rotate around
@ -699,11 +765,14 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
self.rotate_elements(rotation)
self.rotate_element_centers(rotation)
self.translate_elements(+pivot)
self._log_bulk_update(f"rotate_around({pivot}, {rotation})")
return self
def rotate_element_centers(self, rotation: float) -> Self:
"""
Rotate the offsets of all shapes, labels, refs, and ports around (0, 0)
Extrinsic transformation part: Rotate the offsets and repetition grids of all
shapes, labels, refs, and ports around (0, 0) in the container's
coordinate system.
Args:
rotation: Angle to rotate by (counter-clockwise, radians)
@ -714,11 +783,15 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
for entry in chain(chain_elements(self.shapes, self.refs, self.labels), self.ports.values()):
old_offset = cast('Positionable', entry).offset
cast('Positionable', entry).offset = numpy.dot(rotation_matrix_2d(rotation), old_offset)
if isinstance(entry, Repeatable) and entry.repetition is not None:
entry.repetition.rotate(rotation)
return self
def rotate_elements(self, rotation: float) -> Self:
"""
Rotate each shape, ref, and port around its origin (offset)
Intrinsic transformation part: Rotate each shape, ref, label, and port around its
origin (offset) in the container's coordinate system. This does NOT
affect their repetition grids.
Args:
rotation: Angle to rotate by (counter-clockwise, radians)
@ -726,54 +799,61 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
Returns:
self
"""
for entry in chain(chain_elements(self.shapes, self.refs), self.ports.values()):
cast('Rotatable', entry).rotate(rotation)
for entry in chain(chain_elements(self.shapes, self.refs, self.labels), self.ports.values()):
if isinstance(entry, Rotatable):
entry.rotate(rotation)
return self
def mirror_element_centers(self, across_axis: int = 0) -> Self:
def mirror_element_centers(self, axis: int = 0) -> Self:
"""
Mirror the offsets of all shapes, labels, and refs across an axis
Extrinsic transformation part: Mirror the offsets and repetition grids of all
shapes, labels, refs, and ports relative to the container's origin.
Args:
across_axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
axis: Axis to mirror across (0: x-axis, 1: y-axis)
Returns:
self
"""
for entry in chain(chain_elements(self.shapes, self.refs, self.labels), self.ports.values()):
cast('Positionable', entry).offset[across_axis - 1] *= -1
cast('Positionable', entry).offset[1 - axis] *= -1
if isinstance(entry, Repeatable) and entry.repetition is not None:
entry.repetition.mirror(axis)
return self
def mirror_elements(self, across_axis: int = 0) -> Self:
def mirror_elements(self, axis: int = 0) -> Self:
"""
Mirror each shape, ref, and pattern across an axis, relative
to its offset
Intrinsic transformation part: Mirror each shape, ref, label, and port relative
to its offset. This does NOT affect their repetition grids.
Args:
across_axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
axis: Axis to mirror across
0: mirror across x axis (flip y),
1: mirror across y axis (flip x)
Returns:
self
"""
for entry in chain(chain_elements(self.shapes, self.refs), self.ports.values()):
cast('Mirrorable', entry).mirror(across_axis)
for entry in chain(chain_elements(self.shapes, self.refs, self.labels), self.ports.values()):
if isinstance(entry, Mirrorable):
entry.mirror(axis=axis)
self._log_bulk_update(f"mirror_elements({axis})")
return self
def mirror(self, across_axis: int = 0) -> Self:
def mirror(self, axis: int = 0) -> Self:
"""
Mirror the Pattern across an axis
Extrinsic transformation: Mirror the Pattern across an axis through its origin.
This affects all elements' offsets and their internal orientations.
Args:
across_axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
axis: Axis to mirror across (0: x-axis, 1: y-axis).
Returns:
self
"""
self.mirror_elements(across_axis)
self.mirror_element_centers(across_axis)
self.mirror_elements(axis=axis)
self.mirror_element_centers(axis=axis)
self._log_bulk_update(f"mirror({axis})")
return self
def copy(self) -> Self:
@ -784,7 +864,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
Returns:
A deep copy of the current Pattern.
"""
return copy.deepcopy(self)
return self.deepcopy()
def deepcopy(self) -> Self:
"""
@ -927,6 +1007,28 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
del self.labels[layer]
return self
def resolve_repeated_refs(self) -> Self:
"""
Expand all repeated references into multiple individual references.
Alters the current pattern in-place.
Returns:
self
"""
new_refs: defaultdict[str | None, list[Ref]] = defaultdict(list)
for target, rseq in self.refs.items():
for ref in rseq:
if ref.repetition is None:
new_refs[target].append(ref)
else:
for dd in ref.repetition.displacements:
new_ref = ref.deepcopy()
new_ref.offset = ref.offset + dd
new_ref.repetition = None
new_refs[target].append(new_ref)
self.refs = new_refs
return self
def prune_refs(self) -> Self:
"""
Remove empty ref lists in `self.refs`.
@ -978,10 +1080,16 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
if target_pat is None:
raise PatternError(f'Circular reference in {name} to {target}')
if target_pat.is_empty(): # avoid some extra allocations
ports_only = flatten_ports and bool(target_pat.ports)
if target_pat.is_empty() and not ports_only: # avoid some extra allocations
continue
for ref in refs:
if flatten_ports and ref.repetition is not None and target_pat.ports:
raise PatternError(
f'Cannot flatten ports from repeated ref to {target!r}; '
'flatten with flatten_ports=False or expand/rename the ports manually first.'
)
p = ref.as_pattern(pattern=target_pat)
if not flatten_ports:
p.ports.clear()
@ -1000,6 +1108,8 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
line_color: str = 'k',
fill_color: str = 'none',
overdraw: bool = False,
filename: str | None = None,
ports: bool = False,
) -> None:
"""
Draw a picture of the Pattern and wait for the user to inspect it
@ -1010,15 +1120,18 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
klayout or a different GDS viewer!
Args:
offset: Coordinates to offset by before drawing
line_color: Outlines are drawn with this color (passed to `matplotlib.collections.PolyCollection`)
fill_color: Interiors are drawn with this color (passed to `matplotlib.collections.PolyCollection`)
overdraw: Whether to create a new figure or draw on a pre-existing one
library: Mapping of {name: Pattern} for resolving references. Required if `self.has_refs()`.
offset: Coordinates to offset by before drawing.
line_color: Outlines are drawn with this color.
fill_color: Interiors are drawn with this color.
overdraw: Whether to create a new figure or draw on a pre-existing one.
filename: If provided, save the figure to this file instead of showing it.
ports: If True, annotate the plot with arrows representing the ports.
"""
# TODO: add text labels to visualize()
try:
from matplotlib import pyplot # type: ignore
import matplotlib.collections # type: ignore
from matplotlib import pyplot # type: ignore #noqa: PLC0415
import matplotlib.collections # type: ignore #noqa: PLC0415
except ImportError:
logger.exception('Pattern.visualize() depends on matplotlib!\n'
+ 'Make sure to install masque with the [visualize] option to pull in the needed dependencies.')
@ -1027,48 +1140,155 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
if self.has_refs() and library is None:
raise PatternError('Must provide a library when visualizing a pattern with refs')
offset = numpy.asarray(offset, dtype=float)
# Cache for {Pattern object ID: List of local polygon vertex arrays}
# Polygons are stored relative to the pattern's origin (offset included)
poly_cache: dict[int, list[NDArray[numpy.float64]]] = {}
def get_local_polys(pat: 'Pattern') -> list[NDArray[numpy.float64]]:
pid = id(pat)
if pid not in poly_cache:
polys = []
for shape in chain.from_iterable(pat.shapes.values()):
for ss in shape.to_polygons():
# Shape.to_polygons() returns Polygons with their own offsets and vertices.
# We need to expand any shape-level repetition here.
v_base = ss.vertices + ss.offset
if ss.repetition is not None:
for disp in ss.repetition.displacements:
polys.append(v_base + disp)
else:
polys.append(v_base)
poly_cache[pid] = polys
return poly_cache[pid]
all_polygons: list[NDArray[numpy.float64]] = []
port_info: list[tuple[str, NDArray[numpy.float64], float]] = []
def collect_polys_recursive(
pat: 'Pattern',
c_offset: NDArray[numpy.float64],
c_rotation: float,
c_mirrored: bool,
c_scale: float,
) -> None:
# Current transform: T(c_offset) * R(c_rotation) * M(c_mirrored) * S(c_scale)
# 1. Transform and collect local polygons
local_polys = get_local_polys(pat)
if local_polys:
rot_mat = rotation_matrix_2d(c_rotation)
for v in local_polys:
vt = v * c_scale
if c_mirrored:
vt = vt.copy()
vt[:, 1] *= -1
vt = (rot_mat @ vt.T).T + c_offset
all_polygons.append(vt)
# 2. Collect ports if requested
if ports:
for name, p in pat.ports.items():
pt_v = p.offset * c_scale
if c_mirrored:
pt_v = pt_v.copy()
pt_v[1] *= -1
pt_v = rotation_matrix_2d(c_rotation) @ pt_v + c_offset
if p.rotation is not None:
pt_rot = p.rotation
if c_mirrored:
pt_rot = -pt_rot
pt_rot += c_rotation
port_info.append((name, pt_v, pt_rot))
# 3. Recurse into refs
for target, refs in pat.refs.items():
if target is None:
continue
assert library is not None
target_pat = library[target]
for ref in refs:
# Ref order of operations: mirror, rotate, scale, translate, repeat
# Combined scale and mirror
r_scale = c_scale * ref.scale
r_mirrored = c_mirrored ^ ref.mirrored
# Combined rotation: push c_mirrored and c_rotation through ref.rotation
r_rot_relative = -ref.rotation if c_mirrored else ref.rotation
r_rotation = c_rotation + r_rot_relative
# Offset composition helper
def get_full_offset(rel_offset: NDArray[numpy.float64]) -> NDArray[numpy.float64]:
o = rel_offset * c_scale
if c_mirrored:
o = o.copy()
o[1] *= -1
return rotation_matrix_2d(c_rotation) @ o + c_offset
if ref.repetition is not None:
for disp in ref.repetition.displacements:
collect_polys_recursive(
target_pat,
get_full_offset(ref.offset + disp),
r_rotation,
r_mirrored,
r_scale
)
else:
collect_polys_recursive(
target_pat,
get_full_offset(ref.offset),
r_rotation,
r_mirrored,
r_scale
)
# Start recursive collection
collect_polys_recursive(self, numpy.asarray(offset, dtype=float), 0.0, False, 1.0)
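The recursion above composes each ref's transform as translate ∘ rotate ∘ mirror ∘ scale, with the mirror (y-flip) applied before the rotation. A minimal sketch of applying that composed transform to a single point, under the same operation order:

```python
import math

def apply_transform(pt, offset, rotation, mirrored, scale):
    # point' = T(offset) . R(rotation) . M(mirrored) . S(scale) . point
    x, y = pt[0] * scale, pt[1] * scale
    if mirrored:
        y = -y  # mirror across the x axis before rotating
    c, s = math.cos(rotation), math.sin(rotation)
    return (c * x - s * y + offset[0], s * x + c * y + offset[1])

# Scale by 2, rotate 90 degrees, then translate by (5, 0):
pt = apply_transform((1.0, 0.0), (5.0, 0.0), math.pi / 2, False, 2.0)
print(round(pt[0], 9), round(pt[1], 9))  # 5.0 2.0
```

Keeping this order fixed is what allows the recursion to fold a child ref's (offset, rotation, mirror, scale) into the accumulated parent transform without ever building a matrix stack.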
# Plotting
if not overdraw:
figure = pyplot.figure()
pyplot.axis('equal')
else:
figure = pyplot.gcf()
axes = figure.gca()
polygons = []
for shape in chain.from_iterable(self.shapes.values()):
polygons += [offset + s.offset + s.vertices for s in shape.to_polygons()]
if all_polygons:
mpl_poly_collection = matplotlib.collections.PolyCollection(
all_polygons,
facecolors = fill_color,
edgecolors = line_color,
)
axes.add_collection(mpl_poly_collection)
mpl_poly_collection = matplotlib.collections.PolyCollection(
polygons,
facecolors=fill_color,
edgecolors=line_color,
)
axes.add_collection(mpl_poly_collection)
pyplot.axis('equal')
if ports:
for port_name, pt_v, pt_rot in port_info:
p1 = pt_v
angle = pt_rot
size = 1.0 # arrow size
p2 = p1 + size * numpy.array([numpy.cos(angle), numpy.sin(angle)])
for target, refs in self.refs.items():
if target is None:
continue
if not refs:
continue
assert library is not None
target_pat = library[target]
for ref in refs:
ref.as_pattern(target_pat).visualize(
library=library,
offset=offset,
overdraw=True,
line_color=line_color,
fill_color=fill_color,
axes.annotate(
port_name,
xy = tuple(p1),
xytext = tuple(p2),
arrowprops = dict(arrowstyle="->", color='g', linewidth=1),
color = 'g',
fontsize = 8,
)
axes.autoscale_view()
axes.set_aspect('equal')
if not overdraw:
pyplot.xlabel('x')
pyplot.ylabel('y')
pyplot.show()
axes.set_xlabel('x')
axes.set_ylabel('y')
if filename:
figure.savefig(filename)
else:
figure.show()
# @overload
# def place(
@ -1111,6 +1331,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
port_map: dict[str, str | None] | None = None,
skip_port_check: bool = False,
append: bool = False,
skip_geometry: bool = False,
) -> Self:
"""
Instantiate or append the pattern `other` into the current pattern, adding its
@ -1142,6 +1363,10 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
skip_geometry: If `True`, the operation only updates the port list and
skips adding any geometry (shapes, labels, or references). This
allows the pattern assembly to proceed for port-tracking purposes
even when layout generation is suppressed.
Returns:
self
@ -1156,7 +1381,27 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
port_map = {}
if not skip_port_check:
self.check_ports(other.ports.keys(), map_in=None, map_out=port_map)
port_map, overwrite_targets = self._resolve_insert_mapping(
other.ports.keys(),
map_in=None,
map_out=port_map,
allow_conflicts=skip_geometry,
)
for target in overwrite_targets:
self.ports.pop(target, None)
if not skip_geometry:
if append:
if isinstance(other, Abstract):
raise PatternError('Must provide a full `Pattern` (not an `Abstract`) when appending!')
if other.annotations is not None and self.annotations is not None:
annotation_conflicts = set(self.annotations.keys()) & set(other.annotations.keys())
if annotation_conflicts:
raise PatternError(f'Annotation keys overlap: {annotation_conflicts}')
else:
if isinstance(other, Pattern):
raise PatternError('Must provide an `Abstract` (not a `Pattern`) when creating a reference. '
'Use `append=True` if you intended to append the full geometry.')
ports = {}
for name, port in other.ports.items():
@ -1166,16 +1411,19 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
ports[new_name] = port
for name, port in ports.items():
p = port.deepcopy()
pp = port.deepcopy()
if mirrored:
p.mirror()
p.rotate_around(pivot, rotation)
p.translate(offset)
self.ports[name] = p
pp.mirror()
pp.offset[1] *= -1
pp.rotate_around(pivot, rotation)
pp.translate(offset)
self.ports[name] = pp
self._log_port_update(name)
if skip_geometry:
return self
if append:
if isinstance(other, Abstract):
raise PatternError('Must provide a full `Pattern` (not an `Abstract`) when appending!')
other_copy = other.deepcopy()
other_copy.ports.clear()
if mirrored:
@ -1184,7 +1432,6 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
other_copy.translate_elements(offset)
self.append(other_copy)
else:
assert not isinstance(other, Pattern)
ref = Ref(mirrored=mirrored)
ref.rotate_around(pivot, rotation)
ref.translate(offset)
@ -1199,7 +1446,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
# map_out: dict[str, str | None] | None,
# *,
# mirrored: bool,
# inherit_name: bool,
# thru: bool | str,
# set_rotation: bool | None,
# append: Literal[False],
# ) -> Self:
@ -1213,7 +1460,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
# map_out: dict[str, str | None] | None,
# *,
# mirrored: bool,
# inherit_name: bool,
# thru: bool | str,
# set_rotation: bool | None,
# append: bool,
# ) -> Self:
@ -1226,10 +1473,11 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
map_out: dict[str, str | None] | None = None,
*,
mirrored: bool = False,
inherit_name: bool = True,
thru: bool | str = True,
set_rotation: bool | None = None,
append: bool = False,
ok_connections: Iterable[tuple[str, str]] = (),
skip_geometry: bool = False,
) -> Self:
"""
Instantiate or append a pattern into the current pattern, connecting
@ -1237,7 +1485,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
ports specified by `map_out`.
Examples:
=========
- `my_pat.plug(subdevice, {'A': 'C', 'B': 'B'}, map_out={'D': 'myport'})`
instantiates `subdevice` into `my_pat`, plugging ports 'A' and 'B'
of `my_pat` into ports 'C' and 'B' of `subdevice`. The connected ports
@ -1247,7 +1495,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
- `my_pat.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport'
of `my_pat`.
If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out` argument is
provided, and the `inherit_name` argument is not explicitly set to `False`,
provided, and the `thru` argument is not explicitly set to `False`,
the unconnected port of `wire` is automatically renamed to 'myport'. This
allows easy extension of existing ports without changing their names or
having to provide `map_out` each time `plug` is called.
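The `thru` shortcut described above can be sketched as plain Python (hypothetical `resolve_thru` helper with illustrative port names, not part of the library):

```python
# Sketch of the `thru` renaming rule: given map_in/map_out and the other
# device's port names, compute the effective map_out.
def resolve_thru(map_in, map_out, other_ports, thru=True):
    """Return the effective map_out after applying the `thru` shortcut."""
    map_out = dict(map_out or {})
    if isinstance(thru, str):
        if len(map_in) != 1:
            raise ValueError('thru=<str> needs exactly one map_in entry')
        if thru in map_out:
            raise ValueError(f'{thru!r} already present in map_out')
        map_out[thru] = next(iter(map_in))
    elif thru and len(map_in) == 1 and not map_out and len(other_ports) == 2:
        # Two-port device: rename the unconnected port to the key in map_in.
        out_name = next(iter(set(other_ports) - set(map_in.values())))
        map_out[out_name] = next(iter(map_in))
    return map_out

# `wire` has ports A and B; plugging A into 'myport' renames B to 'myport'.
print(resolve_thru({'myport': 'A'}, None, {'A', 'B'}))  # {'B': 'myport'}
```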
@ -1260,11 +1508,15 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
new names for ports in `other`.
mirrored: Enables mirroring `other` across the x axis prior to connecting
any ports.
inherit_name: If `True`, and `map_in` specifies only a single port,
and `map_out` is `None`, and `other` has only two ports total,
then automatically renames the output port of `other` to the
name of the port from `self` that appears in `map_in`. This
makes it easy to extend a pattern with simple 2-port devices
thru: If map_in specifies only a single port, `thru` provides a mechanism
to avoid repeating the port name. E.g., for `map_in={'myport': 'A'}`,
- If True (default), and `other` has only two ports total, and map_out
doesn't specify a name for the other port, its name is set to the key
in `map_in`, i.e. 'myport'.
- If a string, `map_out[thru]` is set to the key in `map_in` (i.e. 'myport').
An error is raised if that entry already exists.
This makes it easy to extend a pattern with simple 2-port devices
(e.g. wires) without providing `map_out` each time `plug` is
called. See "Examples" above for more info. Default `True`.
set_rotation: If the necessary rotation cannot be determined from
@ -1280,6 +1532,11 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
any other ptype. Non-allowed ptype connections will emit a
warning. Order is ignored, i.e. `(a, b)` is equivalent to
`(b, a)`.
skip_geometry: If `True`, only ports are updated and geometry is
skipped. If a valid transform cannot be found (e.g. due to
misaligned ports), a 'best-effort' dummy transform is used
to ensure new ports are still added at approximate locations,
allowing downstream routing to continue.
Returns:
self
@ -1292,44 +1549,88 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
`PortError` if the specified port mapping is not achievable (the ports
do not line up)
"""
# If asked to inherit a name, check that all conditions are met
if (inherit_name
and not map_out
and len(map_in) == 1
and len(other.ports) == 2):
out_port_name = next(iter(set(other.ports.keys()) - set(map_in.values())))
map_out = {out_port_name: next(iter(map_in.keys()))}
if map_out is None:
map_out = {}
map_out = copy.deepcopy(map_out)
self.check_ports(other.ports.keys(), map_in, map_out)
translation, rotation, pivot = self.find_transform(
other,
# If asked to inherit a name, check that all conditions are met
if isinstance(thru, str):
if len(map_in) != 1:
raise PatternError(f'Got {thru=} but `map_in` has {len(map_in)} entries; need exactly one')
if thru in map_out:
raise PatternError(f'Got {thru=} but that port already exists in map_out')
map_out[thru] = next(iter(map_in.keys()))
elif (bool(thru)
and len(map_in) == 1
and not map_out
and len(other.ports) == 2
):
out_port_name = next(iter(set(other.ports.keys()) - set(map_in.values())))
map_out = {out_port_name: next(iter(map_in.keys()))}
map_out, overwrite_targets = self._resolve_insert_mapping(
other.ports.keys(),
map_in,
mirrored=mirrored,
set_rotation=set_rotation,
ok_connections=ok_connections,
map_out,
allow_conflicts=skip_geometry,
)
if not skip_geometry:
if append:
if isinstance(other, Abstract):
raise PatternError('Must provide a full `Pattern` (not an `Abstract`) when appending!')
if other.annotations is not None and self.annotations is not None:
annotation_conflicts = set(self.annotations.keys()) & set(other.annotations.keys())
if annotation_conflicts:
raise PatternError(f'Annotation keys overlap: {annotation_conflicts}')
elif isinstance(other, Pattern):
raise PatternError('Must provide an `Abstract` (not a `Pattern`) when creating a reference. '
'Use `append=True` if you intended to append the full geometry.')
try:
translation, rotation, pivot = self.find_transform(
other,
map_in,
mirrored = mirrored,
set_rotation = set_rotation,
ok_connections = ok_connections,
)
except PortError:
if not skip_geometry:
raise
logger.warning("Port transform failed for dead device. Using dummy transform.")
if map_in:
ki, vi = next(iter(map_in.items()))
s_port = self.ports[ki]
o_port = other.ports[vi].deepcopy()
if mirrored:
o_port.mirror()
o_port.offset[1] *= -1
translation = s_port.offset - o_port.offset
rotation = (s_port.rotation - o_port.rotation - pi) if (s_port.rotation is not None and o_port.rotation is not None) else 0
pivot = o_port.offset
else:
translation = numpy.zeros(2)
rotation = 0.0
pivot = numpy.zeros(2)
for target in overwrite_targets:
self.ports.pop(target, None)
# get rid of plugged ports
for ki, vi in map_in.items():
del self.ports[ki]
self._log_port_removal(ki)
map_out[vi] = None
if isinstance(other, Pattern):
assert append, 'Got a full `Pattern` (not an `Abstract`) but was asked to reference (not append)'
self.place(
other,
offset=translation,
rotation=rotation,
pivot=pivot,
mirrored=mirrored,
port_map=map_out,
skip_port_check=True,
append=append,
offset = translation,
rotation = rotation,
pivot = pivot,
mirrored = mirrored,
port_map = map_out,
skip_port_check = True,
append = append,
skip_geometry = skip_geometry,
)
return self
@ -1363,7 +1664,7 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
current device.
Args:
source: A collection of ports (e.g. Pattern, Builder, or dict)
source: A collection of ports (e.g. Pattern, Pather, or dict)
from which to create the interface.
in_prefix: Prepended to port names for newly-created ports with
reversed directions compared to the current device.
@ -1391,9 +1692,13 @@ class Pattern(PortList, AnnotatableImpl, Mirrorable):
else:
raise PatternError(f'Unable to get ports from {type(source)}: {source}')
if port_map:
if port_map is not None:
if isinstance(port_map, dict):
missing_inkeys = set(port_map.keys()) - set(orig_ports.keys())
port_targets = list(port_map.values())
duplicate_targets = {vv for vv in port_targets if port_targets.count(vv) > 1}
if duplicate_targets:
raise PortError(f'Duplicate targets in `port_map`: {duplicate_targets}')
mapped_ports = {port_map[k]: v for k, v in orig_ports.items() if k in port_map}
else:
port_set = set(port_map)

View file

@ -1,9 +1,8 @@
from typing import overload, Self, NoReturn, Any
from collections.abc import Iterable, KeysView, ValuesView, Mapping
import warnings
import traceback
import logging
import functools
import copy
from collections import Counter
from abc import ABCMeta, abstractmethod
from itertools import chain
@ -12,16 +11,17 @@ import numpy
from numpy import pi
from numpy.typing import ArrayLike, NDArray
from .traits import PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable
from .utils import rotate_offsets_around
from .error import PortError
from .traits import PositionableImpl, PivotableImpl, Copyable, Mirrorable, Flippable
from .utils import rotate_offsets_around, rotation_matrix_2d
from .error import PortError, format_stacktrace
logger = logging.getLogger(__name__)
port_logger = logging.getLogger('masque.ports')
@functools.total_ordering
class Port(PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable):
class Port(PivotableImpl, PositionableImpl, Mirrorable, Flippable, Copyable):
"""
A point at which a `Device` can be snapped to another `Device`.
@ -64,7 +64,7 @@ class Port(PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable):
return self._rotation
@rotation.setter
def rotation(self, val: float) -> None:
def rotation(self, val: float | None) -> None:
if val is None:
self._rotation = None
else:
@ -93,6 +93,12 @@ class Port(PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable):
def copy(self) -> Self:
return self.deepcopy()
def __deepcopy__(self, memo: dict | None = None) -> Self:
memo = {} if memo is None else memo
new = copy.copy(self)
new._offset = self._offset.copy()
return new
def get_bounds(self) -> NDArray[numpy.float64]:
return numpy.vstack((self.offset, self.offset))
@ -101,8 +107,28 @@ class Port(PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable):
self.ptype = ptype
return self
def mirror(self, axis: int = 0) -> Self:
def flip_across(self, axis: int | None = None, *, x: float | None = None, y: float | None = None) -> Self:
"""
Mirror the object across a line in the container's coordinate system.
Note this operation is performed relative to the pattern's origin and modifies the port's offset.
Args:
axis: Axis to mirror across. 0 mirrors across y=0. 1 mirrors across x=0.
x: Vertical line x=val to mirror across.
y: Horizontal line y=val to mirror across.
Returns:
self
"""
axis, pivot = self._check_flip_args(axis=axis, x=x, y=y)
self.translate(-pivot)
self.mirror(axis)
self.offset[1 - axis] *= -1
self.translate(+pivot)
return self
def mirror(self, axis: int = 0) -> Self:
if self.rotation is not None:
self.rotation *= -1
self.rotation += axis * pi
@ -117,6 +143,34 @@ class Port(PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable):
self.rotation = rotation
return self
def describe(self) -> str:
"""
Returns a human-readable description of the port's state including cardinal directions.
"""
deg = numpy.rad2deg(self.rotation) if self.rotation is not None else None
cardinal = ""
travel_dir = ""
if self.rotation is not None:
dirs = {0: "East (+x)", 90: "North (+y)", 180: "West (-x)", 270: "South (-y)"}
# normalize to [0, 360)
deg_norm = deg % 360
# Find closest cardinal
closest = min(dirs.keys(), key=lambda x: abs((deg_norm - x + 180) % 360 - 180))
if numpy.isclose((deg_norm - closest + 180) % 360 - 180, 0, atol=1e-3):
cardinal = f" ({dirs[closest]})"
# Travel direction (rotation + 180)
t_deg = (deg_norm + 180) % 360
closest_t = min(dirs.keys(), key=lambda x: abs((t_deg - x + 180) % 360 - 180))
if numpy.isclose((t_deg - closest_t + 180) % 360 - 180, 0, atol=1e-3):
travel_dir = f" (Travel -> {dirs[closest_t]})"
deg_text = 'any' if deg is None else f'{deg:g}'
return f"pos=({self.x:g}, {self.y:g}), rot={deg_text}{cardinal}{travel_dir}"
def __repr__(self) -> str:
if self.rotation is None:
rot = 'any'
@ -145,6 +199,28 @@ class Port(PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable):
and self.rotation == other.rotation
)
def measure_travel(self, destination: 'Port') -> tuple[NDArray[numpy.float64], float | None]:
"""
Find the (travel, jog) distances and rotation angle from the current port to the provided
`destination` port.
Travel is along the source port's axis (into the device interior), and jog is perpendicular,
with left of the travel direction corresponding to a positive jog.
Args:
(self): Source `Port`
destination: Destination `Port`
Returns:
[travel, jog], rotation
"""
angle_in = self.rotation
angle_out = destination.rotation
assert angle_in is not None
dxy = rotation_matrix_2d(-angle_in) @ (destination.offset - self.offset)
angle = ((angle_out - angle_in) % (2 * pi)) if angle_out is not None else None
return dxy, angle
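The travel/jog math above can be sketched self-contained in plain numpy (illustrative coordinates and angles, not the library's `Port` class):

```python
import numpy
from numpy import pi

def rotation_matrix_2d(theta: float) -> numpy.ndarray:
    """Standard counterclockwise 2D rotation matrix."""
    return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
                        [numpy.sin(theta),  numpy.cos(theta)]])

def measure_travel(src_xy, src_rot, dst_xy, dst_rot):
    """(travel, jog) in the source port's frame, plus relative rotation."""
    delta = numpy.asarray(dst_xy, dtype=float) - numpy.asarray(src_xy, dtype=float)
    dxy = rotation_matrix_2d(-src_rot) @ delta  # rotate into the source frame
    angle = ((dst_rot - src_rot) % (2 * pi)) if dst_rot is not None else None
    return dxy, angle

# Source at the origin with rotation 0; destination 3 along x, 4 along y.
dxy, angle = measure_travel((0, 0), 0.0, (3, 4), pi)  # travel=3, jog=4, angle=pi
```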
class PortList(metaclass=ABCMeta):
__slots__ = () # Allow subclasses to use __slots__
@ -160,6 +236,19 @@ class PortList(metaclass=ABCMeta):
def ports(self, value: dict[str, Port]) -> None:
pass
def _log_port_update(self, name: str) -> None:
""" Log the current state of the named port """
port_logger.debug("Port %s: %s", name, self.ports[name].describe())
def _log_port_removal(self, name: str) -> None:
""" Log that the named port has been removed """
port_logger.debug("Port %s: removed", name)
def _log_bulk_update(self, label: str) -> None:
""" Log all current ports at DEBUG level """
for name, port in self.ports.items():
port_logger.debug("%s: Port %s: %s", label, name, port)
@overload
def __getitem__(self, key: str) -> Port:
pass
@ -184,6 +273,12 @@ class PortList(metaclass=ABCMeta):
else: # noqa: RET505
return {k: self.ports[k] for k in key}
def measure_travel(self, src: str, dst: str) -> tuple[NDArray[numpy.float64], float | None]:
"""
Convenience wrapper for measuring travel between two named ports.
"""
return self[src].measure_travel(self[dst])
def __contains__(self, key: str) -> NoReturn:
raise NotImplementedError('PortsList.__contains__ is left unimplemented. Use `key in container.ports` instead.')
@ -213,6 +308,7 @@ class PortList(metaclass=ABCMeta):
raise PortError(f'Port {name} already exists.')
assert name not in self.ports
self.ports[name] = value
self._log_port_update(name)
return self
def rename_ports(
@ -234,17 +330,147 @@ class PortList(metaclass=ABCMeta):
Returns:
self
"""
self._rename_ports_impl(mapping, overwrite=overwrite)
return self
@staticmethod
def _normalize_target_mapping(
ordered_targets: Iterable[tuple[str, str | None]],
explicit_map: Mapping[str, str | None] | None = None,
) -> dict[str, str | None]:
ordered_targets = list(ordered_targets)
normalized = {} if explicit_map is None else copy.deepcopy(dict(explicit_map))
winners = {
target: source
for source, target in ordered_targets
if target is not None
}
for source, target in ordered_targets:
if target is not None and winners[target] != source:
normalized[source] = None
return normalized
def _resolve_insert_mapping(
self,
other_names: Iterable[str],
map_in: Mapping[str, str] | None = None,
map_out: Mapping[str, str | None] | None = None,
*,
allow_conflicts: bool = False,
) -> tuple[dict[str, str | None], set[str]]:
if map_in is None:
map_in = {}
normalized_map_out = {} if map_out is None else copy.deepcopy(dict(map_out))
other_names = list(other_names)
other = set(other_names)
missing_inkeys = set(map_in.keys()) - set(self.ports.keys())
if missing_inkeys:
raise PortError(f'`map_in` keys not present in device: {missing_inkeys}')
missing_invals = set(map_in.values()) - other
if missing_invals:
raise PortError(f'`map_in` values not present in other device: {missing_invals}')
map_in_counts = Counter(map_in.values())
conflicts_in = {kk for kk, vv in map_in_counts.items() if vv > 1}
if conflicts_in:
raise PortError(f'Duplicate values in `map_in`: {conflicts_in}')
missing_outkeys = set(normalized_map_out.keys()) - other
if missing_outkeys:
raise PortError(f'`map_out` keys not present in other device: {missing_outkeys}')
connected_outkeys = set(normalized_map_out.keys()) & set(map_in.values())
if connected_outkeys:
raise PortError(f'`map_out` keys conflict with connected ports: {connected_outkeys}')
orig_remaining = set(self.ports.keys()) - set(map_in.keys())
connected = set(map_in.values())
if allow_conflicts:
ordered_targets = [
(name, normalized_map_out.get(name, name))
for name in other_names
if name not in connected
]
normalized_map_out = self._normalize_target_mapping(ordered_targets, normalized_map_out)
final_targets = {
normalized_map_out.get(name, name)
for name in other_names
if name not in connected and normalized_map_out.get(name, name) is not None
}
overwrite_targets = {target for target in final_targets if target in orig_remaining}
return normalized_map_out, overwrite_targets
other_remaining = other - set(normalized_map_out.keys()) - connected
mapped_vals = set(normalized_map_out.values())
mapped_vals.discard(None)
conflicts_final = orig_remaining & (other_remaining | mapped_vals)
if conflicts_final:
raise PortError(f'Device ports conflict with existing ports: {conflicts_final}')
conflicts_partial = other_remaining & mapped_vals
if conflicts_partial:
raise PortError(f'`map_out` targets conflict with non-mapped outputs: {conflicts_partial}')
map_out_counts = Counter(normalized_map_out.values())
map_out_counts[None] = 0
conflicts_out = {kk for kk, vv in map_out_counts.items() if vv > 1}
if conflicts_out:
raise PortError(f'Duplicate targets in `map_out`: {conflicts_out}')
return normalized_map_out, set()
def _rename_ports_impl(
self,
mapping: Mapping[str, str | None],
*,
overwrite: bool = False,
allow_collisions: bool = False,
) -> dict[str, str]:
if not overwrite:
duplicates = (set(self.ports.keys()) - set(mapping.keys())) & set(mapping.values())
if duplicates:
raise PortError(f'Unrenamed ports would be overwritten: {duplicates}')
missing = set(mapping) - set(self.ports)
if missing:
raise PortError(f'Ports to rename were not found: {missing}')
renamed_targets = [vv for vv in mapping.values() if vv is not None]
if not allow_collisions:
duplicate_targets = {vv for vv in renamed_targets if renamed_targets.count(vv) > 1}
if duplicate_targets:
raise PortError(f'Renamed ports would collide: {duplicate_targets}')
renamed = {vv: self.ports.pop(kk) for kk, vv in mapping.items()}
if None in renamed:
del renamed[None]
winners = {
target: source
for source, target in mapping.items()
if target is not None
}
overwritten = {
target
for target, source in winners.items()
if target in self.ports and target not in mapping and target != source
}
for kk, vv in mapping.items():
if vv is None or vv != kk:
self._log_port_removal(kk)
source_ports = {kk: self.ports.pop(kk) for kk in mapping}
for target in overwritten:
self.ports.pop(target, None)
renamed = {
vv: source_ports[kk]
for kk, vv in mapping.items()
if vv is not None and winners[vv] == kk
}
self.ports.update(renamed) # type: ignore
return self
for vv in winners:
self._log_port_update(vv)
return winners
def add_port_pair(
self,
@ -266,12 +492,16 @@ class PortList(metaclass=ABCMeta):
Returns:
self
"""
if names[0] == names[1]:
raise PortError(f'Port names must be distinct: {names[0]!r}')
new_ports = {
names[0]: Port(offset, rotation=rotation, ptype=ptype),
names[1]: Port(offset, rotation=rotation + pi, ptype=ptype),
}
self.check_ports(names)
self.ports.update(new_ports)
self._log_port_update(names[0])
self._log_port_update(names[1])
return self
def plugged(
@ -294,7 +524,19 @@ class PortList(metaclass=ABCMeta):
Raises:
`PortError` if the ports are not properly aligned.
"""
if not connections:
raise PortError('Must provide at least one port connection')
missing_a = set(connections) - set(self.ports)
if missing_a:
raise PortError(f'Connection source ports were not found: {missing_a}')
missing_b = set(connections.values()) - set(self.ports)
if missing_b:
raise PortError(f'Connection destination ports were not found: {missing_b}')
a_names, b_names = list(zip(*connections.items(), strict=True))
used_names = list(chain(a_names, b_names))
duplicate_names = {name for name in used_names if used_names.count(name) > 1}
if duplicate_names:
raise PortError(f'Each port may appear in at most one connection: {duplicate_names}')
a_ports = [self.ports[pp] for pp in a_names]
b_ports = [self.ports[pp] for pp in b_names]
@ -305,11 +547,11 @@ class PortList(metaclass=ABCMeta):
if type_conflicts.any():
msg = 'Ports have conflicting types:\n'
for nn, (k, v) in enumerate(connections.items()):
for nn, (kk, vv) in enumerate(connections.items()):
if type_conflicts[nn]:
msg += f'{k} | {a_types[nn]}:{b_types[nn]} | {v}\n'
msg = ''.join(traceback.format_stack()) + '\n' + msg
warnings.warn(msg, stacklevel=2)
msg += f'{kk} | {a_types[nn]}:{b_types[nn]} | {vv}\n'
msg += '\nStack trace:\n' + format_stacktrace()
logger.warning(msg)
a_offsets = numpy.array([pp.offset for pp in a_ports])
b_offsets = numpy.array([pp.offset for pp in b_ports])
@ -326,21 +568,22 @@ class PortList(metaclass=ABCMeta):
if not numpy.allclose(rotations, 0):
rot_deg = numpy.rad2deg(rotations)
msg = 'Port orientations do not match:\n'
for nn, (k, v) in enumerate(connections.items()):
for nn, (kk, vv) in enumerate(connections.items()):
if not numpy.isclose(rot_deg[nn], 0):
msg += f'{k} | {rot_deg[nn]:g} | {v}\n'
msg += f'{kk} | {rot_deg[nn]:g} | {vv}\n'
raise PortError(msg)
translations = a_offsets - b_offsets
if not numpy.allclose(translations, 0):
msg = 'Port translations do not match:\n'
for nn, (k, v) in enumerate(connections.items()):
for nn, (kk, vv) in enumerate(connections.items()):
if not numpy.allclose(translations[nn], 0):
msg += f'{k} | {translations[nn]} | {v}\n'
msg += f'{kk} | {translations[nn]} | {vv}\n'
raise PortError(msg)
for pp in chain(a_names, b_names):
del self.ports[pp]
self._log_port_removal(pp)
return self
def check_ports(
@ -371,45 +614,7 @@ class PortList(metaclass=ABCMeta):
`PortError` if there are any duplicate names after `map_in` and `map_out`
are applied.
"""
if map_in is None:
map_in = {}
if map_out is None:
map_out = {}
other = set(other_names)
missing_inkeys = set(map_in.keys()) - set(self.ports.keys())
if missing_inkeys:
raise PortError(f'`map_in` keys not present in device: {missing_inkeys}')
missing_invals = set(map_in.values()) - other
if missing_invals:
raise PortError(f'`map_in` values not present in other device: {missing_invals}')
missing_outkeys = set(map_out.keys()) - other
if missing_outkeys:
raise PortError(f'`map_out` keys not present in other device: {missing_outkeys}')
orig_remaining = set(self.ports.keys()) - set(map_in.keys())
other_remaining = other - set(map_out.keys()) - set(map_in.values())
mapped_vals = set(map_out.values())
mapped_vals.discard(None)
conflicts_final = orig_remaining & (other_remaining | mapped_vals)
if conflicts_final:
raise PortError(f'Device ports conflict with existing ports: {conflicts_final}')
conflicts_partial = other_remaining & mapped_vals
if conflicts_partial:
raise PortError(f'`map_out` targets conflict with non-mapped outputs: {conflicts_partial}')
map_out_counts = Counter(map_out.values())
map_out_counts[None] = 0
conflicts_out = {k for k, v in map_out_counts.items() if v > 1}
if conflicts_out:
raise PortError(f'Duplicate targets in `map_out`: {conflicts_out}')
self._resolve_insert_mapping(other_names, map_in, map_out)
return self
def find_transform(
@ -438,7 +643,7 @@ class PortList(metaclass=ABCMeta):
`set_rotation` must remain `None`.
ok_connections: Set of "allowed" ptype combinations. Identical
ptypes are always allowed to connect, as is `'unk'` with
any other ptype. Non-allowed ptype connections will emit a
any other ptype. Non-allowed ptype connections will log a
warning. Order is ignored, i.e. `(a, b)` is equivalent to
`(b, a)`.
@ -449,15 +654,17 @@ class PortList(metaclass=ABCMeta):
The rotation should be performed before the translation.
"""
if not map_in:
raise PortError('Must provide at least one port connection')
s_ports = self[map_in.keys()]
o_ports = other[map_in.values()]
return self.find_port_transform(
s_ports=s_ports,
o_ports=o_ports,
map_in=map_in,
mirrored=mirrored,
set_rotation=set_rotation,
ok_connections=ok_connections,
s_ports = s_ports,
o_ports = o_ports,
map_in = map_in,
mirrored = mirrored,
set_rotation = set_rotation,
ok_connections = ok_connections,
)
@staticmethod
@ -489,7 +696,7 @@ class PortList(metaclass=ABCMeta):
`set_rotation` must remain `None`.
ok_connections: Set of "allowed" ptype combinations. Identical
ptypes are always allowed to connect, as is `'unk'` with
any other ptype. Non-allowed ptype connections will emit a
any other ptype. Non-allowed ptype connections will log a
warning. Order is ignored, i.e. `(a, b)` is equivalent to
`(b, a)`.
@ -500,6 +707,8 @@ class PortList(metaclass=ABCMeta):
The rotation should be performed before the translation.
"""
if not map_in:
raise PortError('Must provide at least one port connection')
s_offsets = numpy.array([p.offset for p in s_ports.values()])
o_offsets = numpy.array([p.offset for p in o_ports.values()])
s_types = [p.ptype for p in s_ports.values()]
@ -520,16 +729,16 @@ class PortList(metaclass=ABCMeta):
for st, ot in zip(s_types, o_types, strict=True)])
if type_conflicts.any():
msg = 'Ports have conflicting types:\n'
for nn, (k, v) in enumerate(map_in.items()):
for nn, (kk, vv) in enumerate(map_in.items()):
if type_conflicts[nn]:
msg += f'{k} | {s_types[nn]}:{o_types[nn]} | {v}\n'
msg = ''.join(traceback.format_stack()) + '\n' + msg
warnings.warn(msg, stacklevel=2)
msg += f'{kk} | {s_types[nn]}:{o_types[nn]} | {vv}\n'
msg += '\nStack trace:\n' + format_stacktrace()
logger.warning(msg)
rotations = numpy.mod(s_rotations - o_rotations - pi, 2 * pi)
if not has_rot.any():
if set_rotation is None:
PortError('Must provide set_rotation if rotation is indeterminate')
raise PortError('Must provide set_rotation if rotation is indeterminate')
rotations[:] = set_rotation
else:
rotations[~has_rot] = rotations[has_rot][0]
@ -546,8 +755,11 @@ class PortList(metaclass=ABCMeta):
translations = s_offsets - o_offsets
if not numpy.allclose(translations[:1], translations):
msg = 'Port translations do not match:\n'
common_translation = numpy.min(translations, axis=0)
msg += f'Common: {common_translation} \n'
msg += 'Deltas:\n'
for nn, (kk, vv) in enumerate(map_in.items()):
msg += f'{kk} | {translations[nn]} | {vv}\n'
msg += f'{kk} | {translations[nn] - common_translation} | {vv}\n'
raise PortError(msg)
return translations[0], rotations[0], o_offsets[0]

View file

@ -11,11 +11,12 @@ import numpy
from numpy import pi
from numpy.typing import NDArray, ArrayLike
from .utils import annotations_t, rotation_matrix_2d, annotations_eq, annotations_lt, rep2key
from .utils import annotations_t, rotation_matrix_2d, annotations_eq, annotations_lt, rep2key, SupportsBool
from .repetition import Repetition
from .traits import (
PositionableImpl, RotatableImpl, ScalableImpl,
Mirrorable, PivotableImpl, Copyable, RepeatableImpl, AnnotatableImpl,
PivotableImpl, Copyable, RepeatableImpl, AnnotatableImpl,
FlippableImpl,
)
@ -25,8 +26,9 @@ if TYPE_CHECKING:
@functools.total_ordering
class Ref(
PositionableImpl, RotatableImpl, ScalableImpl, Mirrorable,
PivotableImpl, Copyable, RepeatableImpl, AnnotatableImpl,
FlippableImpl, PivotableImpl, RepeatableImpl, AnnotatableImpl,
PositionableImpl, RotatableImpl, ScalableImpl,
Copyable,
):
"""
`Ref` provides basic support for nesting Pattern objects within each other.
@ -42,7 +44,7 @@ class Ref(
__slots__ = (
'_mirrored',
# inherited
'_offset', '_rotation', 'scale', '_repetition', '_annotations',
'_offset', '_rotation', '_scale', '_repetition', '_annotations',
)
_mirrored: bool
@ -50,11 +52,11 @@ class Ref(
# Mirrored property
@property
def mirrored(self) -> bool: # mypy#3004, setter should be SupportsBool
def mirrored(self) -> bool:
return self._mirrored
@mirrored.setter
def mirrored(self, val: bool) -> None:
def mirrored(self, val: SupportsBool) -> None:
self._mirrored = bool(val)
def __init__(
@ -84,24 +86,48 @@ class Ref(
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
@classmethod
def _from_raw(
cls,
*,
offset: NDArray[numpy.float64],
rotation: float,
mirrored: bool,
scale: float,
repetition: Repetition | None,
annotations: annotations_t | None,
) -> Self:
new = cls.__new__(cls)
new._offset = offset
new._rotation = rotation % (2 * pi)
new._scale = scale
new._mirrored = mirrored
new._repetition = repetition
new._annotations = annotations
return new
def __copy__(self) -> 'Ref':
new = Ref(
offset=self.offset.copy(),
rotation=self.rotation,
scale=self.scale,
mirrored=self.mirrored,
repetition=copy.deepcopy(self.repetition),
annotations=copy.deepcopy(self.annotations),
repetition=self.repetition,
annotations=self.annotations,
)
return new
def __deepcopy__(self, memo: dict | None = None) -> 'Ref':
memo = {} if memo is None else memo
new = copy.copy(self)
#new.repetition = copy.deepcopy(self.repetition, memo)
#new.annotations = copy.deepcopy(self.annotations, memo)
new._offset = self._offset.copy()
new.repetition = copy.deepcopy(self.repetition, memo)
new.annotations = copy.deepcopy(self.annotations, memo)
return new
def copy(self) -> 'Ref':
return self.deepcopy()
def __lt__(self, other: 'Ref') -> bool:
if (self.offset != other.offset).any():
return tuple(self.offset) < tuple(other.offset)
@ -116,6 +142,8 @@ class Ref(
return annotations_lt(self.annotations, other.annotations)
def __eq__(self, other: Any) -> bool:
if type(self) is not type(other):
return False
return (
numpy.array_equal(self.offset, other.offset)
and self.mirrored == other.mirrored
@ -160,16 +188,16 @@ class Ref(
return pattern
def rotate(self, rotation: float) -> Self:
"""
Intrinsic transformation: Rotate the target pattern relative to this Ref's
origin. This does NOT affect the repetition grid.
"""
self.rotation += rotation
if self.repetition is not None:
self.repetition.rotate(rotation)
return self
def mirror(self, axis: int = 0) -> Self:
self.mirror_target(axis)
self.rotation *= -1
if self.repetition is not None:
self.repetition.mirror(axis)
return self
def mirror_target(self, axis: int = 0) -> Self:
@ -187,10 +215,11 @@ class Ref(
xys = self.offset[None, :]
if self.repetition is not None:
xys = xys + self.repetition.displacements
transforms = numpy.empty((xys.shape[0], 4))
transforms = numpy.empty((xys.shape[0], 5))
transforms[:, :2] = xys
transforms[:, 2] = self.rotation
transforms[:, 3] = self.mirrored
transforms[:, 4] = self.scale
return transforms
def get_bounds_single(
@ -227,7 +256,10 @@ class Ref(
bounds = numpy.vstack((numpy.min(corners, axis=0),
numpy.max(corners, axis=0))) * self.scale + [self.offset]
return bounds
return self.as_pattern(pattern=pattern).get_bounds(library)
single_ref = self.deepcopy()
single_ref.repetition = None
return single_ref.as_pattern(pattern=pattern).get_bounds(library)
def __repr__(self) -> str:
rotation = f' r{numpy.rad2deg(self.rotation):g}' if self.rotation != 0 else ''

View file

@ -34,7 +34,7 @@ class Repetition(Copyable, Rotatable, Mirrorable, Scalable, Bounded, metaclass=A
pass
@abstractmethod
def __le__(self, other: 'Repetition') -> bool:
def __lt__(self, other: 'Repetition') -> bool:
pass
@abstractmethod
@ -64,7 +64,7 @@ class Grid(Repetition):
_a_count: int
""" Number of instances along the direction specified by the `a_vector` """
_b_vector: NDArray[numpy.float64] | None
_b_vector: NDArray[numpy.float64]
""" Vector `[x, y]` specifying a second lattice vector for the grid.
Specifies center-to-center spacing between adjacent elements.
Can be `None` for a 1D array.
@ -113,6 +113,22 @@ class Grid(Repetition):
self.a_count = a_count
self.b_count = b_count
@classmethod
def _from_raw(
cls: type[GG],
*,
a_vector: NDArray[numpy.float64],
a_count: int,
b_vector: NDArray[numpy.float64],
b_count: int,
) -> GG:
new = cls.__new__(cls)
new._a_vector = a_vector
new._b_vector = b_vector
new._a_count = int(a_count)
new._b_count = int(b_count)
return new
@classmethod
def aligned(
cls: type[GG],
@ -184,6 +200,8 @@ class Grid(Repetition):
def a_count(self, val: int) -> None:
if val != int(val):
raise PatternError('a_count must be convertible to an int!')
if int(val) < 1:
raise PatternError(f'Repetition has too-small a_count: {val}')
self._a_count = int(val)
# b_count property
@ -195,13 +213,12 @@ class Grid(Repetition):
def b_count(self, val: int) -> None:
if val != int(val):
raise PatternError('b_count must be convertible to an int!')
if int(val) < 1:
raise PatternError(f'Repetition has too-small b_count: {val}')
self._b_count = int(val)
@property
def displacements(self) -> NDArray[numpy.float64]:
if self.b_vector is None:
return numpy.arange(self.a_count)[:, None] * self.a_vector[None, :]
aa, bb = numpy.meshgrid(numpy.arange(self.a_count), numpy.arange(self.b_count), indexing='ij')
return (aa.flatten()[:, None] * self.a_vector[None, :]
+ bb.flatten()[:, None] * self.b_vector[None, :]) # noqa
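The meshgrid expansion in `displacements` above can be reproduced standalone; here a 2x3 grid with illustrative lattice vectors a=[10,0] and b=[0,5]:

```python
import numpy

a_vector = numpy.array([10.0, 0.0])
b_vector = numpy.array([0.0, 5.0])
a_count, b_count = 2, 3

# 'ij' indexing makes the a-index vary slowest, so rows come out a-major.
aa, bb = numpy.meshgrid(numpy.arange(a_count), numpy.arange(b_count), indexing='ij')
displacements = (aa.flatten()[:, None] * a_vector[None, :]
                 + bb.flatten()[:, None] * b_vector[None, :])
# rows: [0,0], [0,5], [0,10], [10,0], [10,5], [10,10]
```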
@ -291,7 +308,7 @@ class Grid(Repetition):
return False
return True
def __le__(self, other: Repetition) -> bool:
def __lt__(self, other: Repetition) -> bool:
if type(self) is not type(other):
return repr(type(self)) < repr(type(other))
other = cast('Grid', other)
@ -301,12 +318,8 @@ class Grid(Repetition):
return self.b_count < other.b_count
if not numpy.array_equal(self.a_vector, other.a_vector):
return tuple(self.a_vector) < tuple(other.a_vector)
if self.b_vector is None:
return other.b_vector is not None
if other.b_vector is None:
return False
if not numpy.array_equal(self.b_vector, other.b_vector):
return tuple(self.a_vector) < tuple(other.a_vector)
return tuple(self.b_vector) < tuple(other.b_vector)
return False
@ -327,12 +340,27 @@ class Arbitrary(Repetition):
"""
@property
def displacements(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
def displacements(self) -> NDArray[numpy.float64]:
return self._displacements
@displacements.setter
def displacements(self, val: ArrayLike) -> None:
vala = numpy.array(val, dtype=float)
try:
vala = numpy.array(val, dtype=float)
except (TypeError, ValueError) as exc:
raise PatternError('displacements must be convertible to an Nx2 ndarray') from exc
if vala.size == 0:
self._displacements = numpy.empty((0, 2), dtype=float)
return
if vala.ndim == 1:
if vala.size != 2:
raise PatternError('displacements must be convertible to an Nx2 ndarray')
vala = vala.reshape(1, 2)
elif vala.ndim != 2 or vala.shape[1] != 2:
raise PatternError('displacements must be convertible to an Nx2 ndarray')
order = numpy.lexsort(vala.T[::-1]) # sortrows
self._displacements = vala[order]
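As a standalone sketch (not part of the diff), the `lexsort`-based "sortrows" canonicalization used by the setter above orders rows by x first, then y, because `numpy.lexsort` treats its *last* key as primary:

```python
import numpy

# Canonical row ordering: pts.T[::-1] makes the x column the last (primary)
# lexsort key and the y column the secondary key.
pts = numpy.array([[2.0, 1.0], [0.0, 5.0], [2.0, 0.0], [0.0, 1.0]])
order = numpy.lexsort(pts.T[::-1])
sorted_pts = pts[order]
# rows ordered as (0,1), (0,5), (2,0), (2,1)
```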
@ -350,11 +378,11 @@ class Arbitrary(Repetition):
return (f'<Arbitrary {len(self.displacements)}pts >')
def __eq__(self, other: Any) -> bool:
if not type(other) is not type(self):
if type(other) is not type(self):
return False
return numpy.array_equal(self.displacements, other.displacements)
def __le__(self, other: Repetition) -> bool:
def __lt__(self, other: Repetition) -> bool:
if type(self) is not type(other):
return repr(type(self)) < repr(type(other))
other = cast('Arbitrary', other)
@ -391,7 +419,9 @@ class Arbitrary(Repetition):
Returns:
self
"""
self.displacements[1 - axis] *= -1
new_displacements = self.displacements.copy()
new_displacements[:, 1 - axis] *= -1
self.displacements = new_displacements
return self
def get_bounds(self) -> NDArray[numpy.float64] | None:
@ -402,6 +432,8 @@ class Arbitrary(Repetition):
Returns:
`[[x_min, y_min], [x_max, y_max]]` or `None`
"""
if self.displacements.size == 0:
return None
xy_min = numpy.min(self.displacements, axis=0)
xy_max = numpy.max(self.displacements, axis=0)
return numpy.array((xy_min, xy_max))
@ -416,6 +448,5 @@ class Arbitrary(Repetition):
Returns:
self
"""
self.displacements *= c
self.displacements = self.displacements * c
return self


@ -10,6 +10,8 @@ from .shape import (
)
from .polygon import Polygon as Polygon
from .poly_collection import PolyCollection as PolyCollection
from .rect_collection import RectCollection as RectCollection
from .circle import Circle as Circle
from .ellipse import Ellipse as Ellipse
from .arc import Arc as Arc


@ -10,10 +10,11 @@ from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_VERTICES
from ..error import PatternError
from ..repetition import Repetition
from ..utils import is_scalar, annotations_t, annotations_lt, annotations_eq, rep2key
from ..traits import PositionableImpl
@functools.total_ordering
class Arc(Shape):
class Arc(PositionableImpl, Shape):
"""
An elliptical arc, formed by cutting off an elliptical ring with two rays which exit from its
center. It has a position, two radii, a start and stop angle, a rotation, and a width.
@ -42,7 +43,7 @@ class Arc(Shape):
# radius properties
@property
def radii(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
def radii(self) -> NDArray[numpy.float64]:
"""
Return the radii `[rx, ry]`
"""
@ -53,8 +54,8 @@ class Arc(Shape):
val = numpy.array(val, dtype=float).flatten()
if not val.size == 2:
raise PatternError('Radii must have length 2')
if not val.min() >= 0:
raise PatternError('Radii must be non-negative')
if not val.min() > 0:
raise PatternError('Radii must be positive')
self._radii = val
@property
@ -63,8 +64,8 @@ class Arc(Shape):
@radius_x.setter
def radius_x(self, val: float) -> None:
if not val >= 0:
raise PatternError('Radius must be non-negative')
if not val > 0:
raise PatternError('Radius must be positive')
self._radii[0] = val
@property
@ -73,13 +74,13 @@ class Arc(Shape):
@radius_y.setter
def radius_y(self, val: float) -> None:
if not val >= 0:
raise PatternError('Radius must be non-negative')
if not val > 0:
raise PatternError('Radius must be positive')
self._radii[1] = val
# arc start/stop angle properties
@property
def angles(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
def angles(self) -> NDArray[numpy.float64]:
"""
Return the start and stop angles `[a_start, a_stop]`.
Angles are measured from x-axis after rotation
@ -157,28 +158,37 @@ class Arc(Shape):
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
annotations: annotations_t = None,
) -> None:
if raw:
assert isinstance(radii, numpy.ndarray)
assert isinstance(angles, numpy.ndarray)
assert isinstance(offset, numpy.ndarray)
self._radii = radii
self._angles = angles
self._width = width
self._offset = offset
self._rotation = rotation
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
else:
self.radii = radii
self.angles = angles
self.width = width
self.offset = offset
self.rotation = rotation
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.radii = radii
self.angles = angles
self.width = width
self.offset = offset
self.rotation = rotation
self.repetition = repetition
self.annotations = annotations
@classmethod
def _from_raw(
cls,
*,
radii: NDArray[numpy.float64],
angles: NDArray[numpy.float64],
width: float,
offset: NDArray[numpy.float64],
rotation: float,
annotations: annotations_t = None,
repetition: Repetition | None = None,
) -> 'Arc':
new = cls.__new__(cls)
new._radii = radii
new._angles = angles
new._width = width
new._offset = offset
new._rotation = rotation % (2 * pi)
new._repetition = repetition
new._annotations = annotations
return new
def __deepcopy__(self, memo: dict | None = None) -> 'Arc':
memo = {} if memo is None else memo
@ -186,6 +196,7 @@ class Arc(Shape):
new._offset = self._offset.copy()
new._radii = self._radii.copy()
new._angles = self._angles.copy()
new._repetition = copy.deepcopy(self._repetition, memo)
new._annotations = copy.deepcopy(self._annotations)
return new
@ -229,6 +240,8 @@ class Arc(Shape):
if (num_vertices is None) and (max_arclen is None):
raise PatternError('Max number of points and arclength left unspecified'
+ ' (default was also overridden)')
if max_arclen is not None and (numpy.isnan(max_arclen) or max_arclen <= 0):
raise PatternError('Max arclength must be positive and not NaN')
r0, r1 = self.radii
@ -255,29 +268,38 @@ class Arc(Shape):
return arc_lengths, tt
wh = self.width / 2.0
arclen_limits: list[float] = []
if max_arclen is not None:
arclen_limits.append(max_arclen)
if num_vertices is not None:
n_pts = numpy.ceil(max(self.radii + wh) / min(self.radii) * num_vertices * 100).astype(int)
perimeter_inner = get_arclens(n_pts, *a_ranges[0], dr=-wh)[0].sum()
perimeter_outer = get_arclens(n_pts, *a_ranges[1], dr= wh)[0].sum()
implied_arclen = (perimeter_outer + perimeter_inner + self.width * 2) / num_vertices
max_arclen = min(implied_arclen, max_arclen if max_arclen is not None else numpy.inf)
assert max_arclen is not None
if not (numpy.isnan(implied_arclen) or implied_arclen <= 0):
arclen_limits.append(implied_arclen)
if not arclen_limits:
raise PatternError('Arc polygonization could not determine a valid max_arclen')
max_arclen = min(arclen_limits)
def get_thetas(inner: bool) -> NDArray[numpy.float64]:
""" Figure out the parameter values at which we should place vertices to meet the arclength constraint"""
dr = -wh if inner else wh
n_pts = numpy.ceil(2 * pi * max(self.radii + dr) / max_arclen).astype(int)
n_pts = max(2, int(numpy.ceil(2 * pi * max(self.radii + dr) / max_arclen)))
arc_lengths, thetas = get_arclens(n_pts, *a_ranges[0 if inner else 1], dr=dr)
keep = [0]
removable = (numpy.cumsum(arc_lengths) <= max_arclen)
start = 1
start = 0
while start < arc_lengths.size:
next_to_keep = start + numpy.where(removable)[0][-1] # TODO: any chance we haven't sampled finely enough?
removable = (numpy.cumsum(arc_lengths[start:]) <= max_arclen)
if not removable.any():
next_to_keep = start + 1
else:
next_to_keep = start + numpy.where(removable)[0][-1] + 1
keep.append(next_to_keep)
removable = (numpy.cumsum(arc_lengths[next_to_keep + 1:]) <= max_arclen)
start = next_to_keep + 1
start = next_to_keep
if keep[-1] != thetas.size - 1:
keep.append(thetas.size - 1)
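The reworked keep-loop in this hunk can be sketched in isolation; `decimate` is a hypothetical helper mirroring the greedy logic (skip ahead as far as the accumulated arclength budget allows, but always advance at least one vertex):

```python
import numpy

# Greedy vertex decimation: from each kept vertex, drop following vertices
# while the running arclength stays within max_arclen.
def decimate(arc_lengths, max_arclen):
    keep = [0]
    start = 0
    while start < arc_lengths.size:
        removable = numpy.cumsum(arc_lengths[start:]) <= max_arclen
        if not removable.any():
            next_to_keep = start + 1
        else:
            next_to_keep = start + int(numpy.where(removable)[0][-1]) + 1
        keep.append(next_to_keep)
        start = next_to_keep
    return keep

kept = decimate(numpy.array([1.0, 1.0, 1.0, 1.0]), max_arclen=2.5)
# segments of length 1 with budget 2.5 -> keep every second vertex: [0, 2, 4]
```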
@ -309,81 +331,54 @@ class Arc(Shape):
return [poly]
def get_bounds_single(self) -> NDArray[numpy.float64]:
"""
Equation for rotated ellipse is
`x = x0 + a * cos(t) * cos(rot) - b * sin(t) * sin(phi)`
`y = y0 + a * cos(t) * sin(rot) + b * sin(t) * cos(rot)`
where `t` is our parameter.
Differentiating and solving for 0 slope wrt. `t`, we find
`tan(t) = -+ b/a cot(phi)`
where -+ is for x, y cases, so that's where the extrema are.
If the extrema are inaccessible due to arc constraints, check the arc endpoints instead.
"""
a_ranges = cast('_array2x2_t', self._angles_to_parameters())
sin_r = numpy.sin(self.rotation)
cos_r = numpy.cos(self.rotation)
mins = []
maxs = []
def point(rx: float, ry: float, tt: float) -> NDArray[numpy.float64]:
return numpy.array((
rx * numpy.cos(tt) * cos_r - ry * numpy.sin(tt) * sin_r,
rx * numpy.cos(tt) * sin_r + ry * numpy.sin(tt) * cos_r,
))
def points_in_interval(rx: float, ry: float, a0: float, a1: float) -> list[NDArray[numpy.float64]]:
candidates = [a0, a1]
if rx != 0 and ry != 0:
tx = numpy.arctan2(-ry * sin_r, rx * cos_r)
ty = numpy.arctan2(ry * cos_r, rx * sin_r)
candidates.extend((tx, tx + pi, ty, ty + pi))
lo = min(a0, a1)
hi = max(a0, a1)
pts = []
for base in candidates:
k_min = int(numpy.floor((lo - base) / (2 * pi))) - 1
k_max = int(numpy.ceil((hi - base) / (2 * pi))) + 1
for kk in range(k_min, k_max + 1):
tt = base + kk * 2 * pi
if lo <= tt <= hi:
pts.append(point(rx, ry, tt))
return pts
pts = []
for aa, sgn in zip(a_ranges, (-1, +1), strict=True):
wh = sgn * self.width / 2
rx = self.radius_x + wh
ry = self.radius_y + wh
if rx == 0 or ry == 0:
# Single point, at origin
mins.append([0, 0])
maxs.append([0, 0])
pts.append(numpy.zeros(2))
continue
pts.extend(points_in_interval(rx, ry, aa[0], aa[1]))
a0, a1 = aa
a0_offset = a0 - (a0 % (2 * pi))
sin_r = numpy.sin(self.rotation)
cos_r = numpy.cos(self.rotation)
sin_a = numpy.sin(aa)
cos_a = numpy.cos(aa)
# Cutoff angles
xpt = (-self.rotation) % (2 * pi) + a0_offset
ypt = (pi / 2 - self.rotation) % (2 * pi) + a0_offset
xnt = (xpt - pi) % (2 * pi) + a0_offset
ynt = (ypt - pi) % (2 * pi) + a0_offset
# Points along coordinate axes
rx2_inv = 1 / (rx * rx)
ry2_inv = 1 / (ry * ry)
xr = numpy.abs(cos_r * cos_r * rx2_inv + sin_r * sin_r * ry2_inv) ** -0.5
yr = numpy.abs(-sin_r * -sin_r * rx2_inv + cos_r * cos_r * ry2_inv) ** -0.5
# Arc endpoints
xn, xp = sorted(rx * cos_r * cos_a - ry * sin_r * sin_a)
yn, yp = sorted(rx * sin_r * cos_a + ry * cos_r * sin_a)
# If our arc subtends a coordinate axis, use the extremum along that axis
if a0 < xpt < a1 or a0 < xpt + 2 * pi < a1:
xp = xr
if a0 < xnt < a1 or a0 < xnt + 2 * pi < a1:
xn = -xr
if a0 < ypt < a1 or a0 < ypt + 2 * pi < a1:
yp = yr
if a0 < ynt < a1 or a0 < ynt + 2 * pi < a1:
yn = -yr
mins.append([xn, yn])
maxs.append([xp, yp])
return numpy.vstack((numpy.min(mins, axis=0) + self.offset,
numpy.max(maxs, axis=0) + self.offset))
all_pts = numpy.asarray(pts) + self.offset
return numpy.vstack((numpy.min(all_pts, axis=0),
numpy.max(all_pts, axis=0)))
def rotate(self, theta: float) -> 'Arc':
self.rotation += theta
return self
def mirror(self, axis: int = 0) -> 'Arc':
self.offset[axis - 1] *= -1
self.rotation *= -1
self.rotation += axis * pi
self.angles *= -1
@ -412,15 +407,15 @@ class Arc(Shape):
start_angle -= pi
rotation += pi
angles = (start_angle, start_angle + delta_angle)
norm_angles = (start_angle, start_angle + delta_angle)
rotation %= 2 * pi
width = self.width
return ((type(self), radii, angles, width / norm_value),
return ((type(self), tuple(radii.tolist()), norm_angles, width / norm_value),
(self.offset, scale / norm_value, rotation, False),
lambda: Arc(
radii=radii * norm_value,
angles=angles,
angles=norm_angles,
width=width * norm_value,
))
@ -463,13 +458,18 @@ class Arc(Shape):
`[[a_min_inner, a_max_inner], [a_min_outer, a_max_outer]]`
"""
aa = []
d_angle = self.angles[1] - self.angles[0]
if abs(d_angle) >= 2 * pi:
# Full ring
return numpy.tile([0, 2 * pi], (2, 1)).astype(float)
for sgn in (-1, +1):
wh = sgn * self.width / 2.0
rx = self.radius_x + wh
ry = self.radius_y + wh
a0, a1 = (numpy.arctan2(rx * numpy.sin(ai), ry * numpy.cos(ai)) for ai in self.angles)
sign = numpy.sign(self.angles[1] - self.angles[0])
sign = numpy.sign(d_angle)
if sign != numpy.sign(a1 - a0):
a1 += sign * 2 * pi
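The `arctan2`-based angle-to-parameter conversion in this hunk can be checked standalone: for an ellipse with radii `(rx, ry)`, a geometric direction angle `a` maps to parameter `t` via `tan t = (rx/ry) tan a`, computed quadrant-correctly with `arctan2` (illustrative sketch, not part of the diff):

```python
import numpy
from numpy import pi

# Map geometric angle a to ellipse parameter t, as in _angles_to_parameters.
rx, ry = 3.0, 1.0
a = pi / 4
t = numpy.arctan2(rx * numpy.sin(a), ry * numpy.cos(a))

# The parametric point (rx cos t, ry sin t) lies along direction angle a.
x, y = rx * numpy.cos(t), ry * numpy.sin(t)
assert abs(numpy.arctan2(y, x) - a) < 1e-9
```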


@ -10,10 +10,11 @@ from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_VERTICES
from ..error import PatternError
from ..repetition import Repetition
from ..utils import is_scalar, annotations_t, annotations_lt, annotations_eq, rep2key
from ..traits import PositionableImpl
@functools.total_ordering
class Circle(Shape):
class Circle(PositionableImpl, Shape):
"""
A circle, which has a position and radius.
"""
@ -48,25 +49,34 @@ class Circle(Shape):
*,
offset: ArrayLike = (0.0, 0.0),
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
annotations: annotations_t = None,
) -> None:
if raw:
assert isinstance(offset, numpy.ndarray)
self._radius = radius
self._offset = offset
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
else:
self.radius = radius
self.offset = offset
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.radius = radius
self.offset = offset
self.repetition = repetition
self.annotations = annotations
@classmethod
def _from_raw(
cls,
*,
radius: float,
offset: NDArray[numpy.float64],
annotations: annotations_t = None,
repetition: Repetition | None = None,
) -> 'Circle':
new = cls.__new__(cls)
new._radius = radius
new._offset = offset
new._repetition = repetition
new._annotations = annotations
return new
def __deepcopy__(self, memo: dict | None = None) -> 'Circle':
memo = {} if memo is None else memo
new = copy.copy(self)
new._offset = self._offset.copy()
new._repetition = copy.deepcopy(self._repetition, memo)
new._annotations = copy.deepcopy(self._annotations)
return new
@ -107,7 +117,7 @@ class Circle(Shape):
n += [num_vertices]
if max_arclen is not None:
n += [2 * pi * self.radius / max_arclen]
num_vertices = int(round(max(n)))
num_vertices = max(3, int(round(max(n))))
thetas = numpy.linspace(2 * pi, 0, num_vertices, endpoint=False)
xs = numpy.cos(thetas) * self.radius
ys = numpy.sin(thetas) * self.radius
@ -123,7 +133,6 @@ class Circle(Shape):
return self
def mirror(self, axis: int = 0) -> 'Circle': # noqa: ARG002 (axis unused)
self.offset[axis - 1] *= -1
return self
def scale_by(self, c: float) -> 'Circle':


@ -11,10 +11,11 @@ from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_VERTICES
from ..error import PatternError
from ..repetition import Repetition
from ..utils import is_scalar, rotation_matrix_2d, annotations_t, annotations_lt, annotations_eq, rep2key
from ..traits import PositionableImpl
@functools.total_ordering
class Ellipse(Shape):
class Ellipse(PositionableImpl, Shape):
"""
An ellipse, which has a position, two radii, and a rotation.
The rotation gives the angle from x-axis, counterclockwise, to the first (x) radius.
@ -33,7 +34,7 @@ class Ellipse(Shape):
# radius properties
@property
def radii(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
def radii(self) -> NDArray[numpy.float64]:
"""
Return the radii `[rx, ry]`
"""
@ -41,7 +42,7 @@ class Ellipse(Shape):
@radii.setter
def radii(self, val: ArrayLike) -> None:
val = numpy.array(val).flatten()
val = numpy.array(val, dtype=float).flatten()
if not val.size == 2:
raise PatternError('Radii must have length 2')
if not val.min() >= 0:
@ -93,29 +94,38 @@ class Ellipse(Shape):
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
annotations: annotations_t = None,
) -> None:
if raw:
assert isinstance(radii, numpy.ndarray)
assert isinstance(offset, numpy.ndarray)
self._radii = radii
self._offset = offset
self._rotation = rotation
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
else:
self.radii = radii
self.offset = offset
self.rotation = rotation
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.radii = radii
self.offset = offset
self.rotation = rotation
self.repetition = repetition
self.annotations = annotations
@classmethod
def _from_raw(
cls,
*,
radii: NDArray[numpy.float64],
offset: NDArray[numpy.float64],
rotation: float,
annotations: annotations_t = None,
repetition: Repetition | None = None,
) -> Self:
new = cls.__new__(cls)
new._radii = radii
new._offset = offset
new._rotation = rotation % pi
new._repetition = repetition
new._annotations = annotations
return new
def __deepcopy__(self, memo: dict | None = None) -> Self:
memo = {} if memo is None else memo
new = copy.copy(self)
new._offset = self._offset.copy()
new._radii = self._radii.copy()
new._repetition = copy.deepcopy(self._repetition, memo)
new._annotations = copy.deepcopy(self._annotations)
return new
@ -167,7 +177,7 @@ class Ellipse(Shape):
n += [num_vertices]
if max_arclen is not None:
n += [perimeter / max_arclen]
num_vertices = int(round(max(n)))
num_vertices = max(3, int(round(max(n))))
thetas = numpy.linspace(2 * pi, 0, num_vertices, endpoint=False)
sin_th, cos_th = (numpy.sin(thetas), numpy.cos(thetas))
@ -179,16 +189,19 @@ class Ellipse(Shape):
return [poly]
def get_bounds_single(self) -> NDArray[numpy.float64]:
rot_radii = numpy.dot(rotation_matrix_2d(self.rotation), self.radii)
return numpy.vstack((self.offset - rot_radii[0],
self.offset + rot_radii[1]))
cos_r = numpy.cos(self.rotation)
sin_r = numpy.sin(self.rotation)
x_extent = numpy.sqrt((self.radius_x * cos_r) ** 2 + (self.radius_y * sin_r) ** 2)
y_extent = numpy.sqrt((self.radius_x * sin_r) ** 2 + (self.radius_y * cos_r) ** 2)
extents = numpy.array((x_extent, y_extent))
return numpy.vstack((self.offset - extents,
self.offset + extents))
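The closed-form extents used above can be verified numerically against a sampled rotated ellipse (standalone sketch, not part of the diff): for radii `(rx, ry)` rotated by `theta`, the axis-aligned bounding box half-widths are `sqrt((rx cos t)^2 + (ry sin t)^2)` in x and `sqrt((rx sin t)^2 + (ry cos t)^2)` in y.

```python
import numpy

rx, ry, theta = 3.0, 1.0, 0.3
tt = numpy.linspace(0, 2 * numpy.pi, 10001)
# Parametric rotated ellipse, centered at the origin
xs = rx * numpy.cos(tt) * numpy.cos(theta) - ry * numpy.sin(tt) * numpy.sin(theta)
ys = rx * numpy.cos(tt) * numpy.sin(theta) + ry * numpy.sin(tt) * numpy.cos(theta)

x_extent = numpy.sqrt((rx * numpy.cos(theta)) ** 2 + (ry * numpy.sin(theta)) ** 2)
y_extent = numpy.sqrt((rx * numpy.sin(theta)) ** 2 + (ry * numpy.cos(theta)) ** 2)

# Sampled maxima match the closed-form extents
assert abs(xs.max() - x_extent) < 1e-5
assert abs(ys.max() - y_extent) < 1e-5
```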
def rotate(self, theta: float) -> Self:
self.rotation += theta
return self
def mirror(self, axis: int = 0) -> Self:
self.offset[axis - 1] *= -1
self.rotation *= -1
self.rotation += axis * pi
return self
@ -206,7 +219,7 @@ class Ellipse(Shape):
radii = self.radii[::-1] / self.radius_y
scale = self.radius_y
angle = (self.rotation + pi / 2) % pi
return ((type(self), radii),
return ((type(self), tuple(radii.tolist())),
(self.offset, scale / norm_value, angle, False),
lambda: Ellipse(radii=radii * norm_value))


@ -1,4 +1,4 @@
from typing import Any, cast
from typing import Any, cast, Self
from collections.abc import Sequence
import copy
import functools
@ -24,14 +24,22 @@ class PathCap(Enum):
# # defined by path.cap_extensions
def __lt__(self, other: Any) -> bool:
return self.value == other.value
if self.__class__ is not other.__class__:
return self.__class__.__name__ < other.__class__.__name__
# Order: Flush, Square, Circle, SquareCustom
order = {
PathCap.Flush: 0,
PathCap.Square: 1,
PathCap.Circle: 2,
PathCap.SquareCustom: 3,
}
return order[self] < order[other]
@functools.total_ordering
class Path(Shape):
"""
A path, consisting of a bunch of vertices (Nx2 ndarray), a width, an end-cap shape,
and an offset.
A path, consisting of a bunch of vertices (Nx2 ndarray), a width, and an end-cap shape.
Note that the setter for `Path.vertices` will create a copy of the passed vertex coordinates.
@ -40,7 +48,7 @@ class Path(Shape):
__slots__ = (
'_vertices', '_width', '_cap', '_cap_extensions',
# Inherited
'_offset', '_repetition', '_annotations',
'_repetition', '_annotations',
)
_vertices: NDArray[numpy.float64]
_width: float
@ -80,14 +88,14 @@ class Path(Shape):
def cap(self, val: PathCap) -> None:
self._cap = PathCap(val)
if self.cap != PathCap.SquareCustom:
self.cap_extensions = None
elif self.cap_extensions is None:
self._cap_extensions = None
elif self._cap_extensions is None:
# just got set to SquareCustom
self.cap_extensions = numpy.zeros(2)
self._cap_extensions = numpy.zeros(2)
# cap_extensions property
@property
def cap_extensions(self) -> Any | None: # mypy#3004 NDArray[numpy.float64]]:
def cap_extensions(self) -> NDArray[numpy.float64] | None:
"""
Path end-cap extension
@ -113,7 +121,7 @@ class Path(Shape):
# vertices property
@property
def vertices(self) -> Any: # mypy#3004 NDArray[numpy.float64]]:
def vertices(self) -> NDArray[numpy.float64]:
"""
Vertices of the path (Nx2 ndarray: `[[x0, y0], [x1, y1], ...]`)
@ -160,6 +168,28 @@ class Path(Shape):
raise PatternError('Wrong number of vertices')
self.vertices[:, 1] = val
# Offset property for `Positionable`
@property
def offset(self) -> NDArray[numpy.float64]:
"""
[x, y] offset
"""
return numpy.zeros(2)
@offset.setter
def offset(self, val: ArrayLike) -> None:
if numpy.any(val):
raise PatternError('Path offset is forced to (0, 0)')
def set_offset(self, val: ArrayLike) -> Self:
if numpy.any(val):
raise PatternError('Path offset is forced to (0, 0)')
return self
def translate(self, offset: ArrayLike) -> Self:
self._vertices += numpy.atleast_2d(offset)
return self
def __init__(
self,
vertices: ArrayLike,
@ -170,46 +200,57 @@ class Path(Shape):
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
annotations: annotations_t = None,
) -> None:
self._cap_extensions = None # Since .cap setter might access it
if raw:
assert isinstance(vertices, numpy.ndarray)
assert isinstance(offset, numpy.ndarray)
assert isinstance(cap_extensions, numpy.ndarray) or cap_extensions is None
self._vertices = vertices
self._offset = offset
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
self._width = width
self._cap = cap
self._cap_extensions = cap_extensions
self.vertices = vertices
self.repetition = repetition
self.annotations = annotations
self._cap = cap
if cap == PathCap.SquareCustom and cap_extensions is None:
self._cap_extensions = numpy.zeros(2)
else:
self.vertices = vertices
self.offset = offset
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.width = width
self.cap = cap
self.cap_extensions = cap_extensions
self.rotate(rotation)
self.width = width
if rotation:
self.rotate(rotation)
if numpy.any(offset):
self.translate(offset)
@classmethod
def _from_raw(
cls,
*,
vertices: NDArray[numpy.float64],
width: float,
cap: PathCap,
cap_extensions: NDArray[numpy.float64] | None = None,
annotations: annotations_t = None,
repetition: Repetition | None = None,
) -> Self:
new = cls.__new__(cls)
new._vertices = vertices
new._width = width
new._cap = cap
new._cap_extensions = cap_extensions
new._repetition = repetition
new._annotations = annotations
return new
def __deepcopy__(self, memo: dict | None = None) -> 'Path':
memo = {} if memo is None else memo
new = copy.copy(self)
new._offset = self._offset.copy()
new._vertices = self._vertices.copy()
new._cap = copy.deepcopy(self._cap, memo)
new._cap_extensions = copy.deepcopy(self._cap_extensions, memo)
new._repetition = copy.deepcopy(self._repetition, memo)
new._annotations = copy.deepcopy(self._annotations)
return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and numpy.array_equal(self.vertices, other.vertices)
and self.width == other.width
and self.cap == other.cap
@ -234,8 +275,14 @@ class Path(Shape):
if self.cap_extensions is None:
return True
return tuple(self.cap_extensions) < tuple(other.cap_extensions)
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if not numpy.array_equal(self.vertices, other.vertices):
min_len = min(self.vertices.shape[0], other.vertices.shape[0])
eq_mask = self.vertices[:min_len] != other.vertices[:min_len]
eq_lt = self.vertices[:min_len] < other.vertices[:min_len]
eq_lt_masked = eq_lt[eq_mask]
if eq_lt_masked.size > 0:
return eq_lt_masked.flat[0]
return self.vertices.shape[0] < other.vertices.shape[0]
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
@ -286,13 +333,34 @@ class Path(Shape):
) -> list['Polygon']:
extensions = self._calculate_cap_extensions()
v = remove_colinear_vertices(self.vertices, closed_path=False)
v = remove_colinear_vertices(self.vertices, closed_path=False, preserve_uturns=True)
dv = numpy.diff(v, axis=0)
dvdir = dv / numpy.sqrt((dv * dv).sum(axis=1))[:, None]
norms = numpy.sqrt((dv * dv).sum(axis=1))
# Filter out zero-length segments if any remained after remove_colinear_vertices
valid = (norms > 1e-18)
if not numpy.all(valid):
# This shouldn't happen much if remove_colinear_vertices is working
v = v[numpy.append(valid, True)]
dv = numpy.diff(v, axis=0)
norms = norms[valid]
if dv.shape[0] == 0:
# All vertices were the same. It's a point.
if self.width == 0:
return [Polygon(vertices=numpy.zeros((3, 2)))] # Area-less degenerate
if self.cap == PathCap.Circle:
return Circle(radius=self.width / 2, offset=v[0]).to_polygons(num_vertices=num_vertices, max_arclen=max_arclen)
if self.cap == PathCap.Square:
return [Polygon.square(side_length=self.width, offset=v[0])]
# Flush or SquareCustom
return [Polygon(vertices=numpy.zeros((3, 2)))]
dvdir = dv / norms[:, None]
if self.width == 0:
verts = numpy.vstack((v, v[::-1]))
return [Polygon(offset=self.offset, vertices=verts)]
return [Polygon(vertices=verts)]
perp = dvdir[:, ::-1] * [[1, -1]] * self.width / 2
@ -307,11 +375,21 @@ class Path(Shape):
bs = v[1:-1] - v[:-2] + perp[1:] - perp[:-1]
ds = v[1:-1] - v[:-2] - perp[1:] + perp[:-1]
rp = numpy.linalg.solve(As, bs[:, :, None])[:, 0]
rn = numpy.linalg.solve(As, ds[:, :, None])[:, 0]
try:
# Vectorized solve for all intersections
# solve supports broadcasting: As (N-2, 2, 2), bs (N-2, 2, 1)
rp = numpy.linalg.solve(As, bs[:, :, None])[:, 0, 0]
rn = numpy.linalg.solve(As, ds[:, :, None])[:, 0, 0]
except numpy.linalg.LinAlgError:
# Fallback to slower lstsq if some segments are parallel (singular matrix)
rp = numpy.zeros(As.shape[0])
rn = numpy.zeros(As.shape[0])
for ii in range(As.shape[0]):
rp[ii] = numpy.linalg.lstsq(As[ii], bs[ii, :, None], rcond=1e-12)[0][0, 0]
rn[ii] = numpy.linalg.lstsq(As[ii], ds[ii, :, None], rcond=1e-12)[0][0, 0]
intersection_p = v[:-2] + rp * dv[:-1] + perp[:-1]
intersection_n = v[:-2] + rn * dv[:-1] - perp[:-1]
intersection_p = v[:-2] + rp[:, None] * dv[:-1] + perp[:-1]
intersection_n = v[:-2] + rn[:, None] * dv[:-1] - perp[:-1]
towards_perp = (dv[1:] * perp[:-1]).sum(axis=1) > 0 # path bends towards previous perp?
# straight = (dv[1:] * perp[:-1]).sum(axis=1) == 0 # path is straight
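The vectorized miter-intersection solve in this hunk relies on `numpy.linalg.solve` broadcasting over a stack of 2x2 systems; a minimal standalone sketch (illustrative values, not part of the diff):

```python
import numpy

# solve() broadcasts: As is a (K, 2, 2) stack of matrices, bs[:, :, None]
# a (K, 2, 1) stack of right-hand sides; all K systems are solved at once.
As = numpy.array([[[1.0, 0.0], [0.0, 2.0]],
                  [[2.0, 1.0], [1.0, 2.0]]])
bs = numpy.array([[1.0, 4.0], [3.0, 3.0]])
xs = numpy.linalg.solve(As, bs[:, :, None])[:, :, 0]
# xs[0] solves [[1,0],[0,2]] @ x = [1,4] -> [1, 2]
# xs[1] solves [[2,1],[1,2]] @ x = [3,3] -> [1, 1]
```

If any matrix in the stack is singular, the whole call raises `LinAlgError`, which is why the hunk falls back to per-row `lstsq`.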
@ -343,7 +421,7 @@ class Path(Shape):
o1.append(v[-1] - perp[-1])
verts = numpy.vstack((o0, o1[::-1]))
polys = [Polygon(offset=self.offset, vertices=verts)]
polys = [Polygon(vertices=verts)]
if self.cap == PathCap.Circle:
#for vert in v: # not sure if every vertex, or just ends?
@ -355,8 +433,8 @@ class Path(Shape):
def get_bounds_single(self) -> NDArray[numpy.float64]:
if self.cap == PathCap.Circle:
bounds = self.offset + numpy.vstack((numpy.min(self.vertices, axis=0) - self.width / 2,
numpy.max(self.vertices, axis=0) + self.width / 2))
bounds = numpy.vstack((numpy.min(self.vertices, axis=0) - self.width / 2,
numpy.max(self.vertices, axis=0) + self.width / 2))
elif self.cap in (
PathCap.Flush,
PathCap.Square,
@ -379,18 +457,20 @@ class Path(Shape):
return self
def mirror(self, axis: int = 0) -> 'Path':
self.vertices[:, axis - 1] *= -1
self.vertices[:, 1 - axis] *= -1
return self
def scale_by(self, c: float) -> 'Path':
self.vertices *= c
self.width *= c
if self.cap_extensions is not None:
self.cap_extensions *= c
return self
def normalized_form(self, norm_value: float) -> normalized_shape_tuple:
# Note: this function is going to be pretty slow for many-vertexed paths, relative to
# other shapes
offset = self.vertices.mean(axis=0) + self.offset
offset = self.vertices.mean(axis=0)
zeroed_vertices = self.vertices - offset
scale = zeroed_vertices.std()
@ -401,21 +481,22 @@ class Path(Shape):
rotated_vertices = numpy.vstack([numpy.dot(rotation_matrix_2d(-rotation), v)
for v in normed_vertices])
# Reorder the vertices so that the one with lowest x, then y, comes first.
x_min = rotated_vertices[:, 0].argmin()
if not is_scalar(x_min):
y_min = rotated_vertices[x_min, 1].argmin()
x_min = cast('Sequence', x_min)[y_min]
reordered_vertices = numpy.roll(rotated_vertices, -x_min, axis=0)
# Canonical ordering for open paths: pick whichever of (v) or (v[::-1]) is smaller
if tuple(rotated_vertices.flat) > tuple(rotated_vertices[::-1].flat):
reordered_vertices = rotated_vertices[::-1]
else:
reordered_vertices = rotated_vertices
width0 = self.width / norm_value
cap_extensions0 = None if self.cap_extensions is None else tuple(float(v) / norm_value for v in self.cap_extensions)
return ((type(self), reordered_vertices.data.tobytes(), width0, self.cap),
return ((type(self), reordered_vertices.data.tobytes(), width0, self.cap, cap_extensions0),
(offset, scale / norm_value, rotation, False),
lambda: Path(
reordered_vertices * norm_value,
width=self.width * norm_value,
width=width0 * norm_value,
cap=self.cap,
cap_extensions=None if cap_extensions0 is None else tuple(v * norm_value for v in cap_extensions0),
))
def clean_vertices(self) -> 'Path':
@ -445,7 +526,7 @@ class Path(Shape):
Returns:
self
"""
self.vertices = remove_colinear_vertices(self.vertices, closed_path=False)
self.vertices = remove_colinear_vertices(self.vertices, closed_path=False, preserve_uturns=True)
return self
def _calculate_cap_extensions(self) -> NDArray[numpy.float64]:
@ -460,5 +541,5 @@ class Path(Shape):
return extensions
def __repr__(self) -> str:
centroid = self.offset + self.vertices.mean(axis=0)
centroid = self.vertices.mean(axis=0)
return f'<Path centroid {centroid} v{len(self.vertices)} w{self.width} c{self.cap}>'


@ -0,0 +1,235 @@
from typing import Any, cast, Self
from collections.abc import Iterator
import copy
import functools
from itertools import chain
import numpy
from numpy import pi
from numpy.typing import NDArray, ArrayLike
from . import Shape, normalized_shape_tuple
from .polygon import Polygon
from ..error import PatternError
from ..repetition import Repetition
from ..utils import rotation_matrix_2d, annotations_lt, annotations_eq, rep2key, annotations_t
@functools.total_ordering
class PolyCollection(Shape):
"""
A collection of polygons, consisting of concatenated vertex arrays (N_m x 2 ndarray) which specify
implicitly-closed boundaries, and an array of offsets specifying the first vertex of each
successive polygon.
A `normalized_form(...)` is available, but is untested and probably fairly slow.
"""
__slots__ = (
'_vertex_lists',
'_vertex_offsets',
# Inherited
'_repetition', '_annotations',
)
_vertex_lists: NDArray[numpy.float64]
""" 2D NDArray ((N+M+...) x 2) of vertices `[[xa0, ya0], [xa1, ya1], ..., [xb0, yb0], [xb1, yb1], ... ]` """
_vertex_offsets: NDArray[numpy.integer[Any]]
""" 1D NDArray specifying the starting offset for each polygon """
@property
def vertex_lists(self) -> NDArray[numpy.float64]:
"""
Vertices of the polygons, ((N+M+...) x 2). Use with `vertex_offsets`.
"""
return self._vertex_lists
@property
def vertex_offsets(self) -> NDArray[numpy.integer[Any]]:
"""
Starting offset (in `vertex_lists`) for each polygon
"""
return self._vertex_offsets
@property
def vertex_slices(self) -> Iterator[slice]:
"""
Iterator which provides slices which index vertex_lists
"""
if self._vertex_offsets.size == 0:
return
for ii, ff in zip(
self._vertex_offsets,
chain(self._vertex_offsets[1:], [self._vertex_lists.shape[0]]),
strict=True,
):
yield slice(int(ii), int(ff))
@property
def polygon_vertices(self) -> Iterator[NDArray[numpy.float64]]:
for slc in self.vertex_slices:
yield self._vertex_lists[slc]
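The offset-based slicing used by `vertex_slices` / `polygon_vertices` can be exercised standalone (illustrative sketch, not part of the diff): the concatenated vertex array is cut at each recorded start offset, with the array's total length closing the final slice.

```python
import numpy
from itertools import chain

# Two polygons packed into one array: a triangle, then a quad.
vertex_lists = numpy.array([[0., 0.], [1., 0.], [1., 1.],
                            [5., 5.], [6., 5.], [6., 6.], [5., 6.]])
vertex_offsets = numpy.array([0, 3])   # starting row of each polygon

# Pair each start with the next start (or the array end) to form slices.
slices = [slice(int(ii), int(ff))
          for ii, ff in zip(vertex_offsets,
                            chain(vertex_offsets[1:], [vertex_lists.shape[0]]))]
polys = [vertex_lists[slc] for slc in slices]
# polys[0] has 3 vertices, polys[1] has 4
```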
# Offset property for `Positionable`
@property
def offset(self) -> NDArray[numpy.float64]:
"""
[x, y] offset
"""
return numpy.zeros(2)
@offset.setter
def offset(self, _val: ArrayLike) -> None:
raise PatternError('PolyCollection offset is forced to (0, 0)')
def set_offset(self, val: ArrayLike) -> Self:
if numpy.any(val):
raise PatternError('PolyCollection offset is forced to (0, 0)')
return self
def translate(self, offset: ArrayLike) -> Self:
self._vertex_lists += numpy.atleast_2d(offset)
return self
def __init__(
self,
vertex_lists: ArrayLike,
vertex_offsets: ArrayLike,
*,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0,
repetition: Repetition | None = None,
annotations: annotations_t = None,
) -> None:
self._vertex_lists = numpy.asarray(vertex_lists, dtype=float)
self._vertex_offsets = numpy.asarray(vertex_offsets, dtype=numpy.intp)
self.repetition = repetition
self.annotations = annotations
if rotation:
self.rotate(rotation)
if numpy.any(offset):
self.translate(offset)
@classmethod
def _from_raw(
cls,
*,
vertex_lists: NDArray[numpy.float64],
vertex_offsets: NDArray[numpy.integer[Any]],
annotations: annotations_t = None,
repetition: Repetition | None = None,
) -> Self:
new = cls.__new__(cls)
new._vertex_lists = vertex_lists
new._vertex_offsets = vertex_offsets
new._repetition = repetition
new._annotations = annotations
return new
def __deepcopy__(self, memo: dict | None = None) -> Self:
memo = {} if memo is None else memo
new = copy.copy(self)
new._vertex_lists = self._vertex_lists.copy()
new._vertex_offsets = self._vertex_offsets.copy()
new._repetition = copy.deepcopy(self._repetition, memo)
new._annotations = copy.deepcopy(self._annotations)
return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self._vertex_lists, other._vertex_lists)
and numpy.array_equal(self.vertex_offsets, other.vertex_offsets)
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast('PolyCollection', other)
for vv, oo in zip(self.polygon_vertices, other.polygon_vertices, strict=False):
if not numpy.array_equal(vv, oo):
min_len = min(vv.shape[0], oo.shape[0])
eq_mask = vv[:min_len] != oo[:min_len]
eq_lt = vv[:min_len] < oo[:min_len]
eq_lt_masked = eq_lt[eq_mask]
if eq_lt_masked.size > 0:
return bool(eq_lt_masked.flat[0])
return vv.shape[0] < oo.shape[0]
if len(self.vertex_lists) != len(other.vertex_lists):
return len(self.vertex_lists) < len(other.vertex_lists)
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def to_polygons(
self,
num_vertices: int | None = None, # unused # noqa: ARG002
max_arclen: float | None = None, # unused # noqa: ARG002
) -> list['Polygon']:
return [Polygon(
vertices = vv,
repetition = copy.deepcopy(self.repetition),
annotations = copy.deepcopy(self.annotations),
) for vv in self.polygon_vertices]
def get_bounds_single(self) -> NDArray[numpy.float64] | None: # TODO note shape get_bounds doesn't include repetition
if self._vertex_lists.size == 0:
return None
return numpy.vstack((numpy.min(self._vertex_lists, axis=0),
numpy.max(self._vertex_lists, axis=0)))
def rotate(self, theta: float) -> Self:
if theta != 0:
rot = rotation_matrix_2d(theta)
self._vertex_lists = numpy.einsum('ij,kj->ki', rot, self._vertex_lists)
return self
def mirror(self, axis: int = 0) -> Self:
self._vertex_lists[:, 1 - axis] *= -1
return self
def scale_by(self, c: float) -> Self:
self._vertex_lists *= c
return self
def normalized_form(self, norm_value: float) -> normalized_shape_tuple:
# Note: this function is going to be pretty slow for many-vertexed polygons, relative to
# other shapes
meanv = self._vertex_lists.mean(axis=0)
zeroed_vertices = self._vertex_lists - [meanv]
offset = meanv
scale = zeroed_vertices.std()
normed_vertices = zeroed_vertices / scale
_, _, vertex_axis = numpy.linalg.svd(zeroed_vertices)
rotation = numpy.arctan2(vertex_axis[0][1], vertex_axis[0][0]) % (2 * pi)
rotated_vertices = numpy.einsum('ij,kj->ki', rotation_matrix_2d(-rotation), normed_vertices)
# TODO consider how to reorder vertices for polycollection
## Reorder the vertices so that the one with lowest x, then y, comes first.
#x_min = rotated_vertices[:, 0].argmin()
#if not is_scalar(x_min):
# y_min = rotated_vertices[x_min, 1].argmin()
# x_min = cast('Sequence', x_min)[y_min]
#reordered_vertices = numpy.roll(rotated_vertices, -x_min, axis=0)
# TODO: normalize mirroring?
return ((type(self), rotated_vertices.data.tobytes() + self.vertex_offsets.tobytes()),
(offset, scale / norm_value, rotation, False),
lambda: PolyCollection(
vertex_lists=rotated_vertices * norm_value,
vertex_offsets=self.vertex_offsets.copy(),
),
)
def __repr__(self) -> str:
centroid = self.vertex_lists.mean(axis=0)
return f'<PolyCollection centroid {centroid} p{len(self.vertex_offsets)}>'

View file

@ -1,4 +1,4 @@
from typing import Any, cast, TYPE_CHECKING
from typing import Any, cast, TYPE_CHECKING, Self, Literal
import copy
import functools
@ -20,7 +20,7 @@ if TYPE_CHECKING:
class Polygon(Shape):
"""
A polygon, consisting of a bunch of vertices (Nx2 ndarray) which specify an
implicitly-closed boundary, and an offset.
implicitly-closed boundary.
Note that the setter for `Polygon.vertices` creates a copy of the
passed vertex coordinates.
@ -30,7 +30,7 @@ class Polygon(Shape):
__slots__ = (
'_vertices',
# Inherited
'_offset', '_repetition', '_annotations',
'_repetition', '_annotations',
)
_vertices: NDArray[numpy.float64]
@ -38,7 +38,7 @@ class Polygon(Shape):
# vertices property
@property
def vertices(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
def vertices(self) -> NDArray[numpy.float64]:
"""
Vertices of the polygon (Nx2 ndarray: `[[x0, y0], [x1, y1], ...]`)
@ -85,6 +85,28 @@ class Polygon(Shape):
raise PatternError('Wrong number of vertices')
self.vertices[:, 1] = val
# Offset property for `Positionable`
@property
def offset(self) -> NDArray[numpy.float64]:
"""
[x, y] offset
"""
return numpy.zeros(2)
@offset.setter
def offset(self, val: ArrayLike) -> None:
if numpy.any(val):
raise PatternError('Polygon offset is forced to (0, 0)')
def set_offset(self, val: ArrayLike) -> Self:
if numpy.any(val):
raise PatternError('Polygon offset is forced to (0, 0)')
return self
def translate(self, offset: ArrayLike) -> Self:
self._vertices += numpy.atleast_2d(offset)
return self
def __init__(
self,
vertices: ArrayLike,
@ -92,36 +114,41 @@ class Polygon(Shape):
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
annotations: annotations_t = None,
) -> None:
if raw:
assert isinstance(vertices, numpy.ndarray)
assert isinstance(offset, numpy.ndarray)
self._vertices = vertices
self._offset = offset
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
else:
self.vertices = vertices
self.offset = offset
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.vertices = vertices
self.repetition = repetition
self.annotations = annotations
if rotation:
self.rotate(rotation)
if numpy.any(offset):
self.translate(offset)
@classmethod
def _from_raw(
cls,
*,
vertices: NDArray[numpy.float64],
annotations: annotations_t = None,
repetition: Repetition | None = None,
) -> Self:
new = cls.__new__(cls)
new._vertices = vertices
new._repetition = repetition
new._annotations = annotations
return new
def __deepcopy__(self, memo: dict | None = None) -> 'Polygon':
memo = {} if memo is None else memo
new = copy.copy(self)
new._offset = self._offset.copy()
new._vertices = self._vertices.copy()
new._repetition = copy.deepcopy(self._repetition, memo)
new._annotations = copy.deepcopy(self._annotations)
return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and numpy.array_equal(self.vertices, other.vertices)
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
@ -141,8 +168,6 @@ class Polygon(Shape):
if eq_lt_masked.size > 0:
return eq_lt_masked.flat[0]
return self.vertices.shape[0] < other.vertices.shape[0]
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
@ -239,6 +264,11 @@ class Polygon(Shape):
Returns:
A Polygon object containing the requested rectangle
"""
if sum(int(pp is None) for pp in (xmin, xmax, xctr, lx)) != 2:
raise PatternError('Exactly two of xmin, xctr, xmax, lx must be provided!')
if sum(int(pp is None) for pp in (ymin, ymax, yctr, ly)) != 2:
raise PatternError('Exactly two of ymin, yctr, ymax, ly must be provided!')
if lx is None:
if xctr is None:
assert xmin is not None
@ -248,11 +278,11 @@ class Polygon(Shape):
elif xmax is None:
assert xmin is not None
assert xctr is not None
lx = 2 * (xctr - xmin)
lx = 2.0 * (xctr - xmin)
elif xmin is None:
assert xctr is not None
assert xmax is not None
lx = 2 * (xmax - xctr)
lx = 2.0 * (xmax - xctr)
else:
raise PatternError('Two of xmin, xctr, xmax, lx must be None!')
else: # noqa: PLR5501
@ -278,11 +308,11 @@ class Polygon(Shape):
elif ymax is None:
assert ymin is not None
assert yctr is not None
ly = 2 * (yctr - ymin)
ly = 2.0 * (yctr - ymin)
elif ymin is None:
assert yctr is not None
assert ymax is not None
ly = 2 * (ymax - yctr)
ly = 2.0 * (ymax - yctr)
else:
raise PatternError('Two of ymin, yctr, ymax, ly must be None!')
else: # noqa: PLR5501
@ -299,7 +329,7 @@ class Polygon(Shape):
else:
raise PatternError('Two of ymin, yctr, ymax, ly must be None!')
poly = Polygon.rectangle(lx, ly, offset=(xctr, yctr), repetition=repetition)
poly = Polygon.rectangle(abs(lx), abs(ly), offset=(xctr, yctr), repetition=repetition)
return poly
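The "exactly two of four" resolution done per-axis in `custom_rectangle` can be sketched with a hypothetical standalone helper (`resolve_span` is not part of masque; it just mirrors the branching above, reducing any valid pair of min/max/center/length to a center and a non-negative length):

```python
def resolve_span(vmin=None, vmax=None, vctr=None, length=None):
    # Exactly two of the four values must be provided.
    if sum(v is None for v in (vmin, vmax, vctr, length)) != 2:
        raise ValueError('Exactly two of vmin, vctr, vmax, length must be provided!')
    if length is None:
        if vctr is None:
            vctr = 0.5 * (vmax + vmin)
            length = vmax - vmin
        elif vmax is None:
            length = 2.0 * (vctr - vmin)
        else:
            length = 2.0 * (vmax - vctr)
    elif vctr is None:
        if vmax is None:
            vctr = vmin + 0.5 * length
        else:
            vctr = vmax - 0.5 * length
    return vctr, abs(length)

# All four pairings resolve to the same span:
assert resolve_span(vmin=0.0, vmax=10.0) == (5.0, 10.0)
assert resolve_span(vctr=5.0, length=10.0) == (5.0, 10.0)
assert resolve_span(vmin=0.0, vctr=5.0) == (5.0, 10.0)
assert resolve_span(vmax=10.0, length=10.0) == (5.0, 10.0)
```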
@staticmethod
@ -363,8 +393,8 @@ class Polygon(Shape):
return [copy.deepcopy(self)]
def get_bounds_single(self) -> NDArray[numpy.float64]: # TODO note shape get_bounds doesn't include repetition
return numpy.vstack((self.offset + numpy.min(self.vertices, axis=0),
self.offset + numpy.max(self.vertices, axis=0)))
return numpy.vstack((numpy.min(self.vertices, axis=0),
numpy.max(self.vertices, axis=0)))
def rotate(self, theta: float) -> 'Polygon':
if theta != 0:
@ -372,7 +402,7 @@ class Polygon(Shape):
return self
def mirror(self, axis: int = 0) -> 'Polygon':
self.vertices[:, axis - 1] *= -1
self.vertices[:, 1 - axis] *= -1
return self
def scale_by(self, c: float) -> 'Polygon':
@ -384,7 +414,7 @@ class Polygon(Shape):
# other shapes
meanv = self.vertices.mean(axis=0)
zeroed_vertices = self.vertices - meanv
offset = meanv + self.offset
offset = meanv
scale = zeroed_vertices.std()
normed_vertices = zeroed_vertices / scale
@ -395,11 +425,15 @@ class Polygon(Shape):
for v in normed_vertices])
# Reorder the vertices so that the one with lowest x, then y, comes first.
x_min = rotated_vertices[:, 0].argmin()
if not is_scalar(x_min):
y_min = rotated_vertices[x_min, 1].argmin()
x_min = cast('Sequence', x_min)[y_min]
reordered_vertices = numpy.roll(rotated_vertices, -x_min, axis=0)
x_min_val = rotated_vertices[:, 0].min()
x_min_inds = numpy.where(rotated_vertices[:, 0] == x_min_val)[0]
if x_min_inds.size > 1:
y_min_val = rotated_vertices[x_min_inds, 1].min()
tie_breaker = numpy.where(rotated_vertices[x_min_inds, 1] == y_min_val)[0][0]
start_ind = x_min_inds[tie_breaker]
else:
start_ind = x_min_inds[0]
reordered_vertices = numpy.roll(rotated_vertices, -start_ind, axis=0)
# TODO: normalize mirroring?
@ -438,5 +472,25 @@ class Polygon(Shape):
return self
def __repr__(self) -> str:
centroid = self.offset + self.vertices.mean(axis=0)
centroid = self.vertices.mean(axis=0)
return f'<Polygon centroid {centroid} v{len(self.vertices)}>'
def boolean(
self,
other: Any,
operation: Literal['union', 'intersection', 'difference', 'xor'] = 'union',
scale: float = 1e6,
) -> list['Polygon']:
"""
Perform a boolean operation using this polygon as the subject.
Args:
other: Polygon, Iterable[Polygon], or raw vertices acting as the CLIP.
operation: 'union', 'intersection', 'difference', 'xor'.
scale: Scaling factor for integer conversion.
Returns:
A list of resulting Polygons.
"""
from ..utils.boolean import boolean #noqa: PLC0415
return boolean([self], other, operation=operation, scale=scale)

View file

@ -0,0 +1,249 @@
from typing import Any, cast, Self
from collections.abc import Iterator
import copy
import functools
import numpy
from numpy import pi
from numpy.typing import NDArray, ArrayLike
from . import Shape, normalized_shape_tuple
from .polygon import Polygon
from ..error import PatternError
from ..repetition import Repetition
from ..utils import annotations_lt, annotations_eq, rep2key, annotations_t
def _normalize_rects(rects: ArrayLike) -> NDArray[numpy.float64]:
arr = numpy.asarray(rects, dtype=float)
if arr.ndim != 2 or arr.shape[1] != 4:
raise PatternError('Rectangles must be an Nx4 array of [xmin, ymin, xmax, ymax]')
if numpy.any(arr[:, 0] > arr[:, 2]) or numpy.any(arr[:, 1] > arr[:, 3]):
raise PatternError('Rectangles must satisfy xmin <= xmax and ymin <= ymax')
if arr.shape[0] <= 1:
return arr
order = numpy.lexsort((arr[:, 3], arr[:, 2], arr[:, 1], arr[:, 0]))
return arr[order]
def _renormalize_rects_in_place(rects: NDArray[numpy.float64]) -> None:
x0 = numpy.minimum(rects[:, 0], rects[:, 2])
x1 = numpy.maximum(rects[:, 0], rects[:, 2])
y0 = numpy.minimum(rects[:, 1], rects[:, 3])
y1 = numpy.maximum(rects[:, 1], rects[:, 3])
rects[:, 0] = x0
rects[:, 1] = y0
rects[:, 2] = x1
rects[:, 3] = y1
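A minimal sketch of the lexsort-based canonical ordering used by `_normalize_rects`: rows sort by xmin, then ymin, then xmax, then ymax (the last key passed to `numpy.lexsort` is the primary key, so keys go in reverse priority).

```python
import numpy

rects = numpy.array([
    [1.0, 0.0, 2.0, 1.0],
    [0.0, 5.0, 1.0, 6.0],
    [0.0, 2.0, 1.0, 3.0],
])
order = numpy.lexsort((rects[:, 3], rects[:, 2], rects[:, 1], rects[:, 0]))
sorted_rects = rects[order]

# Rows 1 and 2 tie on xmin=0; ymin breaks the tie (2 before 5).
assert numpy.array_equal(order, [2, 1, 0])
assert numpy.array_equal(sorted_rects[:, 0], numpy.sort(rects[:, 0]))
```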
@functools.total_ordering
class RectCollection(Shape):
"""
A collection of axis-aligned rectangles, stored as an Nx4 array of
`[xmin, ymin, xmax, ymax]` rows.
"""
__slots__ = (
'_rects',
'_repetition', '_annotations',
)
_rects: NDArray[numpy.float64]
@property
def rects(self) -> NDArray[numpy.float64]:
return self._rects
@rects.setter
def rects(self, val: ArrayLike) -> None:
self._rects = _normalize_rects(val)
@property
def offset(self) -> NDArray[numpy.float64]:
return numpy.zeros(2)
@offset.setter
def offset(self, val: ArrayLike) -> None:
if numpy.any(val):
raise PatternError('RectCollection offset is forced to (0, 0)')
def set_offset(self, val: ArrayLike) -> Self:
if numpy.any(val):
raise PatternError('RectCollection offset is forced to (0, 0)')
return self
def translate(self, offset: ArrayLike) -> Self:
delta = numpy.asarray(offset, dtype=float).reshape(2)
self._rects[:, [0, 2]] += delta[0]
self._rects[:, [1, 3]] += delta[1]
return self
def __init__(
self,
rects: ArrayLike,
*,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0,
repetition: Repetition | None = None,
annotations: annotations_t = None,
) -> None:
self.rects = rects
self.repetition = repetition
self.annotations = annotations
if rotation:
self.rotate(rotation)
if numpy.any(offset):
self.translate(offset)
@classmethod
def _from_raw(
cls,
*,
rects: NDArray[numpy.float64],
annotations: annotations_t = None,
repetition: Repetition | None = None,
) -> Self:
new = cls.__new__(cls)
new._rects = rects
new._repetition = repetition
new._annotations = annotations
return new
@property
def polygon_vertices(self) -> Iterator[NDArray[numpy.float64]]:
for rect in self._rects:
xmin, ymin, xmax, ymax = rect
yield numpy.array([
[xmin, ymin],
[xmin, ymax],
[xmax, ymax],
[xmax, ymin],
], dtype=float)
def __deepcopy__(self, memo: dict | None = None) -> Self:
memo = {} if memo is None else memo
new = copy.copy(self)
new._rects = self._rects.copy()
new._repetition = copy.deepcopy(self._repetition, memo)
new._annotations = copy.deepcopy(self._annotations)
return new
def _sorted_rects(self) -> NDArray[numpy.float64]:
if self._rects.shape[0] <= 1:
return self._rects
order = numpy.lexsort((self._rects[:, 3], self._rects[:, 2], self._rects[:, 1], self._rects[:, 0]))
return self._rects[order]
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self._sorted_rects(), other._sorted_rects())
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast('RectCollection', other)
self_rects = self._sorted_rects()
other_rects = other._sorted_rects()
if not numpy.array_equal(self_rects, other_rects):
min_len = min(self_rects.shape[0], other_rects.shape[0])
eq_mask = self_rects[:min_len] != other_rects[:min_len]
eq_lt = self_rects[:min_len] < other_rects[:min_len]
eq_lt_masked = eq_lt[eq_mask]
if eq_lt_masked.size > 0:
return bool(eq_lt_masked.flat[0])
return self_rects.shape[0] < other_rects.shape[0]
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def to_polygons(
self,
num_vertices: int | None = None, # unused # noqa: ARG002
max_arclen: float | None = None, # unused # noqa: ARG002
) -> list[Polygon]:
return [
Polygon(
vertices=vertices,
repetition=copy.deepcopy(self.repetition),
annotations=copy.deepcopy(self.annotations),
)
for vertices in self.polygon_vertices
]
def get_bounds_single(self) -> NDArray[numpy.float64] | None:
if self._rects.size == 0:
return None
mins = self._rects[:, :2].min(axis=0)
maxs = self._rects[:, 2:].max(axis=0)
return numpy.vstack((mins, maxs))
def rotate(self, theta: float) -> Self:
quarter_turns = int(numpy.rint(theta / (pi / 2)))
if not numpy.isclose(theta, quarter_turns * (pi / 2)):
raise PatternError('RectCollection only supports Manhattan rotations')
turns = quarter_turns % 4
if turns == 0 or self._rects.size == 0:
return self
corners = numpy.stack((
self._rects[:, [0, 1]],
self._rects[:, [0, 3]],
self._rects[:, [2, 3]],
self._rects[:, [2, 1]],
), axis=1)
flat = corners.reshape(-1, 2)
if turns == 1:
rotated = numpy.column_stack((-flat[:, 1], flat[:, 0]))
elif turns == 2:
rotated = -flat
else:
rotated = numpy.column_stack((flat[:, 1], -flat[:, 0]))
corners = rotated.reshape(corners.shape)
self._rects[:, 0] = corners[:, :, 0].min(axis=1)
self._rects[:, 1] = corners[:, :, 1].min(axis=1)
self._rects[:, 2] = corners[:, :, 0].max(axis=1)
self._rects[:, 3] = corners[:, :, 1].max(axis=1)
return self
def mirror(self, axis: int = 0) -> Self:
if axis not in (0, 1):
raise PatternError('Axis must be 0 or 1')
if axis == 0:
self._rects[:, [1, 3]] *= -1
else:
self._rects[:, [0, 2]] *= -1
_renormalize_rects_in_place(self._rects)
return self
def scale_by(self, c: float) -> Self:
self._rects *= c
_renormalize_rects_in_place(self._rects)
return self
def normalized_form(self, norm_value: float) -> normalized_shape_tuple:
rects = self._sorted_rects()
centers = 0.5 * (rects[:, :2] + rects[:, 2:])
offset = centers.mean(axis=0)
zeroed = rects.copy()
zeroed[:, [0, 2]] -= offset[0]
zeroed[:, [1, 3]] -= offset[1]
normed = zeroed / norm_value
return (
(type(self), normed.data.tobytes()),
(offset, 1.0, 0.0, False),
lambda: RectCollection(rects=normed * norm_value),
)
def __repr__(self) -> str:
if self._rects.size == 0:
return '<RectCollection r0>'
centers = 0.5 * (self._rects[:, :2] + self._rects[:, 2:])
centroid = centers.mean(axis=0)
return f'<RectCollection centroid {centroid} r{self._rects.shape[0]}>'
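A pure-numpy sketch of what one quarter turn (90 degrees CCW) does to `[xmin, ymin, xmax, ymax]` rows, matching the `turns == 1` branch of `RectCollection.rotate`: each corner maps via (x, y) -> (-y, x), and the new bounds come from re-sorting min/max.

```python
import numpy

rects = numpy.array([[1.0, 2.0, 3.0, 5.0]])
rotated = numpy.column_stack((
    -rects[:, 3], rects[:, 0],   # new xmin = -ymax, new ymin = xmin
    -rects[:, 1], rects[:, 2],   # new xmax = -ymin, new ymax = xmax
))
assert numpy.array_equal(rotated, [[-5.0, 1.0, -2.0, 3.0]])
```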

View file

@ -6,8 +6,8 @@ import numpy
from numpy.typing import NDArray, ArrayLike
from ..traits import (
Rotatable, Mirrorable, Copyable, Scalable,
PositionableImpl, PivotableImpl, RepeatableImpl, AnnotatableImpl,
Copyable, Scalable, FlippableImpl,
PivotableImpl, RepeatableImpl, AnnotatableImpl,
)
if TYPE_CHECKING:
@ -26,8 +26,9 @@ normalized_shape_tuple = tuple[
DEFAULT_POLY_NUM_VERTICES = 24
class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
PivotableImpl, RepeatableImpl, AnnotatableImpl, metaclass=ABCMeta):
class Shape(FlippableImpl, PivotableImpl, RepeatableImpl, AnnotatableImpl,
Copyable, Scalable,
metaclass=ABCMeta):
"""
Class specifying functions common to all shapes.
"""
@ -73,7 +74,7 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
pass
@abstractmethod
def normalized_form(self, norm_value: int) -> normalized_shape_tuple:
def normalized_form(self, norm_value: float) -> normalized_shape_tuple:
"""
Writes the shape in a standardized notation, with offset, scale, and rotation
information separated out from the remaining values.
@ -120,7 +121,7 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
Returns:
List of `Polygon` objects with grid-aligned edges.
"""
from . import Polygon
from . import Polygon #noqa: PLC0415
gx = numpy.unique(grid_x)
gy = numpy.unique(grid_y)
@ -134,26 +135,28 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
mins, maxs = bounds
vertex_lists = []
p_verts = polygon.vertices + polygon.offset
p_verts = polygon.vertices
for v, v_next in zip(p_verts, numpy.roll(p_verts, -1, axis=0), strict=True):
dv = v_next - v
# Find x-index bounds for the line # TODO: fix this and err_xmin/xmax for grids smaller than the line / shape
# Find x-index bounds for the line
gxi_range = numpy.digitize([v[0], v_next[0]], gx)
gxi_min = numpy.min(gxi_range - 1).clip(0, len(gx) - 1)
gxi_max = numpy.max(gxi_range).clip(0, len(gx))
gxi_min = int(numpy.min(gxi_range - 1).clip(0, len(gx) - 1))
gxi_max = int(numpy.max(gxi_range).clip(0, len(gx)))
err_xmin = (min(v[0], v_next[0]) - gx[gxi_min]) / (gx[gxi_min + 1] - gx[gxi_min])
err_xmax = (max(v[0], v_next[0]) - gx[gxi_max - 1]) / (gx[gxi_max] - gx[gxi_max - 1])
if gxi_min < len(gx) - 1:
err_xmin = (min(v[0], v_next[0]) - gx[gxi_min]) / (gx[gxi_min + 1] - gx[gxi_min])
if err_xmin >= 0.5:
gxi_min += 1
if err_xmin >= 0.5:
gxi_min += 1
if err_xmax >= 0.5:
gxi_max += 1
if gxi_max > 0 and gxi_max < len(gx):
err_xmax = (max(v[0], v_next[0]) - gx[gxi_max - 1]) / (gx[gxi_max] - gx[gxi_max - 1])
if err_xmax >= 0.5:
gxi_max += 1
if abs(dv[0]) < 1e-20:
# Vertical line, don't calculate slope
xi = [gxi_min, gxi_max - 1]
xi = [gxi_min, max(gxi_min, gxi_max - 1)]
ys = numpy.array([v[1], v_next[1]])
yi = numpy.digitize(ys, gy).clip(1, len(gy) - 1)
err_y = (ys - gy[yi]) / (gy[yi] - gy[yi - 1])
@ -249,9 +252,9 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
Returns:
List of `Polygon` objects with grid-aligned edges.
"""
from . import Polygon
import skimage.measure # type: ignore
import float_raster
from . import Polygon #noqa: PLC0415
import skimage.measure #noqa: PLC0415
import float_raster #noqa: PLC0415
grx = numpy.unique(grid_x)
gry = numpy.unique(grid_y)
@ -282,7 +285,7 @@ class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
offset = (numpy.where(keep_x)[0][0],
numpy.where(keep_y)[0][0])
rastered = float_raster.raster((polygon.vertices + polygon.offset).T, gx, gy)
rastered = float_raster.raster((polygon.vertices).T, gx, gy)
binary_rastered = (numpy.abs(rastered) >= 0.5)
supersampled = binary_rastered.repeat(2, axis=0).repeat(2, axis=1)
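The grid-index bracketing in `polygonize` can be sketched in isolation: `numpy.digitize` returns, for each x, the index of the first grid line strictly greater than x, and the segment's cell range is then clipped into valid index bounds.

```python
import numpy

gx = numpy.array([0.0, 1.0, 2.0, 3.0])   # grid lines
seg_x = [0.4, 2.6]                        # segment endpoints' x coordinates
gxi_range = numpy.digitize(seg_x, gx)     # -> [1, 3]
gxi_min = int(numpy.min(gxi_range - 1).clip(0, len(gx) - 1))
gxi_max = int(numpy.max(gxi_range).clip(0, len(gx)))
assert (gxi_min, gxi_max) == (0, 3)       # segment spans cells 0..2
```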

View file

@ -9,8 +9,8 @@ from numpy.typing import NDArray, ArrayLike
from . import Shape, Polygon, normalized_shape_tuple
from ..error import PatternError
from ..repetition import Repetition
from ..traits import RotatableImpl
from ..utils import is_scalar, get_bit, annotations_t, annotations_lt, annotations_eq, rep2key
from ..traits import PositionableImpl, RotatableImpl
from ..utils import is_scalar, get_bit, annotations_t, annotations_lt, annotations_eq, rep2key, SupportsBool
# Loaded on use:
# from freetype import Face
@ -18,7 +18,7 @@ from ..utils import is_scalar, get_bit, annotations_t, annotations_lt, annotatio
@functools.total_ordering
class Text(RotatableImpl, Shape):
class Text(PositionableImpl, RotatableImpl, Shape):
"""
Text (to be printed e.g. as a set of polygons).
This is distinct from non-printed Label objects.
@ -55,11 +55,11 @@ class Text(RotatableImpl, Shape):
self._height = val
@property
def mirrored(self) -> bool: # mypy#3004, should be bool
def mirrored(self) -> bool:
return self._mirrored
@mirrored.setter
def mirrored(self, val: bool) -> None:
def mirrored(self, val: SupportsBool) -> None:
self._mirrored = bool(val)
def __init__(
@ -70,31 +70,48 @@ class Text(RotatableImpl, Shape):
*,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0,
mirrored: bool = False,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
annotations: annotations_t = None,
) -> None:
if raw:
assert isinstance(offset, numpy.ndarray)
self._offset = offset
self._string = string
self._height = height
self._rotation = rotation
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
else:
self.offset = offset
self.string = string
self.height = height
self.rotation = rotation
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.offset = offset
self.string = string
self.height = height
self.rotation = rotation
self.mirrored = mirrored
self.repetition = repetition
self.annotations = annotations
self.font_path = font_path
@classmethod
def _from_raw(
cls,
*,
string: str,
height: float,
font_path: str,
offset: NDArray[numpy.float64],
rotation: float,
mirrored: bool,
annotations: annotations_t = None,
repetition: Repetition | None = None,
) -> Self:
new = cls.__new__(cls)
new._offset = offset
new._string = string
new._height = height
new._rotation = rotation % (2 * pi)
new._mirrored = mirrored
new._repetition = repetition
new._annotations = annotations
new.font_path = font_path
return new
def __deepcopy__(self, memo: dict | None = None) -> Self:
memo = {} if memo is None else memo
new = copy.copy(self)
new._offset = self._offset.copy()
new._repetition = copy.deepcopy(self._repetition, memo)
new._annotations = copy.deepcopy(self._annotations)
return new
@ -105,6 +122,7 @@ class Text(RotatableImpl, Shape):
and self.string == other.string
and self.height == other.height
and self.font_path == other.font_path
and self.mirrored == other.mirrored
and self.rotation == other.rotation
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
@ -124,6 +142,8 @@ class Text(RotatableImpl, Shape):
return self.font_path < other.font_path
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.mirrored != other.mirrored:
return self.mirrored < other.mirrored
if self.rotation != other.rotation:
return self.rotation < other.rotation
if self.repetition != other.repetition:
@ -146,7 +166,7 @@ class Text(RotatableImpl, Shape):
if self.mirrored:
poly.mirror()
poly.scale_by(self.height)
poly.offset = self.offset + [total_advance, 0]
poly.translate(self.offset + [total_advance, 0])
poly.rotate_around(self.offset, self.rotation)
all_polygons += [poly]
@ -171,22 +191,25 @@ class Text(RotatableImpl, Shape):
(self.offset, self.height / norm_value, rotation, bool(self.mirrored)),
lambda: Text(
string=self.string,
height=self.height * norm_value,
height=norm_value,
font_path=self.font_path,
rotation=rotation,
).mirror2d(across_x=self.mirrored),
)
def get_bounds_single(self) -> NDArray[numpy.float64]:
def get_bounds_single(self) -> NDArray[numpy.float64] | None:
# rotation makes this a huge pain when using slot.advance and glyph.bbox(), so
# just convert to polygons instead
polys = self.to_polygons()
if not polys:
return None
pbounds = numpy.full((len(polys), 2, 2), nan)
for pp, poly in enumerate(polys):
pbounds[pp] = poly.get_bounds_nonempty()
bounds = numpy.vstack((
numpy.min(pbounds[: 0, :], axis=0),
numpy.max(pbounds[: 1, :], axis=0),
numpy.min(pbounds[:, 0, :], axis=0),
numpy.max(pbounds[:, 1, :], axis=0),
))
return bounds
@ -201,9 +224,9 @@ def get_char_as_polygons(
font_path: str,
char: str,
resolution: float = 48 * 64,
) -> tuple[list[list[list[float]]], float]:
from freetype import Face # type: ignore
from matplotlib.path import Path # type: ignore
) -> tuple[list[NDArray[numpy.float64]], float]:
from freetype import Face # type: ignore #noqa: PLC0415
from matplotlib.path import Path # type: ignore #noqa: PLC0415
"""
Get a list of polygons representing a single character.
@ -276,11 +299,12 @@ def get_char_as_polygons(
advance = slot.advance.x / resolution
polygons: list[NDArray[numpy.float64]]
if len(all_verts) == 0:
polygons = []
else:
path = Path(all_verts, all_codes)
path.should_simplify = False
polygons = path.to_polygons()
polygons = [numpy.asarray(poly) for poly in path.to_polygons()]
return polygons, advance

3
masque/test/__init__.py Normal file
View file

@ -0,0 +1,3 @@
"""
Tests (run with `python3 -m pytest -rxPXs | tee results.txt`)
"""

13
masque/test/conftest.py Normal file
View file

@ -0,0 +1,13 @@
"""
Test fixtures
"""
# ruff: noqa: ARG001
from typing import Any
import numpy
FixtureRequest = Any
PRNG = numpy.random.RandomState(12345)

View file

@ -0,0 +1,85 @@
from numpy.testing import assert_allclose
from numpy import pi
from ..abstract import Abstract
from ..ports import Port
from ..ref import Ref
def test_abstract_init() -> None:
ports = {"A": Port((0, 0), 0), "B": Port((10, 0), pi)}
abs_obj = Abstract("test", ports)
assert abs_obj.name == "test"
assert len(abs_obj.ports) == 2
assert abs_obj.ports["A"] is not ports["A"] # Should be deepcopied
def test_abstract_transform() -> None:
abs_obj = Abstract("test", {"A": Port((10, 0), 0)})
# Rotate 90 deg around (0,0)
abs_obj.rotate_around((0, 0), pi / 2)
# (10, 0) rot 0 -> (0, 10) rot pi/2
assert_allclose(abs_obj.ports["A"].offset, [0, 10], atol=1e-10)
assert abs_obj.ports["A"].rotation is not None
assert_allclose(abs_obj.ports["A"].rotation, pi / 2, atol=1e-10)
# Mirror across x axis (axis 0): flips y-offset
abs_obj.mirror(0)
# (0, 10) mirrored(0) -> (0, -10)
# rotation pi/2 mirrored(0) -> -pi/2 == 3pi/2
assert_allclose(abs_obj.ports["A"].offset, [0, -10], atol=1e-10)
assert abs_obj.ports["A"].rotation is not None
assert_allclose(abs_obj.ports["A"].rotation, 3 * pi / 2, atol=1e-10)
def test_abstract_ref_transform() -> None:
abs_obj = Abstract("test", {"A": Port((10, 0), 0)})
ref = Ref(offset=(100, 100), rotation=pi / 2, mirrored=True)
# Apply ref transform
abs_obj.apply_ref_transform(ref)
# Ref order: mirror, rotate, scale, translate
# 1. mirror (across x: y -> -y)
# (10, 0) rot 0 -> (10, 0) rot 0
# 2. rotate pi/2 around (0,0)
# (10, 0) rot 0 -> (0, 10) rot pi/2
# 3. translate (100, 100)
# (0, 10) -> (100, 110)
assert_allclose(abs_obj.ports["A"].offset, [100, 110], atol=1e-10)
assert abs_obj.ports["A"].rotation is not None
assert_allclose(abs_obj.ports["A"].rotation, pi / 2, atol=1e-10)
def test_abstract_ref_transform_scales_offsets() -> None:
abs_obj = Abstract("test", {"A": Port((10, 0), 0)})
ref = Ref(offset=(100, 100), rotation=pi / 2, mirrored=True, scale=2)
abs_obj.apply_ref_transform(ref)
assert_allclose(abs_obj.ports["A"].offset, [100, 120], atol=1e-10)
assert abs_obj.ports["A"].rotation is not None
assert_allclose(abs_obj.ports["A"].rotation, pi / 2, atol=1e-10)
def test_abstract_undo_transform() -> None:
abs_obj = Abstract("test", {"A": Port((100, 110), pi / 2)})
ref = Ref(offset=(100, 100), rotation=pi / 2, mirrored=True)
abs_obj.undo_ref_transform(ref)
assert_allclose(abs_obj.ports["A"].offset, [10, 0], atol=1e-10)
assert abs_obj.ports["A"].rotation is not None
assert_allclose(abs_obj.ports["A"].rotation, 0, atol=1e-10)
def test_abstract_undo_transform_scales_offsets() -> None:
abs_obj = Abstract("test", {"A": Port((100, 120), pi / 2)})
ref = Ref(offset=(100, 100), rotation=pi / 2, mirrored=True, scale=2)
abs_obj.undo_ref_transform(ref)
assert_allclose(abs_obj.ports["A"].offset, [10, 0], atol=1e-10)
assert abs_obj.ports["A"].rotation is not None
assert_allclose(abs_obj.ports["A"].rotation, 0, atol=1e-10)
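The transform order these tests exercise (mirror across x, rotate, scale, then translate) can be checked with a standalone sketch; `apply_ref` here is a hypothetical stand-in for `apply_ref_transform`, not masque API:

```python
import numpy
from numpy import pi

def apply_ref(p, offset, rotation, mirrored, scale=1.0):
    p = numpy.asarray(p, dtype=float)
    if mirrored:
        p = p * [1, -1]                     # mirror across x axis: y -> -y
    c, s = numpy.cos(rotation), numpy.sin(rotation)
    p = numpy.array([c * p[0] - s * p[1],   # rotate about the origin
                     s * p[0] + c * p[1]])
    return p * scale + offset               # scale, then translate

# Matches test_abstract_ref_transform and the scaled variant:
assert numpy.allclose(
    apply_ref((10, 0), offset=(100, 100), rotation=pi / 2, mirrored=True),
    [100, 110])
assert numpy.allclose(
    apply_ref((10, 0), offset=(100, 100), rotation=pi / 2, mirrored=True, scale=2.0),
    [100, 120])
```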

View file

@ -0,0 +1,77 @@
import pytest
from numpy.testing import assert_equal
from numpy import pi
from ..builder import Pather
from ..builder.tools import PathTool
from ..library import Library
from ..ports import Port
@pytest.fixture
def advanced_pather() -> tuple[Pather, PathTool, Library]:
lib = Library()
# Simple PathTool: 2um width on layer (1,0)
tool = PathTool(layer=(1, 0), width=2, ptype="wire")
p = Pather(lib, tools=tool, auto_render=True, auto_render_append=False)
return p, tool, lib
def test_path_into_straight(advanced_pather: tuple[Pather, PathTool, Library]) -> None:
p, _tool, _lib = advanced_pather
# Facing ports
p.ports["src"] = Port((0, 0), 0, ptype="wire") # Facing East (into device)
# Forward (+pi relative to port) is West (-x).
# Put destination at (-20, 0) pointing East (pi).
p.ports["dst"] = Port((-20, 0), pi, ptype="wire")
p.trace_into("src", "dst")
assert "src" not in p.ports
assert "dst" not in p.ports
# Pather._traceL adds a Reference to the generated pattern
assert len(p.pattern.refs) == 1
def test_path_into_bend(advanced_pather: tuple[Pather, PathTool, Library]) -> None:
p, _tool, _lib = advanced_pather
# Source at (0,0) rot 0 (facing East). Forward is West (-x).
p.ports["src"] = Port((0, 0), 0, ptype="wire")
# Destination at (-20, -20): to be reachable with a single bend, it must sit at
# (-x, -y) relative to src, with rotation 3pi/2 (facing South) so that its
# forward direction is +y (North).
p.ports["dst"] = Port((-20, -20), 3 * pi / 2, ptype="wire")
p.trace_into("src", "dst")
assert "src" not in p.ports
assert "dst" not in p.ports
# `trace_into()` now batches its internal legs before auto-rendering so the operation
# can roll back cleanly on later failures.
assert len(p.pattern.refs) == 1
def test_path_into_sbend(advanced_pather: tuple[Pather, PathTool, Library]) -> None:
p, _tool, _lib = advanced_pather
# Facing but offset ports
p.ports["src"] = Port((0, 0), 0, ptype="wire") # Forward is West (-x)
p.ports["dst"] = Port((-20, -10), pi, ptype="wire") # Facing East (rot pi)
p.trace_into("src", "dst")
assert "src" not in p.ports
assert "dst" not in p.ports
def test_path_into_thru(advanced_pather: tuple[Pather, PathTool, Library]) -> None:
p, _tool, _lib = advanced_pather
p.ports["src"] = Port((0, 0), 0, ptype="wire")
p.ports["dst"] = Port((-20, 0), pi, ptype="wire")
p.ports["other"] = Port((10, 10), 0)
p.trace_into("src", "dst", thru="other")
assert "src" in p.ports
assert_equal(p.ports["src"].offset, [10, 10])
assert "other" not in p.ports
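These routing tests all lean on one convention stated in the comments: a port's travel ("forward") direction is its rotation plus pi, and two ports can be joined head-on when their rotations differ by pi. A minimal sketch (hypothetical helpers, not masque API):

```python
import numpy as np

def forward(port_rotation: float) -> np.ndarray:
    # A port "points into" its device; travel extends the other way,
    # i.e. along rotation + pi.
    a = port_rotation + np.pi
    return np.array([np.cos(a), np.sin(a)])

def facing(rot_a: float, rot_b: float) -> bool:
    # Two ports face each other when their rotations differ by pi (mod 2pi).
    return bool(np.isclose((rot_a - rot_b) % (2 * np.pi), np.pi))

# src rot 0 -> forward is -x; dst rot pi -> forward is +x; they face each other.
print(forward(0), forward(np.pi), facing(0, np.pi))
```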


@ -0,0 +1,81 @@
import pytest
from numpy.testing import assert_allclose
from numpy import pi
from ..builder import Pather
from ..builder.tools import AutoTool
from ..library import Library
from ..pattern import Pattern
from ..ports import Port
def make_straight(length: float, width: float = 2, ptype: str = "wire") -> Pattern:
pat = Pattern()
pat.rect((1, 0), xmin=0, xmax=length, yctr=0, ly=width)
pat.ports["in"] = Port((0, 0), 0, ptype=ptype)
pat.ports["out"] = Port((length, 0), pi, ptype=ptype)
return pat
@pytest.fixture
def autotool_setup() -> tuple[Pather, AutoTool, Library]:
lib = Library()
# Define a simple bend
bend_pat = Pattern()
# 2x2 bend from (0,0) rot 0 to (2, -2) rot pi/2 (Clockwise)
bend_pat.ports["in"] = Port((0, 0), 0, ptype="wire")
bend_pat.ports["out"] = Port((2, -2), pi / 2, ptype="wire")
lib["bend"] = bend_pat
lib.abstract("bend")
# Define a transition (e.g., via)
via_pat = Pattern()
via_pat.ports["m1"] = Port((0, 0), 0, ptype="wire_m1")
via_pat.ports["m2"] = Port((1, 0), pi, ptype="wire_m2")
lib["via"] = via_pat
via_abs = lib.abstract("via")
tool_m1 = AutoTool(
straights=[
AutoTool.Straight(ptype="wire_m1", fn=lambda length: make_straight(length, ptype="wire_m1"), in_port_name="in", out_port_name="out")
],
bends=[],
sbends=[],
transitions={("wire_m2", "wire_m1"): AutoTool.Transition(via_abs, "m2", "m1")},
default_out_ptype="wire_m1",
)
p = Pather(lib, tools=tool_m1)
# Start with an m2 port
p.ports["start"] = Port((0, 0), pi, ptype="wire_m2")
return p, tool_m1, lib
def test_autotool_transition(autotool_setup: tuple[Pather, AutoTool, Library]) -> None:
p, _tool, _lib = autotool_setup
# Route m1 from an m2 port. Should trigger via.
# length 10. Via length is 1. So straight m1 should be 9.
p.straight("start", 10)
# Start at (0, 0) rot pi: forward (+pi relative to the port) is East (+x).
# Via: m1 at (0, 0) rot 0, m2 at (1, 0) rot pi. Plugging m2 into `start`
# rotates the via by pi, leaving m1 at (1, 0) (one via-length forward).
# The m1 straight of length 9 then extends from (1, 0) to (10, 0).
assert_allclose(p.ports["start"].offset, [10, 0], atol=1e-10)
assert p.ports["start"].ptype == "wire_m1"
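The plug arithmetic above can be reproduced with a small standalone transform. The rotation rule `pi + dest_rot - src_rot` is an assumption inferred from these tests, and `plug_transform` is a hypothetical helper, not masque API:

```python
import numpy as np

def plug_transform(dest_xy, dest_rot, src_xy, src_rot):
    # Rotate the plugged pattern so `src` faces `dest` (rotations differ by pi),
    # then translate so `src` lands exactly on `dest`.
    theta = (np.pi + dest_rot - src_rot) % (2 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    def xform(p, r):
        xy = rot @ (np.asarray(p, dtype=float) - np.asarray(src_xy, dtype=float))
        return xy + np.asarray(dest_xy, dtype=float), (r + theta) % (2 * np.pi)
    return xform

# Plug the via's m2 (local (1, 0), rot pi) into `start` ((0, 0), rot pi):
via = plug_transform((0, 0), np.pi, (1, 0), np.pi)
m1_xy, m1_rot = via((0, 0), 0)            # the via's m1 end lands at (1, 0)
# Plug a length-9 straight's "in" ((0, 0), rot 0) into m1:
straight = plug_transform(m1_xy, m1_rot, (0, 0), 0)
out_xy, _ = straight((9, 0), np.pi)       # final port at (10, 0)
```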


@ -0,0 +1,306 @@
import pytest
from numpy.testing import assert_allclose
from numpy import pi
from masque.builder.tools import AutoTool
from masque.pattern import Pattern
from masque.ports import Port
from masque.library import Library
from masque.builder.pather import Pather
def make_straight(length: float, width: float = 2, ptype: str = "wire") -> Pattern:
pat = Pattern()
pat.rect((1, 0), xmin=0, xmax=length, yctr=0, ly=width)
pat.ports["A"] = Port((0, 0), 0, ptype=ptype)
pat.ports["B"] = Port((length, 0), pi, ptype=ptype)
return pat
def make_bend(R: float, width: float = 2, ptype: str = "wire", clockwise: bool = True) -> Pattern:
pat = Pattern()
# 90 degree arc approximation (just two rects for start and end)
if clockwise:
# (0,0) rot 0 to (R, -R) rot pi/2
pat.rect((1, 0), xmin=0, xmax=R, yctr=0, ly=width)
pat.rect((1, 0), xctr=R, lx=width, ymin=-R, ymax=0)
pat.ports["A"] = Port((0, 0), 0, ptype=ptype)
pat.ports["B"] = Port((R, -R), pi/2, ptype=ptype)
else:
# (0,0) rot 0 to (R, R) rot -pi/2
pat.rect((1, 0), xmin=0, xmax=R, yctr=0, ly=width)
pat.rect((1, 0), xctr=R, lx=width, ymin=0, ymax=R)
pat.ports["A"] = Port((0, 0), 0, ptype=ptype)
pat.ports["B"] = Port((R, R), -pi/2, ptype=ptype)
return pat
@pytest.fixture
def multi_bend_tool() -> tuple[AutoTool, Library]:
lib = Library()
# Bend 1: R=2
lib["b1"] = make_bend(2, ptype="wire")
b1_abs = lib.abstract("b1")
# Bend 2: R=5
lib["b2"] = make_bend(5, ptype="wire")
b2_abs = lib.abstract("b2")
tool = AutoTool(
straights=[
# Straight 1: only for length < 10
AutoTool.Straight(ptype="wire", fn=make_straight, in_port_name="A", out_port_name="B", length_range=(0, 10)),
# Straight 2: for length >= 10
AutoTool.Straight(ptype="wire", fn=lambda length: make_straight(length, width=4), in_port_name="A", out_port_name="B", length_range=(10, 1e8))
],
bends=[
AutoTool.Bend(b1_abs, "A", "B", clockwise=True, mirror=True),
AutoTool.Bend(b2_abs, "A", "B", clockwise=True, mirror=True)
],
sbends=[],
transitions={},
default_out_ptype="wire"
)
return tool, lib
@pytest.fixture
def asymmetric_transition_tool() -> AutoTool:
lib = Library()
bend_pat = Pattern()
bend_pat.ports["in"] = Port((0, 0), 0, ptype="core")
bend_pat.ports["out"] = Port((2, -2), pi / 2, ptype="core")
lib["core_bend"] = bend_pat
trans_pat = Pattern()
trans_pat.ports["CORE"] = Port((0, 0), 0, ptype="core")
trans_pat.ports["MID"] = Port((3, 1), pi, ptype="mid")
lib["core_mid"] = trans_pat
return AutoTool(
straights=[
AutoTool.Straight(
ptype="core",
fn=lambda length: make_straight(length, ptype="core"),
in_port_name="A",
out_port_name="B",
length_range=(0, 3),
),
AutoTool.Straight(
ptype="mid",
fn=lambda length: make_straight(length, ptype="mid"),
in_port_name="A",
out_port_name="B",
length_range=(0, 1e8),
),
],
bends=[
AutoTool.Bend(lib.abstract("core_bend"), "in", "out", clockwise=True, mirror=True),
],
sbends=[],
transitions={
("mid", "core"): AutoTool.Transition(lib.abstract("core_mid"), "MID", "CORE"),
},
default_out_ptype="core",
).add_complementary_transitions()
def assert_trace_matches_plan(plan_port: Port, tree: Library, port_names: tuple[str, str] = ("A", "B")) -> None:
pat = tree.top_pattern()
out_port = pat[port_names[1]]
dxy, rot = pat[port_names[0]].measure_travel(out_port)
assert_allclose(dxy, plan_port.offset)
assert rot is not None
assert plan_port.rotation is not None
assert_allclose(rot, plan_port.rotation)
assert out_port.ptype == plan_port.ptype
def test_autotool_planL_selection(multi_bend_tool) -> None:
tool, _ = multi_bend_tool
# Small length: should pick straight 1 and bend 1 (R=2)
# L = straight + R. If L=5, straight=3.
p, data = tool.planL(True, 5)
assert data.straight.length_range == (0, 10)
assert data.straight_length == 3
assert data.bend.abstract.name == "b1"
assert_allclose(p.offset, [5, 2])
# Large length: should pick straight 2 and bend 1 (R=2)
# If L=15, straight=13.
p, data = tool.planL(True, 15)
assert data.straight.length_range == (10, 1e8)
assert data.straight_length == 13
assert_allclose(p.offset, [15, 2])
def test_autotool_planU_consistency(multi_bend_tool) -> None:
tool, lib = multi_bend_tool
# length=10, jog=20.
# U-turn: Straight1 -> Bend1 -> Straight_mid -> Straight3(0) -> Bend2
# X = L1_total - R2 = length
# Y = R1 + L2_mid + R2 = jog
p, data = tool.planU(20, length=10)
assert data.ldata0.straight_length == 7
assert data.ldata0.bend.abstract.name == "b2"
assert data.l2_length == 13
assert data.ldata1.straight_length == 0
assert data.ldata1.bend.abstract.name == "b1"
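The X/Y bookkeeping in the comment can be checked directly against the asserted values (bend radii R1 = 5 for "b2" and R2 = 2 for "b1" come from the fixture):

```python
# U-turn geometry from the comment: X = L1_total - R2 = length and
# Y = R1 + L2_mid + R2 = jog, with L1_total = straight0 + R1.
length, jog = 10, 20
straight0, R1 = 7, 5       # first leg: straight + b2 (R=5)
l2_mid, R2 = 13, 2         # middle straight + b1 (R=2)
assert (straight0 + R1) - R2 == length
assert R1 + l2_mid + R2 == jog
```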
def test_autotool_traceU_matches_plan_with_asymmetric_transition(asymmetric_transition_tool: AutoTool) -> None:
tool = asymmetric_transition_tool
plan_port, data = tool.planU(12, length=0, in_ptype="core")
assert data.ldata1.in_transition is not None
assert data.ldata1.b_transition is not None
tree = tool.traceU(12, length=0, in_ptype="core")
assert_trace_matches_plan(plan_port, tree)
def test_autotool_planS_double_L(multi_bend_tool) -> None:
tool, lib = multi_bend_tool
# length=20, jog=10. S-bend (ccw1, cw2)
# X = L1_total + R2 = length
# Y = R1 + L2_mid + R2 = jog
p, data = tool.planS(20, 10)
assert_allclose(p.offset, [20, 10])
assert_allclose(p.rotation, pi)
assert data.ldata0.straight_length == 16
assert data.ldata1.straight_length == 0
assert data.l2_length == 6
def test_autotool_traceS_double_l_matches_plan_with_asymmetric_transition(asymmetric_transition_tool: AutoTool) -> None:
tool = asymmetric_transition_tool
plan_port, data = tool.planS(4, 10, in_ptype="core")
assert isinstance(data, AutoTool.UData)
assert data.ldata1.in_transition is not None
assert data.ldata1.b_transition is not None
tree = tool.traceS(4, 10, in_ptype="core")
assert_trace_matches_plan(plan_port, tree)
def test_autotool_planS_pure_sbend_with_transition_dx() -> None:
lib = Library()
def make_straight(length: float) -> Pattern:
pat = Pattern()
pat.ports["A"] = Port((0, 0), 0, ptype="core")
pat.ports["B"] = Port((length, 0), pi, ptype="core")
return pat
def make_sbend(jog: float) -> Pattern:
pat = Pattern()
pat.ports["A"] = Port((0, 0), 0, ptype="core")
pat.ports["B"] = Port((10, jog), pi, ptype="core")
return pat
trans_pat = Pattern()
trans_pat.ports["EXT"] = Port((0, 0), 0, ptype="ext")
trans_pat.ports["CORE"] = Port((5, 0), pi, ptype="core")
lib["xin"] = trans_pat
tool = AutoTool(
straights=[
AutoTool.Straight(
ptype="core",
fn=make_straight,
in_port_name="A",
out_port_name="B",
length_range=(1, 1e8),
)
],
bends=[],
sbends=[
AutoTool.SBend(
ptype="core",
fn=make_sbend,
in_port_name="A",
out_port_name="B",
jog_range=(0, 1e8),
)
],
transitions={
("ext", "core"): AutoTool.Transition(lib.abstract("xin"), "EXT", "CORE"),
},
default_out_ptype="core",
)
p, data = tool.planS(15, 4, in_ptype="ext")
assert_allclose(p.offset, [15, 4])
assert_allclose(p.rotation, pi)
assert data.straight_length == 0
assert data.jog_remaining == 4
assert data.in_transition is not None
def test_renderpather_autotool_double_L(multi_bend_tool) -> None:
tool, lib = multi_bend_tool
rp = Pather(lib, tools=tool, auto_render=False)
rp.ports["A"] = Port((0,0), 0, ptype="wire")
# This should trigger double-L fallback in planS
rp.jog("A", 10, length=20)
# port_rot=0 -> forward is -x. jog=10 (left) is -y.
assert_allclose(rp.ports["A"].offset, [-20, -10])
assert_allclose(rp.ports["A"].rotation, 0)
# planS returns its out-port at (length, jog) rot pi, relative to an input at (0, 0) rot 0.
# The input sits at rot pi relative to port A, so (20, 10) rot pi maps to (-20, -10) rot 0.
rp.render()
assert len(rp.pattern.refs) > 0
def test_pather_uturn_fallback_no_heuristic(multi_bend_tool) -> None:
tool, lib = multi_bend_tool
class BasicTool(AutoTool):
def planU(self, *args, **kwargs):
raise NotImplementedError()
tool_basic = BasicTool(
straights=tool.straights,
bends=tool.bends,
sbends=tool.sbends,
transitions=tool.transitions,
default_out_ptype=tool.default_out_ptype
)
p = Pather(lib, tools=tool_basic)
p.ports["A"] = Port((0,0), 0, ptype="wire")  # rot 0: the port points East (inward); forward/extension is West (-x)
# uturn jog=10, length=5.
# R=2. L1 = 5+2=7. L2 = 10-2=8.
p.uturn("A", 10, length=5)
# port_rot=0 -> forward is -x; jog=10 (to the left) is -y.
# Leg 1 (extent L1 = length + R = 7): straight 5, then a CCW bend of R=2
#   -> port at (-7, -2), heading -y.
# Leg 2 (extent L2 = jog - R = 8): straight 6, then a CCW bend of R=2
#   -> port at (-5, -10), heading +x, i.e. rotation pi.
assert_allclose(p.ports["A"].offset, [-5, -10])
assert_allclose(p.ports["A"].rotation, pi)

masque/test/test_boolean.py (new file, +247 lines)

@ -0,0 +1,247 @@
# ruff: noqa: PLC0415
import pytest
import numpy
from numpy.testing import assert_allclose
from masque.pattern import Pattern
from masque.shapes.polygon import Polygon
from masque.repetition import Grid
from masque.library import Library
from masque.error import PatternError
def _poly_area(poly: Polygon) -> float:
verts = poly.vertices
x = verts[:, 0]
y = verts[:, 1]
return 0.5 * abs(numpy.dot(x, numpy.roll(y, -1)) - numpy.dot(y, numpy.roll(x, -1)))
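`_poly_area` implements the shoelace formula; a quick sanity check on simple shapes (standalone reimplementation, not the helper above):

```python
import numpy

def shoelace_area(verts) -> float:
    # Shoelace formula: area = |sum(x_i * y_{i+1} - y_i * x_{i+1})| / 2
    v = numpy.asarray(verts, dtype=float)
    x, y = v[:, 0], v[:, 1]
    return 0.5 * abs(numpy.dot(x, numpy.roll(y, -1)) - numpy.dot(y, numpy.roll(x, -1)))

print(shoelace_area([[0, 0], [2, 0], [2, 2], [0, 2]]))  # 4.0 (2x2 square)
print(shoelace_area([[0, 0], [1, 0], [1, 1]]))          # 0.5 (right triangle)
```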
def test_layer_as_polygons_basic() -> None:
pat = Pattern()
pat.polygon((1, 0), [[0, 0], [1, 0], [1, 1], [0, 1]])
polys = pat.layer_as_polygons((1, 0), flatten=False)
assert len(polys) == 1
assert isinstance(polys[0], Polygon)
assert_allclose(polys[0].vertices, [[0, 0], [1, 0], [1, 1], [0, 1]])
def test_layer_as_polygons_repetition() -> None:
pat = Pattern()
rep = Grid(a_vector=(2, 0), a_count=2)
pat.polygon((1, 0), [[0, 0], [1, 0], [1, 1], [0, 1]], repetition=rep)
polys = pat.layer_as_polygons((1, 0), flatten=False)
assert len(polys) == 2
# First polygon at (0,0)
assert_allclose(polys[0].vertices, [[0, 0], [1, 0], [1, 1], [0, 1]])
# Second polygon at (2,0)
assert_allclose(polys[1].vertices, [[2, 0], [3, 0], [3, 1], [2, 1]])
def test_layer_as_polygons_flatten() -> None:
lib = Library()
child = Pattern()
child.polygon((1, 0), [[0, 0], [1, 0], [1, 1]])
lib['child'] = child
parent = Pattern()
parent.ref('child', offset=(10, 10), rotation=numpy.pi/2)
polys = parent.layer_as_polygons((1, 0), flatten=True, library=lib)
assert len(polys) == 1
# ref.as_pattern(child) applies the ref transform to the child geometry:
# (0,0), (1,0), (1,1) rotated pi/2 about the origin -> (0,0), (0,1), (-1,1),
# then offset by (10, 10) -> (10,10), (10,11), (9,11).
expected = numpy.array([[10, 10], [10, 11], [9, 11]])
assert_allclose(polys[0].vertices, expected, atol=1e-10)
def test_boolean_import_error() -> None:
from masque import boolean
# If pyclipper is not installed, this should raise ImportError
try:
import pyclipper # noqa: F401
pytest.skip("pyclipper is installed, cannot test ImportError")
except ImportError:
with pytest.raises(ImportError, match="Boolean operations require 'pyclipper'"):
boolean([], [], operation='union')
def test_polygon_boolean_shortcut() -> None:
poly = Polygon([[0, 0], [1, 0], [1, 1]])
# This should also raise ImportError if pyclipper is missing
try:
import pyclipper # noqa: F401
pytest.skip("pyclipper is installed")
except ImportError:
with pytest.raises(ImportError, match="Boolean operations require 'pyclipper'"):
poly.boolean(poly)
def test_boolean_intersection_with_pyclipper() -> None:
pytest.importorskip("pyclipper")
from masque.utils.boolean import boolean
result = boolean(
[Polygon([[0, 0], [2, 0], [2, 2], [0, 2]])],
[Polygon([[1, 1], [3, 1], [3, 3], [1, 3]])],
operation='intersection',
)
assert len(result) == 1
assert_allclose(result[0].get_bounds_single(), [[1, 1], [2, 2]], atol=1e-10)
def test_polygon_boolean_shortcut_with_pyclipper() -> None:
pytest.importorskip("pyclipper")
poly = Polygon([[0, 0], [2, 0], [2, 2], [0, 2]])
result = poly.boolean(
Polygon([[1, 1], [3, 1], [3, 3], [1, 3]]),
operation='intersection',
)
assert len(result) == 1
assert_allclose(result[0].get_bounds_single(), [[1, 1], [2, 2]], atol=1e-10)
def test_boolean_union_difference_and_xor_with_pyclipper() -> None:
pytest.importorskip("pyclipper")
from masque.utils.boolean import boolean
rect_a = Polygon([[0, 0], [2, 0], [2, 2], [0, 2]])
rect_b = Polygon([[1, 1], [3, 1], [3, 3], [1, 3]])
union = boolean([rect_a], [rect_b], operation='union')
assert len(union) == 1
assert_allclose(union[0].get_bounds_single(), [[0, 0], [3, 3]], atol=1e-10)
assert_allclose(_poly_area(union[0]), 7, atol=1e-10)
difference = boolean([rect_a], [rect_b], operation='difference')
assert len(difference) == 1
assert_allclose(difference[0].get_bounds_single(), [[0, 0], [2, 2]], atol=1e-10)
assert_allclose(_poly_area(difference[0]), 3, atol=1e-10)
xor = boolean([rect_a], [rect_b], operation='xor')
assert len(xor) == 2
assert_allclose(sorted(_poly_area(poly) for poly in xor), [3, 3], atol=1e-10)
xor_bounds = sorted(tuple(map(tuple, poly.get_bounds_single())) for poly in xor)
assert xor_bounds == [((0.0, 0.0), (2.0, 2.0)), ((1.0, 1.0), (3.0, 3.0))]
def test_boolean_accepts_raw_vertices_and_single_shape_inputs() -> None:
pytest.importorskip("pyclipper")
from masque.utils.boolean import boolean
raw_result = boolean(
[numpy.array([[0, 0], [2, 0], [2, 2], [0, 2]])],
numpy.array([[1, 1], [3, 1], [3, 3], [1, 3]]),
operation='intersection',
)
assert len(raw_result) == 1
assert_allclose(raw_result[0].get_bounds_single(), [[1, 1], [2, 2]], atol=1e-10)
assert_allclose(_poly_area(raw_result[0]), 1, atol=1e-10)
single_shape_result = boolean(
Polygon([[0, 0], [2, 0], [2, 2], [0, 2]]),
Polygon([[1, 1], [3, 1], [3, 3], [1, 3]]),
operation='intersection',
)
assert len(single_shape_result) == 1
assert_allclose(single_shape_result[0].get_bounds_single(), [[1, 1], [2, 2]], atol=1e-10)
def test_boolean_handles_multi_polygon_inputs() -> None:
pytest.importorskip("pyclipper")
from masque.utils.boolean import boolean
result = boolean(
[
Polygon([[0, 0], [2, 0], [2, 2], [0, 2]]),
Polygon([[10, 0], [12, 0], [12, 2], [10, 2]]),
],
[
Polygon([[1, 1], [3, 1], [3, 3], [1, 3]]),
Polygon([[11, 1], [13, 1], [13, 3], [11, 3]]),
],
operation='intersection',
)
assert len(result) == 2
assert_allclose(sorted(_poly_area(poly) for poly in result), [1, 1], atol=1e-10)
result_bounds = sorted(tuple(map(tuple, poly.get_bounds_single())) for poly in result)
assert result_bounds == [((1.0, 1.0), (2.0, 2.0)), ((11.0, 1.0), (12.0, 2.0))]
def test_boolean_difference_preserves_hole_area_via_bridged_polygon() -> None:
pytest.importorskip("pyclipper")
from masque.utils.boolean import boolean
outer = Polygon([[0, 0], [10, 0], [10, 10], [0, 10]])
hole = Polygon([[2, 2], [8, 2], [8, 8], [2, 8]])
result = boolean([outer], [hole], operation='difference')
assert len(result) == 1
assert_allclose(result[0].get_bounds_single(), [[0, 0], [10, 10]], atol=1e-10)
assert_allclose(_poly_area(result[0]), 64, atol=1e-10)
def test_boolean_nested_hole_and_island_case() -> None:
pytest.importorskip("pyclipper")
from masque.utils.boolean import boolean
outer = Polygon([[0, 0], [10, 0], [10, 10], [0, 10]])
hole = Polygon([[2, 2], [8, 2], [8, 8], [2, 8]])
island = Polygon([[4, 4], [6, 4], [6, 6], [4, 6]])
result = boolean([outer, island], [hole], operation='union')
assert len(result) == 1
assert_allclose(result[0].get_bounds_single(), [[0, 0], [10, 10]], atol=1e-10)
assert_allclose(_poly_area(result[0]), 100, atol=1e-10)
def test_boolean_empty_inputs_follow_set_semantics() -> None:
pytest.importorskip("pyclipper")
from masque.utils.boolean import boolean
rect = Polygon([[1, 1], [3, 1], [3, 3], [1, 3]])
union = boolean([], [rect], operation='union')
assert len(union) == 1
assert_allclose(union[0].get_bounds_single(), [[1, 1], [3, 3]], atol=1e-10)
intersection = boolean([], [rect], operation='intersection')
assert intersection == []
difference = boolean([], [rect], operation='difference')
assert difference == []
xor = boolean([], [rect], operation='xor')
assert len(xor) == 1
assert_allclose(xor[0].get_bounds_single(), [[1, 1], [3, 3]], atol=1e-10)
clip_empty_union = boolean([rect], [], operation='union')
assert len(clip_empty_union) == 1
assert_allclose(clip_empty_union[0].get_bounds_single(), [[1, 1], [3, 3]], atol=1e-10)
clip_empty_intersection = boolean([rect], [], operation='intersection')
assert clip_empty_intersection == []
clip_empty_difference = boolean([rect], [], operation='difference')
assert len(clip_empty_difference) == 1
assert_allclose(clip_empty_difference[0].get_bounds_single(), [[1, 1], [3, 3]], atol=1e-10)
clip_empty_xor = boolean([rect], [], operation='xor')
assert len(clip_empty_xor) == 1
assert_allclose(clip_empty_xor[0].get_bounds_single(), [[1, 1], [3, 3]], atol=1e-10)
def test_boolean_invalid_inputs_raise_pattern_error() -> None:
pytest.importorskip("pyclipper")
from masque.utils.boolean import boolean
rect = Polygon([[0, 0], [1, 0], [1, 1], [0, 1]])
for bad in (123, object(), [123]):
with pytest.raises(PatternError, match='Unsupported type'):
boolean([rect], bad, operation='intersection')

masque/test/test_builder.py (new file, +163 lines)

@ -0,0 +1,163 @@
import numpy
import pytest
from numpy.testing import assert_equal, assert_allclose
from numpy import pi
from ..builder import Pather
from ..builder.utils import ell
from ..error import BuildError
from ..library import Library
from ..pattern import Pattern
from ..ports import Port
def test_builder_init() -> None:
lib = Library()
b = Pather(lib, name="mypat")
assert b.pattern is lib["mypat"]
assert b.library is lib
def test_builder_place() -> None:
lib = Library()
child = Pattern()
child.ports["A"] = Port((0, 0), 0)
lib["child"] = child
b = Pather(lib)
b.place("child", offset=(10, 20), port_map={"A": "child_A"})
assert "child_A" in b.ports
assert_equal(b.ports["child_A"].offset, [10, 20])
assert "child" in b.pattern.refs
def test_builder_plug() -> None:
lib = Library()
wire = Pattern()
wire.ports["in"] = Port((0, 0), 0)
wire.ports["out"] = Port((10, 0), pi)
lib["wire"] = wire
b = Pather(lib)
b.ports["start"] = Port((100, 100), 0)
# Plug wire's "in" port into builder's "start" port
# Wire's "out" port should be renamed to "start" because thru=True (default) and wire has 2 ports
# builder start: (100, 100) rotation 0
# wire in: (0, 0) rotation 0
# wire out: (10, 0) rotation pi
# Plugging wire in (rot 0) to builder start (rot 0) means wire is rotated by pi (180 deg)
# so wire in is at (100, 100), wire out is at (100 - 10, 100) = (90, 100)
b.plug("wire", map_in={"start": "in"})
assert "start" in b.ports
assert_equal(b.ports["start"].offset, [90, 100])
assert b.ports["start"].rotation is not None
assert_allclose(b.ports["start"].rotation, 0, atol=1e-10)
def test_builder_interface() -> None:
lib = Library()
source = Pattern()
source.ports["P1"] = Port((0, 0), 0)
lib["source"] = source
b = Pather.interface("source", library=lib, name="iface")
assert "in_P1" in b.ports
assert "P1" in b.ports
assert b.pattern is lib["iface"]
def test_builder_set_dead() -> None:
lib = Library()
lib["sub"] = Pattern()
b = Pather(lib)
b.set_dead()
b.place("sub")
assert not b.pattern.has_refs()
def test_builder_dead_ports() -> None:
lib = Library()
pat = Pattern()
pat.ports['A'] = Port((0, 0), 0)
b = Pather(lib, pattern=pat)
b.set_dead()
# Attempt to plug a device where ports don't line up
# A has rotation 0, C has rotation 0. plug() expects opposing rotations (pi difference).
other = Pattern(ports={'C': Port((10, 10), 0), 'D': Port((20, 20), 0)})
# This should NOT raise PortError because b is dead
b.plug(other, map_in={'A': 'C'}, map_out={'D': 'B'})
# Port A should be removed, and Port B (renamed from D) should be added
assert 'A' not in b.ports
assert 'B' in b.ports
# Verify geometry was not added
assert not b.pattern.has_refs()
assert not b.pattern.has_shapes()
def test_dead_plug_best_effort() -> None:
lib = Library()
pat = Pattern()
pat.ports['A'] = Port((0, 0), 0)
b = Pather(lib, pattern=pat)
b.set_dead()
# Device with multiple ports, none of which line up correctly
other = Pattern(ports={
'P1': Port((10, 10), 0), # Wrong rotation (0 instead of pi)
'P2': Port((20, 20), pi) # Correct rotation but wrong offset
})
# Try to plug. find_transform will fail.
# It should fall back to aligning the first pair ('A' and 'P1').
b.plug(other, map_in={'A': 'P1'}, map_out={'P2': 'B'})
assert 'A' not in b.ports
assert 'B' in b.ports
# Dummy transform aligns A (0,0) with P1 (10,10)
# A rotation 0, P1 rotation 0 -> rotation = (0 - 0 - pi) = -pi
# P2 (20,20) rotation pi:
# 1. Translate P2 so P1 is at origin: (20,20) - (10,10) = (10,10)
# 2. Rotate (10,10) by -pi: (-10,-10)
# 3. Translate by s_port.offset (0,0): (-10,-10)
assert_allclose(b.ports['B'].offset, [-10, -10], atol=1e-10)
# P2 rot pi + transform rot -pi = 0
assert b.ports['B'].rotation is not None
assert_allclose(b.ports['B'].rotation, 0, atol=1e-10)
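The fallback-transform steps described in the comments can be checked numerically; this is a standalone sketch of the translate/rotate/translate chain, not masque's `find_transform`:

```python
import numpy as np

# Best-effort alignment from the comments: rotation = r_A - r_P1 - pi, then
# P2 is translated so P1 sits at the origin, rotated, and moved onto A.
theta = 0 - 0 - np.pi
c, s = np.cos(theta), np.sin(theta)
rel = np.array([20.0, 20.0]) - np.array([10.0, 10.0])          # P2 relative to P1
rotated = np.array([c * rel[0] - s * rel[1], s * rel[0] + c * rel[1]])
b_offset = rotated + np.array([0.0, 0.0])                      # translate onto A
b_rotation = (np.pi + theta) % (2 * np.pi)                     # P2's rot pi + theta
# b_offset comes out near (-10, -10) and b_rotation near 0.
```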
def test_ell_validates_spacing_length() -> None:
ports = {
'A': Port((0, 0), 0),
'B': Port((0, 1), 0),
'C': Port((0, 2), 0),
}
with pytest.raises(BuildError, match='spacing must be scalar or have length 2'):
ell(ports, True, 'min_extension', 5, spacing=[1, 2, 3])
with pytest.raises(BuildError, match='spacing must be scalar or have length 2'):
ell(ports, True, 'min_extension', 5, spacing=[])
def test_ell_handles_array_spacing_when_ccw_none() -> None:
ports = {
'A': Port((0, 0), 0),
'B': Port((0, 1), 0),
}
scalar = ell(ports, None, 'min_extension', 5, spacing=0)
array_zero = ell(ports, None, 'min_extension', 5, spacing=numpy.array([0, 0]))
assert scalar == array_zero
with pytest.raises(BuildError, match='Spacing must be 0 or None'):
ell(ports, None, 'min_extension', 5, spacing=numpy.array([1, 0]))

masque/test/test_dxf.py (new file, +184 lines)

@ -0,0 +1,184 @@
import io
import numpy
import ezdxf
from numpy.testing import assert_allclose
from pathlib import Path
from ..pattern import Pattern
from ..library import Library
from ..shapes import Path as MPath, Polygon
from ..repetition import Grid
from ..file import dxf
def _matches_open_path(actual: numpy.ndarray, expected: numpy.ndarray) -> bool:
return bool(
numpy.allclose(actual, expected)
or numpy.allclose(actual, expected[::-1])
)
def _matches_closed_vertices(actual: numpy.ndarray, expected: numpy.ndarray) -> bool:
return {tuple(row) for row in actual.tolist()} == {tuple(row) for row in expected.tolist()}
def test_dxf_roundtrip(tmp_path: Path):
lib = Library()
pat = Pattern()
# 1. Polygon (closed)
poly_verts = numpy.array([[0, 0], [10, 0], [10, 10], [0, 10]])
pat.polygon("1", vertices=poly_verts)
# 2. Path (open, 3 points)
path_verts = numpy.array([[20, 0], [30, 0], [30, 10]])
pat.path("2", vertices=path_verts, width=2)
# 3. Path (open, 2 points) - Testing the fix for 2-point polylines
path2_verts = numpy.array([[40, 0], [50, 10]])
pat.path("3", vertices=path2_verts, width=0)  # width 0 ensures it reads back as a Path rather than being polygonized
# 4. Ref with Grid repetition (Manhattan)
subpat = Pattern()
subpat.polygon("sub", vertices=[[0, 0], [1, 0], [1, 1]])
lib["sub"] = subpat
pat.ref("sub", offset=(100, 100), repetition=Grid(a_vector=(10, 0), a_count=2, b_vector=(0, 10), b_count=3))
lib["top"] = pat
dxf_file = tmp_path / "test.dxf"
dxf.writefile(lib, "top", dxf_file)
read_lib, _ = dxf.readfile(dxf_file)
# In DXF read, the top level is usually called "Model"
top_pat = read_lib.get("Model") or read_lib.get("top") or list(read_lib.values())[0]
# Verify Polygon
polys = [s for s in top_pat.shapes["1"] if isinstance(s, Polygon)]
assert len(polys) >= 1
poly_read = polys[0]
assert _matches_closed_vertices(poly_read.vertices, poly_verts)
# Verify 3-point Path
paths = [s for s in top_pat.shapes["2"] if isinstance(s, MPath)]
assert len(paths) >= 1
path_read = paths[0]
assert _matches_open_path(path_read.vertices, path_verts)
assert path_read.width == 2
# Verify 2-point Path
paths2 = [s for s in top_pat.shapes["3"] if isinstance(s, MPath)]
assert len(paths2) >= 1
path2_read = paths2[0]
assert _matches_open_path(path2_read.vertices, path2_verts)
assert path2_read.width == 0
# Verify Ref with Grid
# Finding the sub pattern name might be tricky because of how DXF stores blocks
# but "sub" should be in read_lib
assert "sub" in read_lib
# Check refs in the top pattern
found_grid = False
for target, reflist in top_pat.refs.items():
# DXF names might be case-insensitive or modified, but ezdxf usually preserves them
if target.upper() == "SUB":
for ref in reflist:
if isinstance(ref.repetition, Grid):
assert ref.repetition.a_count == 2
assert ref.repetition.b_count == 3
assert_allclose(ref.repetition.a_vector, (10, 0))
assert_allclose(ref.repetition.b_vector, (0, 10))
found_grid = True
assert found_grid, f"Manhattan Grid repetition should have been preserved. Targets: {list(top_pat.refs.keys())}"
def test_dxf_manhattan_precision(tmp_path: Path):
# Test that float precision doesn't break Manhattan grid detection
lib = Library()
sub = Pattern()
sub.polygon("1", vertices=[[0, 0], [1, 0], [1, 1]])
lib["sub"] = sub
top = Pattern()
# 90 degree rotation: in masque the grid is NOT rotated, so it stays [[10,0],[0,10]]
# In DXF, an array with rotation 90 has basis vectors [[0,10],[-10,0]].
# So a masque grid [[10,0],[0,10]] with ref rotation 90 matches a DXF array.
angle = numpy.pi / 2 # 90 degrees
top.ref("sub", offset=(0, 0), rotation=angle,
repetition=Grid(a_vector=(10, 0), a_count=2, b_vector=(0, 10), b_count=2))
lib["top"] = top
dxf_file = tmp_path / "precision.dxf"
dxf.writefile(lib, "top", dxf_file)
# If the isclose() fix works, this should still be a Grid when read back
read_lib, _ = dxf.readfile(dxf_file)
read_top = read_lib.get("Model") or read_lib.get("top") or list(read_lib.values())[0]
target_name = next(k for k in read_top.refs if k.upper() == "SUB")
ref = read_top.refs[target_name][0]
assert isinstance(ref.repetition, Grid), "Grid should be preserved for 90-degree rotation"
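The DXF basis-vector claim in the comment, that a 90-degree array rotation maps [[10,0],[0,10]] to [[0,10],[-10,0]], is plain rotation-matrix arithmetic:

```python
import numpy

# Rotating each grid basis vector v by 90 degrees: v -> R(pi/2) @ v.
theta = numpy.pi / 2
R = numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
                 [numpy.sin(theta),  numpy.cos(theta)]])
a_rot = R @ numpy.array([10.0, 0.0])   # approximately (0, 10)
b_rot = R @ numpy.array([0.0, 10.0])   # approximately (-10, 0)
```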
def test_dxf_rotated_grid_roundtrip_preserves_basis_and_counts(tmp_path: Path):
lib = Library()
sub = Pattern()
sub.polygon("1", vertices=[[0, 0], [1, 0], [1, 1]])
lib["sub"] = sub
top = Pattern()
top.ref(
"sub",
offset=(0, 0),
rotation=numpy.pi / 2,
repetition=Grid(a_vector=(10, 0), a_count=3, b_vector=(0, 20), b_count=2),
)
lib["top"] = top
dxf_file = tmp_path / "rotated_grid.dxf"
dxf.writefile(lib, "top", dxf_file)
read_lib, _ = dxf.readfile(dxf_file)
read_top = read_lib.get("Model") or read_lib.get("top") or list(read_lib.values())[0]
target_name = next(k for k in read_top.refs if k.upper() == "SUB")
ref = read_top.refs[target_name][0]
assert isinstance(ref.repetition, Grid)
actual = ref.repetition.displacements
expected = Grid(a_vector=(10, 0), a_count=3, b_vector=(0, 20), b_count=2).displacements
assert_allclose(
actual[numpy.lexsort((actual[:, 1], actual[:, 0]))],
expected[numpy.lexsort((expected[:, 1], expected[:, 0]))],
)
def test_dxf_read_legacy_polyline() -> None:
doc = ezdxf.new()
msp = doc.modelspace()
msp.add_polyline2d([(0, 0), (10, 0), (10, 10)], dxfattribs={"layer": "legacy"}).close(True)
stream = io.StringIO()
doc.write(stream)
stream.seek(0)
read_lib, _ = dxf.read(stream)
top_pat = read_lib.get("Model") or list(read_lib.values())[0]
polys = [shape for shape in top_pat.shapes["legacy"] if isinstance(shape, Polygon)]
assert len(polys) == 1
assert _matches_closed_vertices(polys[0].vertices, numpy.array([[0, 0], [10, 0], [10, 10]]))
def test_dxf_read_ignores_unreferenced_setup_blocks() -> None:
lib = Library({"top": Pattern()})
stream = io.StringIO()
dxf.write(lib, "top", stream)
stream.seek(0)
read_lib, _ = dxf.read(stream)
assert set(read_lib) == {"Model"}

24 masque/test/test_fdfd.py Normal file
@@ -0,0 +1,24 @@
# ruff: noqa: ARG001
import dataclasses
import pytest # type: ignore
import numpy
from numpy import pi
from numpy.typing import NDArray
# from numpy.testing import assert_allclose, assert_array_equal
from .. import Pattern, Arc, Circle
def test_circle_mirror():
cc = Circle(radius=4, offset=(10, 20))
cc.flip_across(axis=0) # flip across y=0
assert cc.offset[0] == 10
assert cc.offset[1] == -20
assert cc.radius == 4
cc.flip_across(axis=1) # flip across x=0
assert cc.offset[0] == -10
assert cc.offset[1] == -20
assert cc.radius == 4

@@ -0,0 +1,179 @@
from pathlib import Path
from typing import cast
import pytest
from numpy.testing import assert_allclose
from ..pattern import Pattern
from ..library import Library
from ..shapes import Path as MPath, Circle, Polygon, RectCollection
from ..repetition import Grid, Arbitrary
def create_test_library(for_gds: bool = False) -> Library:
lib = Library()
# 1. Polygons
pat_poly = Pattern()
pat_poly.polygon((1, 0), vertices=[[0, 0], [10, 0], [5, 10]])
lib["polygons"] = pat_poly
# 2. Paths with different endcaps
pat_paths = Pattern()
# Flush
pat_paths.path((2, 0), vertices=[[0, 0], [20, 0]], width=2, cap=MPath.Cap.Flush)
# Square
pat_paths.path((2, 1), vertices=[[0, 10], [20, 10]], width=2, cap=MPath.Cap.Square)
# Circle (Only for GDS)
if for_gds:
pat_paths.path((2, 2), vertices=[[0, 20], [20, 20]], width=2, cap=MPath.Cap.Circle)
# SquareCustom
pat_paths.path((2, 3), vertices=[[0, 30], [20, 30]], width=2, cap=MPath.Cap.SquareCustom, cap_extensions=(1, 5))
lib["paths"] = pat_paths
# 3. Circles (only for OASIS or polygonized for GDS)
pat_circles = Pattern()
if for_gds:
# GDS writer calls to_polygons() for non-supported shapes,
# but we can also pre-polygonize
pat_circles.shapes[(3, 0)].append(Circle(radius=5, offset=(10, 10)).to_polygons()[0])
else:
pat_circles.shapes[(3, 0)].append(Circle(radius=5, offset=(10, 10)))
lib["circles"] = pat_circles
# 4. Refs with repetitions
pat_refs = Pattern()
# Simple Ref
pat_refs.ref("polygons", offset=(0, 0))
# Ref with Grid repetition
pat_refs.ref("polygons", offset=(100, 0), repetition=Grid(a_vector=(20, 0), a_count=3, b_vector=(0, 20), b_count=2))
# Ref with Arbitrary repetition
pat_refs.ref("polygons", offset=(0, 100), repetition=Arbitrary(displacements=[[0, 0], [10, 20], [30, -10]]))
lib["refs"] = pat_refs
# 5. Shapes with repetitions (OASIS only, must be wrapped for GDS)
pat_rep_shapes = Pattern()
poly_rep = Polygon(vertices=[[0, 0], [5, 0], [5, 5], [0, 5]], repetition=Grid(a_vector=(10, 0), a_count=5))
pat_rep_shapes.shapes[(4, 0)].append(poly_rep)
lib["rep_shapes"] = pat_rep_shapes
if for_gds:
lib.wrap_repeated_shapes()
return lib
def test_gdsii_full_roundtrip(tmp_path: Path) -> None:
from ..file import gdsii
lib = create_test_library(for_gds=True)
gds_file = tmp_path / "full_test.gds"
gdsii.writefile(lib, gds_file, meters_per_unit=1e-9)
read_lib, _ = gdsii.readfile(gds_file)
# Check existence
for name in lib:
assert name in read_lib
# Check Paths
read_paths = read_lib["paths"]
# Check caps (GDS stores them as path_type). Read-back shapes are grouped
# by layer, so each cap can be looked up by its (layer, datatype) key.
p_flush = cast("MPath", read_paths.shapes[(2, 0)][0])
assert p_flush.cap == MPath.Cap.Flush
p_square = cast("MPath", read_paths.shapes[(2, 1)][0])
assert p_square.cap == MPath.Cap.Square
p_circle = cast("MPath", read_paths.shapes[(2, 2)][0])
assert p_circle.cap == MPath.Cap.Circle
p_custom = cast("MPath", read_paths.shapes[(2, 3)][0])
assert p_custom.cap == MPath.Cap.SquareCustom
assert p_custom.cap_extensions is not None
assert_allclose(p_custom.cap_extensions, (1, 5))
# Check Refs with repetitions
read_refs = read_lib["refs"]
assert len(read_refs.refs["polygons"]) >= 3 # Simple, Grid (becomes 1 AREF), Arbitrary (becomes 3 SREFs)
# AREF check
arefs = [r for r in read_refs.refs["polygons"] if r.repetition is not None]
assert len(arefs) == 1
assert isinstance(arefs[0].repetition, Grid)
assert arefs[0].repetition.a_count == 3
assert arefs[0].repetition.b_count == 2
# Check wrapped shapes
# lib.wrap_repeated_shapes() created new patterns
# Original pattern "rep_shapes" now should have a Ref
assert len(read_lib["rep_shapes"].refs) > 0
def test_oasis_full_roundtrip(tmp_path: Path) -> None:
pytest.importorskip("fatamorgana")
from ..file import oasis
lib = create_test_library(for_gds=False)
oas_file = tmp_path / "full_test.oas"
oasis.writefile(lib, oas_file, units_per_micron=1000)
read_lib, _ = oasis.readfile(oas_file)
# Check existence
for name in lib:
assert name in read_lib
# Check Circle
read_circles = read_lib["circles"]
assert isinstance(read_circles.shapes[(3, 0)][0], Circle)
assert read_circles.shapes[(3, 0)][0].radius == 5
# Check Path caps
read_paths = read_lib["paths"]
assert cast("MPath", read_paths.shapes[(2, 0)][0]).cap == MPath.Cap.Flush
assert cast("MPath", read_paths.shapes[(2, 1)][0]).cap == MPath.Cap.Square
# OASIS HalfWidth corresponds to masque's Square cap. Circle caps are not
# representable: masque/file/oasis.py defines
#   path_cap_map = {
#       PathExtensionScheme.Flush: Path.Cap.Flush,
#       PathExtensionScheme.HalfWidth: Path.Cap.Square,
#       PathExtensionScheme.Arbitrary: Path.Cap.SquareCustom,
#       }
# and _shapes_to_elements() resolves the cap via
#   path_type = next(k for k, v in path_cap_map.items() if v == shape.cap)
# which raises StopIteration for Cap.Circle. This is why
# create_test_library() only adds Circle-capped paths when for_gds=True.
# Check Shape repetition
read_rep_shapes = read_lib["rep_shapes"]
poly = read_rep_shapes.shapes[(4, 0)][0]
assert poly.repetition is not None
assert isinstance(poly.repetition, Grid)
assert poly.repetition.a_count == 5
def test_gdsii_rect_collection_roundtrip(tmp_path: Path) -> None:
from ..file import gdsii
lib = Library()
pat = Pattern()
pat.shapes[(5, 0)].append(
RectCollection(
rects=[[0, 0, 10, 5], [20, -5, 30, 10]],
annotations={'1': ['rects']},
)
)
lib['rects'] = pat
gds_file = tmp_path / 'rect_collection.gds'
gdsii.writefile(lib, gds_file, meters_per_unit=1e-9)
read_lib, _ = gdsii.readfile(gds_file)
polys = read_lib['rects'].shapes[(5, 0)]
assert len(polys) == 2
assert all(isinstance(poly, Polygon) for poly in polys)
assert_allclose(polys[0].vertices, [[0, 0], [0, 5], [10, 5], [10, 0]])
assert_allclose(polys[1].vertices, [[20, -5], [20, 10], [30, 10], [30, -5]])
assert polys[0].annotations == {'1': ['rects']}
assert polys[1].annotations == {'1': ['rects']}

80 masque/test/test_gdsii.py Normal file
@@ -0,0 +1,80 @@
from pathlib import Path
from typing import cast
import numpy
import pytest
from numpy.testing import assert_equal, assert_allclose
from ..error import LibraryError
from ..pattern import Pattern
from ..library import Library
from ..file import gdsii
from ..shapes import Path as MPath, Polygon
def test_gdsii_roundtrip(tmp_path: Path) -> None:
lib = Library()
# Simple polygon cell
pat1 = Pattern()
pat1.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 10], [0, 10]])
lib["poly_cell"] = pat1
# Path cell
pat2 = Pattern()
pat2.path((2, 5), vertices=[[0, 0], [100, 0]], width=10)
lib["path_cell"] = pat2
# Cell with Ref
pat3 = Pattern()
pat3.ref("poly_cell", offset=(50, 50), rotation=numpy.pi / 2)
lib["ref_cell"] = pat3
gds_file = tmp_path / "test.gds"
gdsii.writefile(lib, gds_file, meters_per_unit=1e-9)
read_lib, info = gdsii.readfile(gds_file)
assert "poly_cell" in read_lib
assert "path_cell" in read_lib
assert "ref_cell" in read_lib
# Check polygon
read_poly = cast("Polygon", read_lib["poly_cell"].shapes[(1, 0)][0])
# GDSII closes polygons, so it might have an extra vertex or different order
assert len(read_poly.vertices) >= 4
# Check bounds as a proxy for geometry correctness
assert_equal(read_lib["poly_cell"].get_bounds(), [[0, 0], [10, 10]])
# Check path
read_path = cast("MPath", read_lib["path_cell"].shapes[(2, 5)][0])
assert isinstance(read_path, MPath)
assert read_path.width == 10
assert_equal(read_path.vertices, [[0, 0], [100, 0]])
# Check Ref
read_ref = read_lib["ref_cell"].refs["poly_cell"][0]
assert_equal(read_ref.offset, [50, 50])
assert_allclose(read_ref.rotation, numpy.pi / 2, atol=1e-5)
def test_gdsii_annotations(tmp_path: Path) -> None:
lib = Library()
pat = Pattern()
# GDS only supports integer keys in range [1, 126] for properties
pat.polygon((1, 0), vertices=[[0, 0], [1, 0], [1, 1]], annotations={"1": ["hello"]})
lib["cell"] = pat
gds_file = tmp_path / "test_ann.gds"
gdsii.writefile(lib, gds_file, meters_per_unit=1e-9)
read_lib, _ = gdsii.readfile(gds_file)
read_ann = read_lib["cell"].shapes[(1, 0)][0].annotations
assert read_ann is not None
assert read_ann["1"] == ["hello"]
def test_gdsii_check_valid_names_validates_generator_lengths() -> None:
names = (name for name in ("a" * 40,))
with pytest.raises(LibraryError, match="invalid names"):
gdsii.check_valid_names(names)

@@ -0,0 +1,507 @@
from pathlib import Path
import numpy
import pytest
pytest.importorskip('pyarrow')
from .. import Ref, Label
from ..library import Library
from ..pattern import Pattern
from ..repetition import Grid
from ..shapes import Path as MPath, Polygon, PolyCollection, RectCollection
from ..file import gdsii, gdsii_arrow
from ..file.gdsii_perf import write_fixture
if not gdsii_arrow.is_available():
pytest.skip('klamath_rs_ext shared library is not available', allow_module_level=True)
def _annotations_key(annotations: dict[str, list[object]] | None) -> tuple[tuple[str, tuple[object, ...]], ...] | None:
if not annotations:
return None
return tuple(sorted((key, tuple(values)) for key, values in annotations.items()))
def _coord_key(values: object) -> tuple[int, ...] | tuple[tuple[int, int], ...]:
arr = numpy.rint(numpy.asarray(values, dtype=float)).astype(int)
if arr.ndim == 1:
return tuple(arr.tolist())
return tuple(tuple(row.tolist()) for row in arr)
def _canonical_polygon_key(vertices: object) -> tuple[tuple[int, int], ...]:
arr = numpy.rint(numpy.asarray(vertices, dtype=float)).astype(int)
rows = [tuple(tuple(row.tolist()) for row in numpy.roll(arr, -shift, axis=0)) for shift in range(arr.shape[0])]
rev = arr[::-1]
rows.extend(tuple(tuple(row.tolist()) for row in numpy.roll(rev, -shift, axis=0)) for shift in range(rev.shape[0]))
return min(rows)
def _shape_key(shape: object, layer: tuple[int, int]) -> list[tuple[object, ...]]:
if isinstance(shape, MPath):
cap_extensions = None if shape.cap_extensions is None else _coord_key(shape.cap_extensions)
return [(
'path',
layer,
_coord_key(shape.vertices),
_coord_key(shape.offset),
int(round(float(shape.width))),
shape.cap.name,
cap_extensions,
_annotations_key(shape.annotations),
)]
keys = []
for poly in shape.to_polygons():
keys.append((
'polygon',
layer,
_canonical_polygon_key(poly.vertices),
_coord_key(poly.offset),
_annotations_key(poly.annotations),
))
return keys
def _ref_keys(target: str, ref: object) -> list[tuple[object, ...]]:
keys = []
for transform in ref.as_transforms():
keys.append((
target,
_coord_key(transform[:2]),
round(float(transform[2]), 8),
round(float(transform[4]), 8),
bool(int(round(float(transform[3])))),
_annotations_key(ref.annotations),
))
return keys
def _label_key(layer: tuple[int, int], label: object) -> tuple[object, ...]:
return (
layer,
label.string,
_coord_key(label.offset),
_annotations_key(label.annotations),
)
def _pattern_summary(pattern: Pattern) -> dict[str, object]:
shape_keys: list[tuple[object, ...]] = []
for layer, shapes in pattern.shapes.items():
for shape in shapes:
shape_keys.extend(_shape_key(shape, layer))
ref_keys: list[tuple[object, ...]] = []
for target, refs in pattern.refs.items():
for ref in refs:
ref_keys.extend(_ref_keys(target, ref))
label_keys = [
_label_key(layer, label)
for layer, labels in pattern.labels.items()
for label in labels
]
return {
'shapes': sorted(shape_keys),
'refs': sorted(ref_keys),
'labels': sorted(label_keys),
}
def _library_summary(lib: Library) -> dict[str, dict[str, object]]:
return {name: _pattern_summary(pattern) for name, pattern in lib.items()}
def _make_arrow_test_library() -> Library:
lib = Library()
leaf = Pattern()
leaf.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 10], [0, 10]], annotations={'1': ['leaf-poly']})
leaf.polygon((2, 0), vertices=[[40, 0], [50, 0], [50, 10], [40, 10]])
leaf.polygon((1, 0), vertices=[[20, 0], [30, 0], [30, 10], [20, 10]])
leaf.polygon((1, 0), vertices=[[80, 0], [90, 0], [90, 10], [80, 10]])
leaf.polygon((2, 0), vertices=[[60, 0], [70, 0], [70, 10], [60, 10]], annotations={'18': ['leaf-poly-2']})
leaf.label((10, 0), string='LEAF', offset=(3, 4), annotations={'10': ['leaf-label']})
lib['leaf'] = leaf
child = Pattern()
child.path(
(2, 0),
vertices=[[0, 0], [15, 5], [30, 5]],
width=6,
cap=MPath.Cap.SquareCustom,
cap_extensions=(2, 4),
annotations={'2': ['child-path']},
)
child.label((11, 0), string='CHILD', offset=(7, 8), annotations={'11': ['child-label']})
child.ref('leaf', offset=(100, 200), rotation=numpy.pi / 2, mirrored=True, scale=1.25, annotations={'12': ['child-ref']})
lib['child'] = child
sibling = Pattern()
sibling.polygon((3, 0), vertices=[[0, 0], [5, 0], [5, 6], [0, 6]])
sibling.label((12, 0), string='SIB', offset=(1, 2), annotations={'13': ['sib-label']})
sibling.ref(
'leaf',
offset=(-50, 60),
repetition=Grid(a_vector=(20, 0), a_count=3, b_vector=(0, 30), b_count=2),
annotations={'14': ['sib-ref']},
)
lib['sibling'] = sibling
fanout = Pattern()
fanout.ref('leaf', offset=(0, 0))
fanout.ref('child', offset=(10, 0), mirrored=True, rotation=numpy.pi / 6, scale=1.1)
fanout.ref('leaf', offset=(20, 0))
fanout.ref('leaf', offset=(30, 0), repetition=Grid(a_vector=(5, 0), a_count=2, b_vector=(0, 7), b_count=3))
fanout.ref('child', offset=(40, 0), mirrored=True, rotation=numpy.pi / 4, scale=1.2,
repetition=Grid(a_vector=(9, 0), a_count=2, b_vector=(0, 11), b_count=2))
fanout.ref('leaf', offset=(50, 0), repetition=Grid(a_vector=(6, 0), a_count=3, b_vector=(0, 8), b_count=2))
fanout.ref('leaf', offset=(60, 0), annotations={'19': ['fanout-sref']})
fanout.ref('child', offset=(70, 0), repetition=Grid(a_vector=(4, 0), a_count=2, b_vector=(0, 5), b_count=2),
annotations={'20': ['fanout-aref']})
lib['fanout'] = fanout
top = Pattern()
top.ref('child', offset=(500, 600), annotations={'15': ['top-child-ref']})
top.ref('sibling', offset=(-100, 50), rotation=numpy.pi, annotations={'16': ['top-sibling-ref']})
top.ref('fanout', offset=(250, -75))
top.label((13, 0), string='TOP', offset=(0, 0), annotations={'17': ['top-label']})
lib['top'] = top
return lib
def test_gdsii_arrow_matches_gdsii_readfile(tmp_path: Path) -> None:
lib = _make_arrow_test_library()
gds_file = tmp_path / 'arrow_roundtrip.gds'
gdsii.writefile(lib, gds_file, meters_per_unit=1e-9)
canonical_lib, canonical_info = gdsii.readfile(gds_file)
arrow_lib, arrow_info = gdsii_arrow.readfile(gds_file)
assert canonical_info == arrow_info
assert _library_summary(canonical_lib) == _library_summary(arrow_lib)
def test_gdsii_arrow_readfile_arrow_returns_native_payload(tmp_path: Path) -> None:
gds_file = tmp_path / 'many_cells_native.gds'
manifest = write_fixture(gds_file, preset='many_cells', scale=0.001)
libarr, info = gdsii_arrow.readfile_arrow(gds_file)
assert info['name'] == manifest.library_name
assert libarr['lib_name'].as_py() == manifest.library_name
assert len(libarr['cells']) == manifest.cells
assert 0 < len(libarr['layers']) <= manifest.layers
def test_gdsii_arrow_reads_small_perf_fixture(tmp_path: Path) -> None:
gds_file = tmp_path / 'many_cells_smoke.gds'
manifest = write_fixture(gds_file, preset='many_cells', scale=0.001)
lib, info = gdsii_arrow.readfile(gds_file)
assert info['name'] == manifest.library_name
assert len(lib) == manifest.cells
assert 'TOP' in lib
assert sum(len(refs) for refs in lib['TOP'].refs.values()) > 0
def test_gdsii_arrow_degenerate_aref_decodes_as_single_transform(tmp_path: Path) -> None:
lib = Library()
leaf = Pattern()
leaf.polygon((1, 0), vertices=[[0, 0], [5, 0], [5, 5], [0, 5]])
lib['leaf'] = leaf
top = Pattern()
top.ref('leaf', offset=(100, 200), repetition=Grid(a_vector=(7, 0), a_count=1, b_vector=(0, 9), b_count=1))
lib['top'] = top
gds_file = tmp_path / 'degenerate_aref.gds'
gdsii.writefile(lib, gds_file, meters_per_unit=1e-9)
canonical_lib, _ = gdsii.readfile(gds_file)
arrow_lib, _ = gdsii_arrow.readfile(gds_file)
assert _library_summary(arrow_lib) == _library_summary(canonical_lib)
decoded_ref = arrow_lib['top'].refs['leaf'][0]
assert decoded_ref.repetition is None
def test_gdsii_arrow_plain_srefs_decode_without_arbitrary(tmp_path: Path) -> None:
lib = _make_arrow_test_library()
gds_file = tmp_path / 'plain_srefs.gds'
gdsii.writefile(lib, gds_file, meters_per_unit=1e-9)
arrow_lib, _ = gdsii_arrow.readfile(gds_file)
fanout = arrow_lib['fanout']
plain_leaf_refs = [
ref
for ref in fanout.refs['leaf']
if ref.annotations is None and ref.repetition is None
]
assert len(plain_leaf_refs) == 2
assert all(type(ref.repetition) is not Grid for ref in plain_leaf_refs)
def test_gdsii_arrow_degenerate_aref_schema_normalizes_to_sref(tmp_path: Path) -> None:
lib = Library()
leaf = Pattern()
leaf.polygon((1, 0), vertices=[[0, 0], [5, 0], [5, 5], [0, 5]])
lib['leaf'] = leaf
top = Pattern()
top.ref('leaf', offset=(100, 200), repetition=Grid(a_vector=(7, 0), a_count=1, b_vector=(0, 9), b_count=1))
lib['top'] = top
gds_file = tmp_path / 'degenerate_aref_schema.gds'
gdsii.writefile(lib, gds_file, meters_per_unit=1e-9)
libarr = gdsii_arrow._read_to_arrow(gds_file)[0]
cells = libarr['cells'].values
cell_ids = cells.field('id').to_numpy()
cell_names = libarr['cell_names'].as_py()
top_index = next(ii for ii, cell_id in enumerate(cell_ids) if cell_names[cell_id] == 'top')
srefs = cells.field('srefs')[top_index].as_py()
arefs = cells.field('arefs')[top_index].as_py()
assert len(srefs) == 1
assert len(arefs) == 0
assert cell_names[srefs[0]['target']] == 'leaf'
def test_gdsii_arrow_boundary_batch_schema(tmp_path: Path) -> None:
lib = _make_arrow_test_library()
gds_file = tmp_path / 'arrow_batches.gds'
gdsii.writefile(lib, gds_file, meters_per_unit=1e-9)
libarr = gdsii_arrow._read_to_arrow(gds_file)[0]
cells = libarr['cells'].values
cell_ids = cells.field('id').to_numpy()
cell_names = libarr['cell_names'].as_py()
layer_table = [
((int(layer) >> 16) & 0xFFFF, int(layer) & 0xFFFF)
for layer in libarr['layers'].values.to_numpy()
]
leaf_index = next(ii for ii, cell_id in enumerate(cell_ids) if cell_names[cell_id] == 'leaf')
rect_batches = cells.field('rect_batches')[leaf_index].as_py()
boundary_batches = cells.field('boundary_batches')[leaf_index].as_py()
boundary_props = cells.field('boundary_props')[leaf_index].as_py()
assert len(rect_batches) == 2
assert len(boundary_batches) == 0
assert len(boundary_props) == 2
rects_by_layer = {tuple(layer_table[entry['layer']]): entry for entry in rect_batches}
assert rects_by_layer[(1, 0)]['rects'] == [20, 0, 30, 10, 80, 0, 90, 10]
assert rects_by_layer[(2, 0)]['rects'] == [40, 0, 50, 10]
props_by_layer = {tuple(layer_table[entry['layer']]): entry for entry in boundary_props}
assert sorted(props_by_layer) == [(1, 0), (2, 0)]
assert props_by_layer[(1, 0)]['properties'][0]['value'] == 'leaf-poly'
assert props_by_layer[(2, 0)]['properties'][0]['value'] == 'leaf-poly-2'
def test_gdsii_arrow_rect_batch_schema_for_mixed_layer(tmp_path: Path) -> None:
lib = Library()
top = Pattern()
top.shapes[(1, 0)].append(RectCollection(rects=[[0, 0, 10, 10], [20, 0, 30, 10], [40, 0, 50, 10], [60, 0, 70, 10]]))
top.polygon((1, 0), vertices=[[80, 0], [85, 10], [90, 0]])
top.polygon((1, 0), vertices=[[100, 0], [105, 10], [110, 0]])
lib['top'] = top
gds_file = tmp_path / 'arrow_rect_batches.gds'
gdsii.writefile(lib, gds_file, meters_per_unit=1e-9)
libarr = gdsii_arrow._read_to_arrow(gds_file)[0]
cells = libarr['cells'].values
cell_ids = cells.field('id').to_numpy()
cell_names = libarr['cell_names'].as_py()
layer_table = [
((int(layer) >> 16) & 0xFFFF, int(layer) & 0xFFFF)
for layer in libarr['layers'].values.to_numpy()
]
top_index = next(ii for ii, cell_id in enumerate(cell_ids) if cell_names[cell_id] == 'top')
rect_batches = cells.field('rect_batches')[top_index].as_py()
boundary_batches = cells.field('boundary_batches')[top_index].as_py()
assert len(rect_batches) == 1
assert tuple(layer_table[rect_batches[0]['layer']]) == (1, 0)
assert rect_batches[0]['rects'] == [
0, 0, 10, 10,
20, 0, 30, 10,
40, 0, 50, 10,
60, 0, 70, 10,
]
assert len(boundary_batches) == 1
assert tuple(layer_table[boundary_batches[0]['layer']]) == (1, 0)
assert boundary_batches[0]['vertex_offsets'] == [0, 3]
def test_gdsii_arrow_ref_schema(tmp_path: Path) -> None:
lib = _make_arrow_test_library()
gds_file = tmp_path / 'arrow_ref_batches.gds'
gdsii.writefile(lib, gds_file, meters_per_unit=1e-9)
libarr = gdsii_arrow._read_to_arrow(gds_file)[0]
cells = libarr['cells'].values
cell_ids = cells.field('id').to_numpy()
cell_names = libarr['cell_names'].as_py()
fanout_index = next(ii for ii, cell_id in enumerate(cell_ids) if cell_names[cell_id] == 'fanout')
srefs = cells.field('srefs')[fanout_index].as_py()
arefs = cells.field('arefs')[fanout_index].as_py()
sref_props = cells.field('sref_props')[fanout_index].as_py()
aref_props = cells.field('aref_props')[fanout_index].as_py()
sref_target_ids = [entry['target'] for entry in srefs]
sref_targets = [cell_names[target] for target in sref_target_ids]
assert sorted(sref_targets) == ['child', 'leaf', 'leaf']
assert sref_target_ids == sorted(sref_target_ids)
sref_by_target = {}
for entry in srefs:
sref_by_target.setdefault(cell_names[entry['target']], []).append(entry)
assert [entry['invert_y'] for entry in sref_by_target['child']] == [True]
assert [entry['scale'] for entry in sref_by_target['child']] == pytest.approx([1.1])
assert len(sref_by_target['leaf']) == 2
aref_target_ids = [entry['target'] for entry in arefs]
aref_targets = [cell_names[target] for target in aref_target_ids]
assert sorted(aref_targets) == ['child', 'leaf', 'leaf']
assert aref_target_ids == sorted(aref_target_ids)
aref_by_target = {}
for entry in arefs:
aref_by_target.setdefault(cell_names[entry['target']], []).append(entry)
assert [entry['invert_y'] for entry in aref_by_target['child']] == [True]
assert [entry['scale'] for entry in aref_by_target['child']] == pytest.approx([1.2])
assert len(aref_by_target['leaf']) == 2
assert len(sref_props) == 1
assert cell_names[sref_props[0]['target']] == 'leaf'
assert sref_props[0]['properties'][0]['value'] == 'fanout-sref'
assert len(aref_props) == 1
assert cell_names[aref_props[0]['target']] == 'child'
assert aref_props[0]['properties'][0]['value'] == 'fanout-aref'
def test_raw_ref_grid_label_constructors_match_public() -> None:
raw_grid = Grid._from_raw(
a_vector=numpy.array([20, 0]),
a_count=3,
b_vector=numpy.array([0, 30]),
b_count=2,
)
public_grid = Grid(a_vector=(20, 0), a_count=3, b_vector=(0, 30), b_count=2)
assert raw_grid == public_grid
raw_poly = Polygon._from_raw(
vertices=numpy.array([[0.0, 0.0], [5.0, 0.0], [5.0, 5.0], [0.0, 5.0]]),
annotations={'1': ['poly']},
)
public_poly = Polygon(
vertices=[[0, 0], [5, 0], [5, 5], [0, 5]],
annotations={'1': ['poly']},
)
assert raw_poly == public_poly
raw_poly_collection = PolyCollection._from_raw(
vertex_lists=numpy.array([
[0.0, 0.0], [2.0, 0.0], [2.0, 2.0],
[10.0, 10.0], [12.0, 10.0], [12.0, 12.0],
]),
vertex_offsets=numpy.array([0, 3], dtype=numpy.uint32),
annotations={'2': ['pc']},
)
public_poly_collection = PolyCollection(
vertex_lists=[[0, 0], [2, 0], [2, 2], [10, 10], [12, 10], [12, 12]],
vertex_offsets=[0, 3],
annotations={'2': ['pc']},
)
assert raw_poly_collection == public_poly_collection
assert [tuple(s.indices(len(raw_poly_collection.vertex_lists))) for s in raw_poly_collection.vertex_slices] == [(0, 3, 1), (3, 6, 1)]
raw_rect_collection = RectCollection._from_raw(
rects=numpy.array([[10.0, 10.0, 12.0, 12.0], [0.0, 0.0, 5.0, 5.0]]),
annotations={'3': ['rects']},
)
public_rect_collection = RectCollection(
rects=[[0, 0, 5, 5], [10, 10, 12, 12]],
annotations={'3': ['rects']},
)
assert raw_rect_collection == public_rect_collection
raw_ref_empty = Ref._from_raw(
offset=numpy.array([100, 200]),
rotation=numpy.pi / 2,
mirrored=False,
scale=1.0,
repetition=None,
annotations=None,
)
public_ref_empty = Ref(
offset=(100, 200),
rotation=numpy.pi / 2,
mirrored=False,
scale=1.0,
repetition=None,
annotations=None,
)
assert raw_ref_empty.annotations is None
assert raw_ref_empty == public_ref_empty
raw_ref = Ref._from_raw(
offset=numpy.array([100, 200]),
rotation=numpy.pi / 2,
mirrored=True,
scale=1.25,
repetition=raw_grid,
annotations={'12': ['child-ref']},
)
public_ref = Ref(
offset=(100, 200),
rotation=numpy.pi / 2,
mirrored=True,
scale=1.25,
repetition=public_grid,
annotations={'12': ['child-ref']},
)
assert raw_ref == public_ref
assert numpy.array_equal(raw_ref.as_transforms(), public_ref.as_transforms())
raw_label_empty = Label._from_raw(
'LEAF',
offset=numpy.array([3, 4]),
annotations=None,
)
public_label_empty = Label(
'LEAF',
offset=(3, 4),
annotations=None,
)
assert raw_label_empty.annotations is None
assert raw_label_empty == public_label_empty
raw_label = Label._from_raw(
'LEAF',
offset=numpy.array([3, 4]),
annotations={'10': ['leaf-label']},
)
public_label = Label(
'LEAF',
offset=(3, 4),
annotations={'10': ['leaf-label']},
)
assert raw_label == public_label
assert numpy.array_equal(raw_label.get_bounds_single(), public_label.get_bounds_single())

@@ -0,0 +1,174 @@
from pathlib import Path
import numpy
import pytest
pytest.importorskip('pyarrow')
from ..library import Library
from ..pattern import Pattern
from ..repetition import Grid
from ..file import gdsii, gdsii_lazy_arrow
from ..file.gdsii_perf import write_fixture
if not gdsii_lazy_arrow.is_available():
pytest.skip('klamath_rs_ext shared library is not available', allow_module_level=True)
def _make_small_library() -> Library:
lib = Library()
leaf = Pattern()
leaf.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 5], [0, 5]])
lib['leaf'] = leaf
mid = Pattern()
mid.ref('leaf', offset=(10, 20))
mid.ref('leaf', offset=(40, 0), repetition=Grid(a_vector=(12, 0), a_count=2, b_vector=(0, 9), b_count=2))
lib['mid'] = mid
top = Pattern()
top.ref('mid', offset=(100, 200))
lib['top'] = top
return lib
def test_gdsii_lazy_arrow_loads_perf_fixture(tmp_path: Path) -> None:
gds_file = tmp_path / 'many_cells_lazy.gds'
manifest = write_fixture(gds_file, preset='many_cells', scale=0.001)
lib, info = gdsii_lazy_arrow.readfile(gds_file)
assert info['name'] == manifest.library_name
assert len(lib) == manifest.cells
assert lib.top() == 'TOP'
assert 'TOP' in lib.child_graph(dangling='ignore')
def test_gdsii_lazy_arrow_local_and_global_refs(tmp_path: Path) -> None:
gds_file = tmp_path / 'refs.gds'
src = _make_small_library()
gdsii.writefile(src, gds_file, meters_per_unit=1e-9, library_name='lazy-refs')
lib, _ = gdsii_lazy_arrow.readfile(gds_file)
local = lib.find_refs_local('leaf')
assert set(local) == {'mid'}
assert sum(arr.shape[0] for arr in local['mid']) == 5
global_refs = lib.find_refs_global('leaf')
assert {path for path in global_refs} == {('top', 'mid', 'leaf')}
assert global_refs[('top', 'mid', 'leaf')].shape[0] == 5
def test_gdsii_lazy_arrow_untouched_write_is_copy_through(tmp_path: Path) -> None:
gds_file = tmp_path / 'copy_source.gds'
src = _make_small_library()
gdsii.writefile(src, gds_file, meters_per_unit=1e-9, library_name='copy-through')
lib, info = gdsii_lazy_arrow.readfile(gds_file)
out_file = tmp_path / 'copy_out.gds'
gdsii_lazy_arrow.writefile(
lib,
out_file,
meters_per_unit=info['meters_per_unit'],
logical_units_per_unit=info['logical_units_per_unit'],
library_name=info['name'],
)
assert out_file.read_bytes() == gds_file.read_bytes()
def test_gdsii_lazy_overlay_merge_and_write(tmp_path: Path) -> None:
base_a = Library()
leaf_a = Pattern()
leaf_a.polygon((1, 0), vertices=[[0, 0], [8, 0], [8, 8], [0, 8]])
base_a['leaf'] = leaf_a
top_a = Pattern()
top_a.ref('leaf', offset=(0, 0))
base_a['top_a'] = top_a
base_b = Library()
leaf_b = Pattern()
leaf_b.polygon((2, 0), vertices=[[0, 0], [5, 0], [5, 5], [0, 5]])
base_b['leaf'] = leaf_b
top_b = Pattern()
top_b.ref('leaf', offset=(20, 30))
base_b['top_b'] = top_b
gds_a = tmp_path / 'a.gds'
gds_b = tmp_path / 'b.gds'
gdsii.writefile(base_a, gds_a, meters_per_unit=1e-9, library_name='overlay')
gdsii.writefile(base_b, gds_b, meters_per_unit=1e-9, library_name='overlay')
lib_a, _ = gdsii_lazy_arrow.readfile(gds_a)
lib_b, _ = gdsii_lazy_arrow.readfile(gds_b)
overlay = gdsii_lazy_arrow.OverlayLibrary()
overlay.add_source(lib_a)
rename_map = overlay.add_source(lib_b, rename_theirs=lambda lib, name: lib.get_name(name))
renamed_leaf = rename_map['leaf']
assert rename_map == {'leaf': renamed_leaf}
assert renamed_leaf != 'leaf'
assert len(lib_a._cache) == 0
assert len(lib_b._cache) == 0
overlay.move_references('leaf', renamed_leaf)
out_file = tmp_path / 'overlay_out.gds'
gdsii_lazy_arrow.writefile(overlay, out_file)
roundtrip, _ = gdsii.readfile(out_file)
assert set(roundtrip.keys()) == {'leaf', renamed_leaf, 'top_a', 'top_b'}
assert 'top_b' in roundtrip
assert list(roundtrip['top_b'].refs.keys()) == [renamed_leaf]
def test_gdsii_writer_accepts_overlay_library(tmp_path: Path) -> None:
gds_file = tmp_path / 'overlay_source.gds'
src = _make_small_library()
gdsii.writefile(src, gds_file, meters_per_unit=1e-9, library_name='overlay-src')
lib, info = gdsii_lazy_arrow.readfile(gds_file)
overlay = gdsii_lazy_arrow.OverlayLibrary()
overlay.add_source(lib)
overlay.rename('leaf', 'leaf_copy', move_references=True)
out_file = tmp_path / 'overlay_via_eager_writer.gds'
gdsii.writefile(
overlay,
out_file,
meters_per_unit=info['meters_per_unit'],
logical_units_per_unit=info['logical_units_per_unit'],
library_name=info['name'],
)
roundtrip, _ = gdsii.readfile(out_file)
assert set(roundtrip.keys()) == {'leaf_copy', 'mid', 'top'}
assert list(roundtrip['mid'].refs.keys()) == ['leaf_copy']
def test_svg_writer_uses_detached_materialized_copy(tmp_path: Path) -> None:
pytest.importorskip('svgwrite')
from ..file import svg
from ..shapes import Path as MPath
gds_file = tmp_path / 'svg_source.gds'
src = _make_small_library()
src['top'].path((3, 0), vertices=[[0, 0], [0, 20]], width=4)
gdsii.writefile(src, gds_file, meters_per_unit=1e-9, library_name='svg-src')
lib, _ = gdsii_lazy_arrow.readfile(gds_file)
top_pat = lib['top']
assert list(top_pat.refs.keys()) == ['mid']
assert any(isinstance(shape, MPath) for shape in top_pat.shapes[(3, 0)])
svg_path = tmp_path / 'lazy.svg'
svg.writefile(lib, 'top', str(svg_path))
assert svg_path.exists()
assert list(top_pat.refs.keys()) == ['mid']
assert any(isinstance(shape, MPath) for shape in top_pat.shapes[(3, 0)])

@@ -0,0 +1,24 @@
from dataclasses import asdict
import json
from pathlib import Path
from ..file import gdsii
from ..file.gdsii_perf import fixture_manifest, write_fixture
def test_gdsii_perf_fixture_smoke(tmp_path: Path) -> None:
output = tmp_path / 'many_cells.gds'
manifest = write_fixture(output, preset='many_cells', scale=0.002)
expected = fixture_manifest(output, preset='many_cells', scale=0.002)
assert output.exists()
assert manifest == expected
sidecar = json.loads(output.with_suffix('.gds.json').read_text())
assert sidecar == asdict(manifest)
read_lib, info = gdsii.readfile(output)
assert info['name'] == manifest.library_name
assert len(read_lib) == manifest.cells
assert 'TOP' in read_lib
assert len(read_lib['TOP'].refs) > 0

masque/test/test_label.py Normal file

@@ -0,0 +1,54 @@
import copy
from numpy.testing import assert_equal, assert_allclose
from numpy import pi
from ..label import Label
from ..repetition import Grid
from ..utils import annotations_eq
def test_label_init() -> None:
lbl = Label("test", offset=(10, 20))
assert lbl.string == "test"
assert_equal(lbl.offset, [10, 20])
def test_label_transform() -> None:
lbl = Label("test", offset=(10, 0))
# Rotate 90 deg CCW around (0,0)
lbl.rotate_around((0, 0), pi / 2)
assert_allclose(lbl.offset, [0, 10], atol=1e-10)
# Translate
lbl.translate((5, 5))
assert_allclose(lbl.offset, [5, 15], atol=1e-10)
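The transform checked above is the standard CCW rotation about a pivot; as a minimal standalone sketch (the `rotate_ccw` helper is hypothetical, not the masque implementation):

```python
import math

def rotate_ccw(point, pivot, theta):
    # Standard 2x2 CCW rotation applied about an arbitrary pivot point.
    px, py = point[0] - pivot[0], point[1] - pivot[1]
    c, s = math.cos(theta), math.sin(theta)
    return (pivot[0] + c * px - s * py,
            pivot[1] + s * px + c * py)

# (10, 0) rotated 90 deg CCW about the origin lands at approximately (0, 10)
x, y = rotate_ccw((10, 0), (0, 0), math.pi / 2)
```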
def test_label_repetition() -> None:
rep = Grid(a_vector=(10, 0), a_count=3)
lbl = Label("rep", offset=(0, 0), repetition=rep)
assert lbl.repetition is rep
assert_equal(lbl.get_bounds_single(), [[0, 0], [0, 0]])
# Note: get_bounds_single() does not include repetition bounds;
# repetition is expanded during pattern-level bounds computation.
def test_label_copy() -> None:
l1 = Label("test", offset=(1, 2), annotations={"a": [1]})
l2 = copy.deepcopy(l1)
assert l1 == l2
assert l1 is not l2
l2.offset[0] = 100
assert l1.offset[0] == 1
def test_label_eq_unrelated_objects_is_false() -> None:
lbl = Label("test")
assert not (lbl == None)
assert not (lbl == object())

masque/test/test_library.py Normal file

@@ -0,0 +1,483 @@
import pytest
from typing import cast, TYPE_CHECKING
from numpy.testing import assert_allclose
from ..library import Library, LazyLibrary
from ..pattern import Pattern
from ..error import LibraryError, PatternError
from ..ports import Port
from ..repetition import Grid
from ..shapes import Arc, Ellipse, Path, Text
from ..file.utils import preflight
if TYPE_CHECKING:
from ..shapes import Polygon
def test_library_basic() -> None:
lib = Library()
pat = Pattern()
lib["cell1"] = pat
assert "cell1" in lib
assert lib["cell1"] is pat
assert len(lib) == 1
with pytest.raises(LibraryError):
lib["cell1"] = Pattern() # Overwriting not allowed
def test_library_tops() -> None:
lib = Library()
lib["child"] = Pattern()
lib["parent"] = Pattern()
lib["parent"].ref("child")
assert set(lib.tops()) == {"parent"}
assert lib.top() == "parent"
def test_library_dangling() -> None:
lib = Library()
lib["parent"] = Pattern()
lib["parent"].ref("missing")
assert lib.dangling_refs() == {"missing"}
def test_library_dangling_graph_modes() -> None:
lib = Library()
lib["parent"] = Pattern()
lib["parent"].ref("missing")
with pytest.raises(LibraryError, match="Dangling refs found"):
lib.child_graph()
with pytest.raises(LibraryError, match="Dangling refs found"):
lib.parent_graph()
with pytest.raises(LibraryError, match="Dangling refs found"):
lib.child_order()
assert lib.child_graph(dangling="ignore") == {"parent": set()}
assert lib.parent_graph(dangling="ignore") == {"parent": set()}
assert lib.child_order(dangling="ignore") == ["parent"]
assert lib.child_graph(dangling="include") == {"parent": {"missing"}, "missing": set()}
assert lib.parent_graph(dangling="include") == {"parent": set(), "missing": {"parent"}}
assert lib.child_order(dangling="include") == ["missing", "parent"]
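The dangling-ref bookkeeping exercised above reduces to a set difference over the child graph; a toy sketch (the `dangling_refs` helper is hypothetical, not the library API):

```python
def dangling_refs(child_graph):
    # Names referenced by some pattern but never defined in the library.
    referenced = set()
    for children in child_graph.values():
        referenced |= set(children)
    return referenced - set(child_graph)

missing = dangling_refs({'parent': {'missing'}})
```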
def test_find_refs_with_dangling_modes() -> None:
lib = Library()
lib["target"] = Pattern()
mid = Pattern()
mid.ref("target", offset=(2, 0))
lib["mid"] = mid
top = Pattern()
top.ref("mid", offset=(5, 0))
top.ref("missing", offset=(9, 0))
lib["top"] = top
assert lib.find_refs_local("missing", dangling="ignore") == {}
assert lib.find_refs_global("missing", dangling="ignore") == {}
local_missing = lib.find_refs_local("missing", dangling="include")
assert set(local_missing) == {"top"}
assert_allclose(local_missing["top"][0], [[9, 0, 0, 0, 1]])
global_missing = lib.find_refs_global("missing", dangling="include")
assert_allclose(global_missing[("top", "missing")], [[9, 0, 0, 0, 1]])
with pytest.raises(LibraryError, match="missing"):
lib.find_refs_local("missing")
with pytest.raises(LibraryError, match="missing"):
lib.find_refs_global("missing")
global_target = lib.find_refs_global("target")
assert_allclose(global_target[("top", "mid", "target")], [[7, 0, 0, 0, 1]])
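The global offset asserted above (top at (5, 0) referencing mid, which references target at (2, 0), giving (7, 0)) comes from composing the ref transforms down the hierarchy; a translation-only sketch (hypothetical helper, ignoring rotation/scale/mirror):

```python
def compose_translations(*offsets):
    # Compose translation-only ref transforms by summing offsets.
    return (sum(o[0] for o in offsets), sum(o[1] for o in offsets))

# top -> mid at (5, 0), then mid -> target at (2, 0)
global_offset = compose_translations((5, 0), (2, 0))
```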
def test_preflight_prune_empty_preserves_dangling_policy(caplog: pytest.LogCaptureFixture) -> None:
def make_lib() -> Library:
lib = Library()
lib["empty"] = Pattern()
lib["top"] = Pattern()
lib["top"].ref("missing")
return lib
caplog.set_level("WARNING")
warned = preflight(make_lib(), allow_dangling_refs=None, prune_empty_patterns=True)
assert "empty" not in warned
assert any("Dangling refs found" in record.message for record in caplog.records)
allowed = preflight(make_lib(), allow_dangling_refs=True, prune_empty_patterns=True)
assert "empty" not in allowed
with pytest.raises(LibraryError, match="Dangling refs found"):
preflight(make_lib(), allow_dangling_refs=False, prune_empty_patterns=True)
def test_library_flatten() -> None:
lib = Library()
child = Pattern()
child.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]])
lib["child"] = child
parent = Pattern()
parent.ref("child", offset=(10, 10))
lib["parent"] = parent
flat_lib = lib.flatten("parent")
flat_parent = flat_lib["parent"]
assert not flat_parent.has_refs()
assert len(flat_parent.shapes[(1, 0)]) == 1
# Transformations are baked into vertices for Polygon
flat_vertices = cast("Polygon", flat_parent.shapes[(1, 0)][0]).vertices
assert tuple(flat_vertices[0]) == (10.0, 10.0)
def test_library_flatten_preserves_ports_only_child() -> None:
lib = Library()
child = Pattern(ports={"P1": Port((1, 2), 0)})
lib["child"] = child
parent = Pattern()
parent.ref("child", offset=(10, 10))
lib["parent"] = parent
flat_parent = lib.flatten("parent", flatten_ports=True)["parent"]
assert set(flat_parent.ports) == {"P1"}
assert cast("Port", flat_parent.ports["P1"]).rotation == 0
assert tuple(flat_parent.ports["P1"].offset) == (11.0, 12.0)
def test_library_flatten_repeated_ref_with_ports_raises() -> None:
lib = Library()
child = Pattern(ports={"P1": Port((1, 2), 0)})
child.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]])
lib["child"] = child
parent = Pattern()
parent.ref("child", repetition=Grid(a_vector=(10, 0), a_count=2))
lib["parent"] = parent
with pytest.raises(PatternError, match='Cannot flatten ports from repeated ref'):
lib.flatten("parent", flatten_ports=True)
def test_library_flatten_dangling_ok_nested_preserves_dangling_refs() -> None:
lib = Library()
child = Pattern()
child.ref("missing")
lib["child"] = child
parent = Pattern()
parent.ref("child")
lib["parent"] = parent
flat = lib.flatten("parent", dangling_ok=True)
assert set(flat["child"].refs) == {"missing"}
assert flat["child"].has_refs()
assert set(flat["parent"].refs) == {"missing"}
assert flat["parent"].has_refs()
def test_lazy_library() -> None:
lib = LazyLibrary()
called = 0
def make_pat() -> Pattern:
nonlocal called
called += 1
return Pattern()
lib["lazy"] = make_pat
assert called == 0
pat = lib["lazy"]
assert called == 1
assert isinstance(pat, Pattern)
# Second access should be cached
pat2 = lib["lazy"]
assert called == 1
assert pat is pat2
def test_library_rename() -> None:
lib = Library()
lib["old"] = Pattern()
lib["parent"] = Pattern()
lib["parent"].ref("old")
lib.rename("old", "new", move_references=True)
assert "old" not in lib
assert "new" in lib
assert "new" in lib["parent"].refs
assert "old" not in lib["parent"].refs
@pytest.mark.parametrize("library_cls", (Library, LazyLibrary))
def test_library_rename_self_is_noop(library_cls: type[Library] | type[LazyLibrary]) -> None:
lib = library_cls()
lib["top"] = Pattern()
lib["parent"] = Pattern()
lib["parent"].ref("top")
lib.rename("top", "top", move_references=True)
assert set(lib.keys()) == {"top", "parent"}
assert "top" in lib["parent"].refs
assert len(lib["parent"].refs["top"]) == 1
@pytest.mark.parametrize("library_cls", (Library, LazyLibrary))
def test_library_rename_top_self_is_noop(library_cls: type[Library] | type[LazyLibrary]) -> None:
lib = library_cls()
lib["top"] = Pattern()
lib.rename_top("top")
assert list(lib.keys()) == ["top"]
@pytest.mark.parametrize("library_cls", (Library, LazyLibrary))
def test_library_rename_missing_raises_library_error(library_cls: type[Library] | type[LazyLibrary]) -> None:
lib = library_cls()
lib["top"] = Pattern()
with pytest.raises(LibraryError, match="does not exist"):
lib.rename("missing", "new")
@pytest.mark.parametrize("library_cls", (Library, LazyLibrary))
def test_library_move_references_same_target_is_noop(library_cls: type[Library] | type[LazyLibrary]) -> None:
lib = library_cls()
lib["top"] = Pattern()
lib["parent"] = Pattern()
lib["parent"].ref("top")
lib.move_references("top", "top")
assert "top" in lib["parent"].refs
assert len(lib["parent"].refs["top"]) == 1
def test_library_dfs_can_replace_existing_patterns() -> None:
lib = Library()
child = Pattern()
lib["child"] = child
top = Pattern()
top.ref("child")
lib["top"] = top
replacement_top = Pattern(ports={"T": Port((1, 2), 0)})
replacement_child = Pattern(ports={"C": Port((3, 4), 0)})
def visit_after(pattern: Pattern, hierarchy: tuple[str | None, ...], **kwargs) -> Pattern: # noqa: ARG001
if hierarchy[-1] == "child":
return replacement_child
if hierarchy[-1] == "top":
return replacement_top
return pattern
lib.dfs(lib["top"], visit_after=visit_after, hierarchy=("top",), transform=True)
assert lib["top"] is replacement_top
assert lib["child"] is replacement_child
def test_lazy_library_dfs_can_replace_existing_patterns() -> None:
lib = LazyLibrary()
lib["child"] = lambda: Pattern()
lib["top"] = lambda: Pattern(refs={"child": []})
top = lib["top"]
top.ref("child")
replacement_top = Pattern(ports={"T": Port((1, 2), 0)})
replacement_child = Pattern(ports={"C": Port((3, 4), 0)})
def visit_after(pattern: Pattern, hierarchy: tuple[str | None, ...], **kwargs) -> Pattern: # noqa: ARG001
if hierarchy[-1] == "child":
return replacement_child
if hierarchy[-1] == "top":
return replacement_top
return pattern
lib.dfs(top, visit_after=visit_after, hierarchy=("top",), transform=True)
assert lib["top"] is replacement_top
assert lib["child"] is replacement_child
def test_library_add_no_duplicates_respects_mutate_other_false() -> None:
src_pat = Pattern(ports={"A": Port((0, 0), 0)})
lib = Library({"a": Pattern()})
lib.add({"b": src_pat}, mutate_other=False)
assert lib["b"] is not src_pat
lib["b"].ports["A"].offset[0] = 123
assert tuple(src_pat.ports["A"].offset) == (0.0, 0.0)
def test_library_add_returns_only_renamed_entries() -> None:
lib = Library({"a": Pattern(), "_shape": Pattern()})
assert lib.add({"b": Pattern(), "c": Pattern()}, mutate_other=False) == {}
rename_map = lib.add({"_shape": Pattern(), "keep": Pattern()}, mutate_other=False)
assert set(rename_map) == {"_shape"}
assert rename_map["_shape"] != "_shape"
assert "keep" not in rename_map
def test_library_subtree() -> None:
lib = Library()
lib["a"] = Pattern()
lib["b"] = Pattern()
lib["c"] = Pattern()
lib["a"].ref("b")
sub = lib.subtree("a")
assert "a" in sub
assert "b" in sub
assert "c" not in sub
def test_library_child_order_cycle_raises_library_error() -> None:
lib = Library()
lib["a"] = Pattern()
lib["a"].ref("b")
lib["b"] = Pattern()
lib["b"].ref("a")
with pytest.raises(LibraryError, match="Cycle found while building child order"):
lib.child_order()
def test_library_find_refs_global_cycle_raises_library_error() -> None:
lib = Library()
lib["a"] = Pattern()
lib["a"].ref("a")
with pytest.raises(LibraryError, match="Cycle found while building child order"):
lib.find_refs_global("a")
def test_library_get_name() -> None:
lib = Library()
lib["cell"] = Pattern()
name1 = lib.get_name("cell")
assert name1 != "cell"
assert name1.startswith("cell")
name2 = lib.get_name("other")
assert name2 == "other"
def test_library_dedup_shapes_does_not_merge_custom_capped_paths() -> None:
lib = Library()
pat = Pattern()
pat.shapes[(1, 0)] += [
Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(1, 2)),
Path(vertices=[[20, 0], [30, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(3, 4)),
]
lib["top"] = pat
lib.dedup(norm_value=1, threshold=2)
assert not lib["top"].refs
assert len(lib["top"].shapes[(1, 0)]) == 2
def test_library_dedup_text_preserves_scale_and_mirror_flag() -> None:
lib = Library()
pat = Pattern()
pat.shapes[(1, 0)] += [
Text("A", 10, "dummy.ttf", offset=(0, 0)),
Text("A", 10, "dummy.ttf", offset=(100, 0)),
]
lib["top"] = pat
lib.dedup(exclude_types=(), norm_value=5, threshold=2)
target_name = next(iter(lib["top"].refs))
refs = lib["top"].refs[target_name]
assert [ref.mirrored for ref in refs] == [False, False]
assert [ref.scale for ref in refs] == [2.0, 2.0]
assert cast("Text", lib[target_name].shapes[(1, 0)][0]).height == 5
flat = lib.flatten("top")["top"]
assert [cast("Text", shape).height for shape in flat.shapes[(1, 0)]] == [10, 10]
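The scale bookkeeping checked above can be illustrated with simple arithmetic: dedup stores the shape once at `norm_value`, and each ref carries the scale that restores the original size (a sketch of the invariant, not the library code):

```python
norm_value = 5
heights = [10, 10]  # original Text heights
# Each duplicate is replaced by a ref whose scale restores the original size.
scales = [h / norm_value for h in heights]
restored = [norm_value * s for s in scales]
```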
def test_library_dedup_handles_arc_and_ellipse_labels() -> None:
lib = Library()
pat = Pattern()
pat.shapes[(1, 0)] += [
Arc(radii=(10, 20), angles=(0, 1), width=2, offset=(0, 0)),
Arc(radii=(10, 20), angles=(0, 1), width=2, offset=(50, 0)),
]
pat.shapes[(2, 0)] += [
Ellipse(radii=(10, 20), offset=(0, 0)),
Ellipse(radii=(10, 20), offset=(50, 0)),
]
lib["top"] = pat
lib.dedup(exclude_types=(), norm_value=1, threshold=2)
assert len(lib["top"].refs) == 2
assert lib["top"].shapes[(1, 0)] == []
assert lib["top"].shapes[(2, 0)] == []
flat = lib.flatten("top")["top"]
assert sum(isinstance(shape, Arc) for shape in flat.shapes[(1, 0)]) == 2
assert sum(isinstance(shape, Ellipse) for shape in flat.shapes[(2, 0)]) == 2
def test_library_dedup_handles_multiple_duplicate_groups() -> None:
from ..shapes import Circle
lib = Library()
pat = Pattern()
pat.shapes[(1, 0)] += [Circle(radius=1, offset=(0, 0)), Circle(radius=1, offset=(10, 0))]
pat.shapes[(2, 0)] += [Path(vertices=[[0, 0], [5, 0]], width=2), Path(vertices=[[10, 0], [15, 0]], width=2)]
lib["top"] = pat
lib.dedup(exclude_types=(), norm_value=1, threshold=2)
assert len(lib["top"].refs) == 2
assert all(len(refs) == 2 for refs in lib["top"].refs.values())
assert len(lib["top"].shapes[(1, 0)]) == 0
assert len(lib["top"].shapes[(2, 0)]) == 0
def test_library_dedup_uses_stable_target_names_per_label() -> None:
from ..shapes import Circle
lib = Library()
p1 = Pattern()
p1.shapes[(1, 0)] += [Circle(radius=1, offset=(0, 0)), Circle(radius=1, offset=(10, 0))]
lib["p1"] = p1
p2 = Pattern()
p2.shapes[(2, 0)] += [Path(vertices=[[0, 0], [5, 0]], width=2), Path(vertices=[[10, 0], [15, 0]], width=2)]
lib["p2"] = p2
lib.dedup(exclude_types=(), norm_value=1, threshold=2)
circle_target = next(iter(lib["p1"].refs))
path_target = next(iter(lib["p2"].refs))
assert circle_target != path_target
assert all(isinstance(shape, Circle) for shapes in lib[circle_target].shapes.values() for shape in shapes)
assert all(isinstance(shape, Path) for shapes in lib[path_target].shapes.values() for shape in shapes)

masque/test/test_oasis.py Normal file

@@ -0,0 +1,60 @@
import io
from pathlib import Path
import pytest
from numpy.testing import assert_equal
from ..error import PatternError
from ..pattern import Pattern
from ..library import Library
from ..shapes import Path as MPath
def test_oasis_roundtrip(tmp_path: Path) -> None:
# Skip if fatamorgana is not installed
pytest.importorskip("fatamorgana")
from ..file import oasis
lib = Library()
pat1 = Pattern()
pat1.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 10], [0, 10]])
lib["cell1"] = pat1
oas_file = tmp_path / "test.oas"
# OASIS needs units_per_micron
oasis.writefile(lib, oas_file, units_per_micron=1000)
read_lib, info = oasis.readfile(oas_file)
assert "cell1" in read_lib
# Check bounds
assert_equal(read_lib["cell1"].get_bounds(), [[0, 0], [10, 10]])
def test_oasis_properties_to_annotations_merges_repeated_keys() -> None:
pytest.importorskip("fatamorgana")
import fatamorgana.records as fatrec
from ..file.oasis import properties_to_annotations
annotations = properties_to_annotations(
[
fatrec.Property("k", [1], is_standard=False),
fatrec.Property("k", [2, 3], is_standard=False),
],
{},
{},
)
assert annotations == {"k": [1, 2, 3]}
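The merge behaviour asserted above (repeated property keys concatenating their value lists) can be sketched with a plain dict (hypothetical `merge_properties` helper, not the masque function signature):

```python
def merge_properties(props):
    # Merge (key, values) pairs, concatenating values for repeated keys.
    merged = {}
    for key, values in props:
        merged.setdefault(key, []).extend(values)
    return merged

merged = merge_properties([("k", [1]), ("k", [2, 3])])
```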
def test_oasis_write_rejects_circle_path_caps() -> None:
pytest.importorskip("fatamorgana")
from ..file import oasis
lib = Library()
pat = Pattern()
pat.path((1, 0), vertices=[[0, 0], [10, 0]], width=2, cap=MPath.Cap.Circle)
lib["cell1"] = pat
with pytest.raises(PatternError, match="does not support path cap"):
oasis.write(lib, io.BytesIO(), units_per_micron=1000)


@@ -0,0 +1,96 @@
from ..utils.pack2d import maxrects_bssf, guillotine_bssf_sas, pack_patterns
from ..library import Library
from ..pattern import Pattern
def test_maxrects_bssf_simple() -> None:
# Pack two 10x10 squares into one 20x10 container
rects = [[10, 10], [10, 10]]
containers = [[0, 0, 20, 10]]
locs, rejects = maxrects_bssf(rects, containers)
assert not rejects
# They should be at (0,0) and (10,0)
assert {tuple(loc) for loc in locs} == {(0.0, 0.0), (10.0, 0.0)}
def test_maxrects_bssf_reject() -> None:
# Try to pack a too-large rectangle
rects = [[10, 10], [30, 30]]
containers = [[0, 0, 20, 20]]
locs, rejects = maxrects_bssf(rects, containers, allow_rejects=True)
assert 1 in rejects # Second rect rejected
assert 0 not in rejects
def test_maxrects_bssf_exact_fill_rejects_remaining() -> None:
rects = [[20, 20], [1, 1]]
containers = [[0, 0, 20, 20]]
locs, rejects = maxrects_bssf(rects, containers, presort=False, allow_rejects=True)
assert tuple(locs[0]) == (0.0, 0.0)
assert rejects == {1}
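The placement and reject behaviour above can be mimicked by a toy first-fit shelf packer (a sketch only; not the `maxrects_bssf` algorithm, which splits free rectangles rather than packing along a shelf):

```python
def shelf_pack(rects, container_w, container_h):
    # Place rects left-to-right along y=0; reject anything that doesn't fit.
    locs, rejects, x = [], set(), 0.0
    for i, (w, h) in enumerate(rects):
        if x + w <= container_w and h <= container_h:
            locs.append((x, 0.0))
            x += w
        else:
            locs.append(None)
            rejects.add(i)
    return locs, rejects

locs, rejects = shelf_pack([[10, 10], [10, 10]], 20, 10)
```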
def test_maxrects_bssf_presort_reject_mapping() -> None:
rects = [[10, 12], [19, 14], [13, 11]]
containers = [[0, 0, 20, 20]]
_locs, rejects = maxrects_bssf(rects, containers, presort=True, allow_rejects=True)
assert rejects == {0, 2}
def test_guillotine_bssf_sas_presort_reject_mapping() -> None:
rects = [[2, 1], [17, 15], [16, 11]]
containers = [[0, 0, 20, 20]]
_locs, rejects = guillotine_bssf_sas(rects, containers, presort=True, allow_rejects=True)
assert rejects == {2}
def test_pack_patterns() -> None:
lib = Library()
p1 = Pattern()
p1.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 10], [0, 10]])
lib["p1"] = p1
p2 = Pattern()
p2.polygon((1, 0), vertices=[[0, 0], [5, 0], [5, 5], [0, 5]])
lib["p2"] = p2
# Containers: one 20x20
containers = [[0, 0, 20, 20]]
# 2um spacing
pat, rejects = pack_patterns(lib, ["p1", "p2"], containers, spacing=(2, 2))
assert not rejects
assert len(pat.refs) == 2
assert "p1" in pat.refs
assert "p2" in pat.refs
# With spacing, p1 (10x10) occupies 12x12 and p2 (5x5) occupies 7x7;
# both fit side by side in the 20x20 container without overlapping.
def test_pack_patterns_reject_names_match_original_patterns() -> None:
lib = Library()
for name, (lx, ly) in {
"p0": (10, 12),
"p1": (19, 14),
"p2": (13, 11),
}.items():
pat = Pattern()
pat.rect((1, 0), xmin=0, xmax=lx, ymin=0, ymax=ly)
lib[name] = pat
pat, rejects = pack_patterns(lib, ["p0", "p1", "p2"], [[0, 0, 20, 20]], spacing=(0, 0))
assert set(rejects) == {"p0", "p2"}
assert set(pat.refs) == {"p1"}

masque/test/test_path.py Normal file

@@ -0,0 +1,111 @@
from numpy.testing import assert_equal, assert_allclose
from ..shapes import Path
def test_path_init() -> None:
p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.Flush)
assert_equal(p.vertices, [[0, 0], [10, 0]])
assert p.width == 2
assert p.cap == Path.Cap.Flush
def test_path_to_polygons_flush() -> None:
p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.Flush)
polys = p.to_polygons()
assert len(polys) == 1
# Rectangle from (0, -1) to (10, 1)
bounds = polys[0].get_bounds_single()
assert_equal(bounds, [[0, -1], [10, 1]])
def test_path_to_polygons_square() -> None:
p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.Square)
polys = p.to_polygons()
assert len(polys) == 1
# Square cap adds width/2 = 1 to each end
# Rectangle from (-1, -1) to (11, 1)
bounds = polys[0].get_bounds_single()
assert_equal(bounds, [[-1, -1], [11, 1]])
def test_path_to_polygons_circle() -> None:
p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.Circle)
polys = p.to_polygons(num_vertices=32)
# Path.to_polygons for Circle cap returns 1 polygon for the path + polygons for the caps
assert len(polys) >= 3
# Combined bounds span (-1, -1) to (11, 1); get_bounds_single() reports this directly.
bounds = p.get_bounds_single()
assert_equal(bounds, [[-1, -1], [11, 1]])
def test_path_custom_cap() -> None:
p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(5, 10))
polys = p.to_polygons()
assert len(polys) == 1
# Extends 5 units at start, 10 at end
# Starts at -5, ends at 20
bounds = polys[0].get_bounds_single()
assert_equal(bounds, [[-5, -1], [20, 1]])
def test_path_bend() -> None:
# L-shaped path
p = Path(vertices=[[0, 0], [10, 0], [10, 10]], width=2)
polys = p.to_polygons()
assert len(polys) == 1
bounds = polys[0].get_bounds_single()
# Segments: (0,0)-(10,0) and (10,0)-(10,10), each of width 2.
# Horizontal segment rectangle: x in [0, 10], y in [-1, 1] (flush caps).
# Vertical segment rectangle: x in [9, 11], y in [0, 10].
# Union of the two rectangles: x in [0, 11], y in [-1, 10].
assert_equal(bounds, [[0, -1], [11, 10]])
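The bound asserted above is just the union of the two segment rectangles; a quick standalone check (hypothetical `union_bounds` helper):

```python
def union_bounds(rectangles):
    # Axis-aligned bounding box of (xmin, ymin, xmax, ymax) rectangles.
    xmins, ymins, xmaxs, ymaxs = zip(*rectangles)
    return [min(xmins), min(ymins)], [max(xmaxs), max(ymaxs)]

# Horizontal segment rectangle, then vertical segment rectangle
lo, hi = union_bounds([(0, -1, 10, 1), (9, 0, 11, 10)])
```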
def test_path_mirror() -> None:
p = Path(vertices=[[10, 5], [20, 10]], width=2)
p.mirror(0) # Mirror across x axis (y -> -y)
assert_equal(p.vertices, [[10, -5], [20, -10]])
def test_path_scale() -> None:
p = Path(vertices=[[0, 0], [10, 0]], width=2)
p.scale_by(2)
assert_equal(p.vertices, [[0, 0], [20, 0]])
assert p.width == 4
def test_path_scale_custom_cap_extensions() -> None:
p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(1, 2))
p.scale_by(3)
assert_equal(p.vertices, [[0, 0], [30, 0]])
assert p.width == 6
assert p.cap_extensions is not None
assert_allclose(p.cap_extensions, [3, 6])
assert_equal(p.to_polygons()[0].get_bounds_single(), [[-3, -3], [36, 3]])
def test_path_normalized_form_preserves_width_and_custom_cap_extensions() -> None:
p = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(1, 2))
intrinsic, _extrinsic, ctor = p.normalized_form(5)
q = ctor()
assert intrinsic[-1] == (0.2, 0.4)
assert q.width == 2
assert q.cap_extensions is not None
assert_allclose(q.cap_extensions, [1, 2])
def test_path_normalized_form_distinguishes_custom_caps() -> None:
p1 = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(1, 2))
p2 = Path(vertices=[[0, 0], [10, 0]], width=2, cap=Path.Cap.SquareCustom, cap_extensions=(3, 4))
assert p1.normalized_form(1)[0] != p2.normalized_form(1)[0]

masque/test/test_pather.py Normal file

@@ -0,0 +1,108 @@
import pytest
from numpy.testing import assert_equal, assert_allclose
from numpy import pi
from ..builder import Pather
from ..builder.tools import PathTool
from ..library import Library
from ..ports import Port
@pytest.fixture
def pather_setup() -> tuple[Pather, PathTool, Library]:
lib = Library()
# Simple PathTool: 2um width on layer (1,0)
tool = PathTool(layer=(1, 0), width=2, ptype="wire")
p = Pather(lib, tools=tool)
# Add an initial port facing North (pi/2)
# Port rotation points INTO device. So "North" rotation means device is North of port.
# Pathing "forward" moves South.
p.ports["start"] = Port((0, 0), pi / 2, ptype="wire")
return p, tool, lib
def test_pather_straight(pather_setup: tuple[Pather, PathTool, Library]) -> None:
p, tool, lib = pather_setup
# Route 10um "forward"
p.straight("start", 10)
# port rot pi/2 (North). Travel +pi relative to port -> South.
assert_allclose(p.ports["start"].offset, [0, -10], atol=1e-10)
assert p.ports["start"].rotation is not None
assert_allclose(p.ports["start"].rotation, pi / 2, atol=1e-10)
def test_pather_bend(pather_setup: tuple[Pather, PathTool, Library]) -> None:
p, tool, lib = pather_setup
# Start (0,0) rot pi/2 (North).
# Path 10um "forward" (South), then turn Clockwise (ccw=False).
# Facing South, turn Right -> West.
p.cw("start", 10)
# PathTool.planL(ccw=False, length=10) returns out_port at (10, -1) relative to (0,0) rot 0.
# Transformed by port rot pi/2 (North) + pi (to move "forward" away from device):
# Transformation rot = pi/2 + pi = 3pi/2.
# (10, -1) rotated 3pi/2: (x,y) -> (y, -x) -> (-1, -10).
assert_allclose(p.ports["start"].offset, [-1, -10], atol=1e-10)
# Naively, North (pi/2) + 90 deg CW would give West (pi), but port rotation
# points back INTO the path, so the resulting rotation is 0 (East).
assert p.ports["start"].rotation is not None
assert_allclose(p.ports["start"].rotation, 0, atol=1e-10)
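The 3*pi/2 rotation invoked in the comment above maps (x, y) to (y, -x); a quick numeric check of that identity for the point (10, -1):

```python
import math

theta = 3 * math.pi / 2
c, s = math.cos(theta), math.sin(theta)
x, y = 10, -1
# (x, y) -> (y, -x) under a 3*pi/2 CCW rotation
rotated = (c * x - s * y, s * x + c * y)
```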
def test_pather_path_to(pather_setup: tuple[Pather, PathTool, Library]) -> None:
p, tool, lib = pather_setup
# start at (0,0) rot pi/2 (North)
# path "forward" (South) to y=-50
p.straight("start", y=-50)
assert_equal(p.ports["start"].offset, [0, -50])
def test_pather_mpath(pather_setup: tuple[Pather, PathTool, Library]) -> None:
p, tool, lib = pather_setup
p.ports["A"] = Port((0, 0), pi / 2, ptype="wire")
p.ports["B"] = Port((10, 0), pi / 2, ptype="wire")
# Path both "forward" (South) to y=-20
p.straight(["A", "B"], ymin=-20)
assert_equal(p.ports["A"].offset, [0, -20])
assert_equal(p.ports["B"].offset, [10, -20])
def test_pather_at_chaining(pather_setup: tuple[Pather, PathTool, Library]) -> None:
p, tool, lib = pather_setup
# Fluent API test
p.at("start").straight(10).ccw(10)
# 10um South -> (0, -10) rot pi/2
# then 10um South and turn CCW (Facing South, CCW is East)
# PathTool.planL(ccw=True, length=10) -> out_port=(10, 1) rot -pi/2 relative to rot 0
# Transform (10, 1) by 3pi/2: (x,y) -> (y, -x) -> (1, -10)
# (0, -10) + (1, -10) = (1, -20)
assert_allclose(p.ports["start"].offset, [1, -20], atol=1e-10)
# After the CCW turn the path heads East; the port points back into the path,
# so the resulting rotation is pi (West).
assert p.ports["start"].rotation is not None
assert_allclose(p.ports["start"].rotation, pi, atol=1e-10)
def test_pather_dead_ports() -> None:
lib = Library()
tool = PathTool(layer=(1, 0), width=1)
p = Pather(lib, ports={"in": Port((0, 0), 0)}, tools=tool)
p.set_dead()
# Path with negative length (impossible for PathTool, would normally raise BuildError)
p.straight("in", -10)
# Port 'in' should be updated by dummy extension despite tool failure
# port_rot=0, forward is -x. path(-10) means moving -10 in -x direction -> +10 in x.
assert_allclose(p.ports["in"].offset, [10, 0], atol=1e-10)
# Downstream path should work correctly using the dummy port location
p.straight("in", 20)
# 10 + (-20) = -10
assert_allclose(p.ports["in"].offset, [-10, 0], atol=1e-10)
# Verify no geometry
assert not p.pattern.has_shapes()


@@ -0,0 +1,936 @@
from typing import Any
import pytest
import numpy
from numpy import pi
from masque import Pather, Library, Pattern, Port
from masque.builder.tools import PathTool, Tool
from masque.error import BuildError, PortError, PatternError
def test_pather_trace_basic() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
p = Pather(lib, tools=tool, auto_render=False)
# Port rotation 0 points in +x (INTO device).
# To extend it, we move in -x direction.
p.pattern.ports['A'] = Port((0, 0), rotation=0)
# Trace single port
p.at('A').trace(None, 5000)
assert numpy.allclose(p.pattern.ports['A'].offset, (-5000, 0))
# Trace with bend
p.at('A').trace(True, 5000) # CCW bend
# Port was at (-5000, 0) rot 0.
# New wire starts at (-5000, 0) rot 0.
# Output port of wire before rotation: (5000, 500) rot -pi/2
# Rotate by pi (since dev port rot is 0 and tool port rot is 0):
# (-5000, -500) rot pi - pi/2 = pi/2
# Add to start: (-10000, -500) rot pi/2
assert numpy.allclose(p.pattern.ports['A'].offset, (-10000, -500))
assert p.pattern.ports['A'].rotation is not None
assert numpy.isclose(p.pattern.ports['A'].rotation, pi/2)
def test_pather_trace_to() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
p = Pather(lib, tools=tool, auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0)
# Trace to x=-10000
p.at('A').trace_to(None, x=-10000)
assert numpy.allclose(p.pattern.ports['A'].offset, (-10000, 0))
# Trace to position=-20000
p.at('A').trace_to(None, p=-20000)
assert numpy.allclose(p.pattern.ports['A'].offset, (-20000, 0))
def test_pather_bundle_trace() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
p = Pather(lib, tools=tool, auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0)
p.pattern.ports['B'] = Port((0, 2000), rotation=0)
# Straight bundle - all should align to same x
p.at(['A', 'B']).straight(xmin=-10000)
assert numpy.isclose(p.pattern.ports['A'].offset[0], -10000)
assert numpy.isclose(p.pattern.ports['B'].offset[0], -10000)
# Bundle with bend
p.at(['A', 'B']).ccw(xmin=-20000, spacing=2000)
# Traveling in -x direction. CCW turn turns towards -y.
# A is at y=0, B is at y=2000.
# Rotation center is at y = -R.
# A is closer to center than B. So A is inner, B is outer.
# xmin is coordinate of innermost bend (A).
assert numpy.isclose(p.pattern.ports['A'].offset[0], -20000)
# B's bend is further out (more negative x)
assert numpy.isclose(p.pattern.ports['B'].offset[0], -22000)
def test_pather_each_bound() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
p = Pather(lib, tools=tool, auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0)
p.pattern.ports['B'] = Port((-1000, 2000), rotation=0)
# Each should move by 5000 (towards -x)
p.at(['A', 'B']).trace(None, each=5000)
assert numpy.allclose(p.pattern.ports['A'].offset, (-5000, 0))
assert numpy.allclose(p.pattern.ports['B'].offset, (-6000, 2000))
def test_selection_management() -> None:
lib = Library()
p = Pather(lib)
p.pattern.ports['A'] = Port((0, 0), rotation=0)
p.pattern.ports['B'] = Port((0, 0), rotation=0)
pp = p.at('A')
assert pp.ports == ['A']
pp.select('B')
assert pp.ports == ['A', 'B']
pp.deselect('A')
assert pp.ports == ['B']
pp.select(['A'])
assert pp.ports == ['B', 'A']
pp.drop()
assert 'A' not in p.pattern.ports
assert 'B' not in p.pattern.ports
assert pp.ports == []
def test_mark_fork() -> None:
lib = Library()
p = Pather(lib)
p.pattern.ports['A'] = Port((100, 200), rotation=1)
pp = p.at('A')
pp.mark('B')
assert 'B' in p.pattern.ports
assert numpy.allclose(p.pattern.ports['B'].offset, (100, 200))
assert p.pattern.ports['B'].rotation == 1
assert pp.ports == ['A'] # mark keeps current selection
pp.fork('C')
assert 'C' in p.pattern.ports
assert pp.ports == ['C'] # fork switches to new name
def test_mark_fork_reject_overwrite_and_duplicate_targets() -> None:
lib = Library()
p_mark = Pather(lib, pattern=Pattern(ports={
'A': Port((0, 0), rotation=0),
'C': Port((2, 0), rotation=0),
}))
with pytest.raises(PortError, match='overwrite existing ports'):
p_mark.at('A').mark('C')
assert numpy.allclose(p_mark.pattern.ports['C'].offset, (2, 0))
p_fork = Pather(lib, pattern=Pattern(ports={
'A': Port((0, 0), rotation=0),
'B': Port((1, 0), rotation=0),
}))
pp = p_fork.at(['A', 'B'])
with pytest.raises(PortError, match='targets would collide'):
pp.fork({'A': 'X', 'B': 'X'})
assert set(p_fork.pattern.ports) == {'A', 'B'}
assert pp.ports == ['A', 'B']
def test_mark_fork_dead_overwrite_and_duplicate_targets() -> None:
lib = Library()
p = Pather(lib, pattern=Pattern(ports={
'A': Port((0, 0), rotation=0),
'B': Port((1, 0), rotation=0),
'C': Port((2, 0), rotation=0),
}))
p.set_dead()
p.at('A').mark('C')
assert numpy.allclose(p.pattern.ports['C'].offset, (0, 0))
pp = p.at(['A', 'B'])
pp.fork({'A': 'X', 'B': 'X'})
assert numpy.allclose(p.pattern.ports['X'].offset, (1, 0))
assert pp.ports == ['X']
def test_mark_fork_reject_missing_sources() -> None:
lib = Library()
p = Pather(lib, pattern=Pattern(ports={
'A': Port((0, 0), rotation=0),
'B': Port((1, 0), rotation=0),
}))
with pytest.raises(PortError, match='selected ports'):
p.at(['A', 'B']).mark({'Z': 'C'})
with pytest.raises(PortError, match='selected ports'):
p.at(['A', 'B']).fork({'Z': 'C'})
def test_rename() -> None:
lib = Library()
p = Pather(lib)
p.pattern.ports['A'] = Port((0, 0), rotation=0)
p.at('A').rename('B')
assert 'A' not in p.pattern.ports
assert 'B' in p.pattern.ports
p.pattern.ports['C'] = Port((0, 0), rotation=0)
pp = p.at(['B', 'C'])
pp.rename({'B': 'D', 'C': 'E'})
assert 'B' not in p.pattern.ports
assert 'C' not in p.pattern.ports
assert 'D' in p.pattern.ports
assert 'E' in p.pattern.ports
assert set(pp.ports) == {'D', 'E'}
def test_renderpather_uturn_fallback() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
rp = Pather(lib, tools=tool, auto_render=False)
rp.pattern.ports['A'] = Port((0, 0), rotation=0)
# PathTool doesn't implement planU, so it should fall back to two planL calls
rp.at('A').uturn(offset=10000, length=5000)
# Two steps should be added
assert len(rp.paths['A']) == 2
assert rp.paths['A'][0].opcode == 'L'
assert rp.paths['A'][1].opcode == 'L'
rp.render()
assert rp.pattern.ports['A'].rotation is not None
assert numpy.isclose(rp.pattern.ports['A'].rotation, pi)
def test_autotool_uturn() -> None:
from masque.builder.tools import AutoTool
lib = Library()
# Set up an AutoTool with a simple straight and a bend
def make_straight(length: float) -> Pattern:
pat = Pattern()
pat.rect(layer='M1', xmin=0, xmax=length, yctr=0, ly=1000)
pat.ports['in'] = Port((0, 0), 0)
pat.ports['out'] = Port((length, 0), pi)
return pat
bend_pat = Pattern()
bend_pat.polygon(layer='M1', vertices=[(0, -500), (0, 500), (1000, -500)])
bend_pat.ports['in'] = Port((0, 0), 0)
bend_pat.ports['out'] = Port((500, -500), pi/2)
lib['bend'] = bend_pat
tool = AutoTool(
straights=[AutoTool.Straight(ptype='wire', fn=make_straight, in_port_name='in', out_port_name='out')],
bends=[AutoTool.Bend(abstract=lib.abstract('bend'), in_port_name='in', out_port_name='out', clockwise=True)],
sbends=[],
transitions={},
default_out_ptype='wire'
)
p = Pather(lib, tools=tool, auto_render=False)
p.pattern.ports['A'] = Port((0, 0), 0)
# CW U-turn (jog < 0)
# R = 500. jog = -2000. length = 1000.
# p0 = planL(length=1000) -> out at (1000, -500) rot pi/2
# R2 = 500.
# l2_length = abs(-2000) - abs(-500) - 500 = 1000.
p.at('A').uturn(offset=-2000, length=1000)
# Final port should be at (-1000, 2000) rot pi
# Start: (0,0) rot 0. Wire direction is rot + pi = pi (West, -x).
# Tool planU returns (length, jog) = (1000, -2000) relative to (0,0) rot 0.
# Rotation of pi transforms (1000, -2000) to (-1000, 2000).
# Final rotation: 0 + pi = pi.
assert numpy.allclose(p.pattern.ports['A'].offset, (-1000, 2000))
assert p.pattern.ports['A'].rotation is not None
assert numpy.isclose(p.pattern.ports['A'].rotation, pi)
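The coordinate transform walked through in the comments above can be checked independently of masque; a minimal standalone numpy sketch rotating the tool-frame result (length=1000, jog=-2000) by the wire direction (pi):

```python
import numpy

# Rotate the tool-frame planU displacement (length=1000, jog=-2000)
# by the wire direction (port rotation 0 + pi = pi) to get the
# global displacement of the final port.
theta = numpy.pi
rot = numpy.array([
    [numpy.cos(theta), -numpy.sin(theta)],
    [numpy.sin(theta), numpy.cos(theta)],
])
local = numpy.array([1000.0, -2000.0])
print(rot @ local)  # approximately [-1000, 2000], matching the assertion above
```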
def test_pather_trace_into() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
p = Pather(lib, tools=tool, auto_render=False)
# 1. Straight connector
p.pattern.ports['A'] = Port((0, 0), rotation=0)
p.pattern.ports['B'] = Port((-10000, 0), rotation=pi)
p.at('A').trace_into('B', plug_destination=False)
assert 'B' in p.pattern.ports
assert 'A' in p.pattern.ports
assert numpy.allclose(p.pattern.ports['A'].offset, (-10000, 0))
# 2. Single bend
p.pattern.ports['C'] = Port((0, 0), rotation=0)
p.pattern.ports['D'] = Port((-5000, 5000), rotation=pi/2)
p.at('C').trace_into('D', plug_destination=False)
assert 'D' in p.pattern.ports
assert 'C' in p.pattern.ports
assert numpy.allclose(p.pattern.ports['C'].offset, (-5000, 5000))
# 3. Jog (S-bend)
p.pattern.ports['E'] = Port((0, 0), rotation=0)
p.pattern.ports['F'] = Port((-10000, 2000), rotation=pi)
p.at('E').trace_into('F', plug_destination=False)
assert 'F' in p.pattern.ports
assert 'E' in p.pattern.ports
assert numpy.allclose(p.pattern.ports['E'].offset, (-10000, 2000))
# 4. U-bend (0 deg angle)
p.pattern.ports['G'] = Port((0, 0), rotation=0)
p.pattern.ports['H'] = Port((-10000, 2000), rotation=0)
p.at('G').trace_into('H', plug_destination=False)
assert 'H' in p.pattern.ports
assert 'G' in p.pattern.ports
# A U-bend with length=-travel=10000 and jog=-2000 from (0,0) rot 0
# ends up at (-10000, 2000) rot pi.
assert numpy.allclose(p.pattern.ports['G'].offset, (-10000, 2000))
assert p.pattern.ports['G'].rotation is not None
assert numpy.isclose(p.pattern.ports['G'].rotation, pi)
# 5. Vertical straight connector
p.pattern.ports['I'] = Port((0, 0), rotation=pi / 2)
p.pattern.ports['J'] = Port((0, -10000), rotation=3 * pi / 2)
p.at('I').trace_into('J', plug_destination=False)
assert 'J' in p.pattern.ports
assert 'I' in p.pattern.ports
assert numpy.allclose(p.pattern.ports['I'].offset, (0, -10000))
assert p.pattern.ports['I'].rotation is not None
assert numpy.isclose(p.pattern.ports['I'].rotation, pi / 2)
def test_pather_trace_into_dead_updates_ports_without_geometry() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000, ptype='wire')
p = Pather(lib, tools=tool, auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.pattern.ports['B'] = Port((-10000, 0), rotation=pi, ptype='wire')
p.set_dead()
p.trace_into('A', 'B', plug_destination=False)
assert set(p.pattern.ports) == {'A', 'B'}
assert numpy.allclose(p.pattern.ports['A'].offset, (-10000, 0))
assert p.pattern.ports['A'].rotation is not None
assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
assert len(p.paths['A']) == 0
assert not p.pattern.has_shapes()
assert not p.pattern.has_refs()
def test_pather_dead_fallback_preserves_out_ptype() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000, ptype='wire')
p = Pather(lib, tools=tool, auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.set_dead()
p.straight('A', -1000, out_ptype='other')
assert numpy.allclose(p.pattern.ports['A'].offset, (1000, 0))
assert p.pattern.ports['A'].ptype == 'other'
assert len(p.paths['A']) == 0
def test_pather_dead_place_overwrites_colliding_ports_last_wins() -> None:
lib = Library()
p = Pather(lib, pattern=Pattern(ports={
'A': Port((5, 5), rotation=0),
'keep': Port((9, 9), rotation=0),
}))
p.set_dead()
other = Pattern()
other.ports['X'] = Port((1, 0), rotation=0)
other.ports['Y'] = Port((2, 0), rotation=pi / 2)
p.place(other, port_map={'X': 'A', 'Y': 'A'})
assert set(p.pattern.ports) == {'A', 'keep'}
assert numpy.allclose(p.pattern.ports['A'].offset, (2, 0))
assert p.pattern.ports['A'].rotation is not None
assert numpy.isclose(p.pattern.ports['A'].rotation, pi / 2)
def test_pather_dead_plug_overwrites_colliding_outputs_last_wins() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000, ptype='wire')
p = Pather(lib, tools=tool, pattern=Pattern(ports={
'A': Port((0, 0), rotation=0, ptype='wire'),
'B': Port((99, 99), rotation=0, ptype='wire'),
}))
p.set_dead()
other = Pattern()
other.ports['in'] = Port((0, 0), rotation=pi, ptype='wire')
other.ports['X'] = Port((10, 0), rotation=0, ptype='wire')
other.ports['Y'] = Port((20, 0), rotation=0, ptype='wire')
p.plug(other, map_in={'A': 'in'}, map_out={'X': 'B', 'Y': 'B'})
assert 'A' not in p.pattern.ports
assert 'B' in p.pattern.ports
assert numpy.allclose(p.pattern.ports['B'].offset, (20, 0))
assert p.pattern.ports['B'].rotation is not None
assert numpy.isclose(p.pattern.ports['B'].rotation, 0)
def test_pather_dead_rename_overwrites_colliding_ports_last_wins() -> None:
p = Pather(Library(), pattern=Pattern(ports={
'A': Port((0, 0), rotation=0),
'B': Port((1, 0), rotation=0),
'C': Port((2, 0), rotation=0),
}))
p.set_dead()
p.rename_ports({'A': 'C', 'B': 'C'})
assert set(p.pattern.ports) == {'C'}
assert numpy.allclose(p.pattern.ports['C'].offset, (1, 0))
def test_pather_jog_failed_fallback_is_atomic() -> None:
lib = Library()
tool = PathTool(layer='M1', width=2, ptype='wire')
p = Pather(lib, tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
with pytest.raises(BuildError, match='shorter than required bend'):
p.jog('A', 1.5, length=1.5)
assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
assert p.pattern.ports['A'].rotation == 0
assert len(p.paths['A']) == 0
def test_pather_jog_accepts_sub_width_offset_when_length_is_sufficient() -> None:
lib = Library()
tool = PathTool(layer='M1', width=2, ptype='wire')
p = Pather(lib, tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.jog('A', 1.5, length=5)
assert numpy.allclose(p.pattern.ports['A'].offset, (-5, -1.5))
assert p.pattern.ports['A'].rotation == 0
assert len(p.paths['A']) == 0
def test_pather_jog_length_solved_from_single_position_bound() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1, ptype='wire')
p = Pather(lib, tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.jog('A', 2, x=-6)
assert numpy.allclose(p.pattern.ports['A'].offset, (-6, -2))
assert p.pattern.ports['A'].rotation is not None
assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
q = Pather(Library(), tools=tool)
q.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
q.jog('A', 2, p=-6)
assert numpy.allclose(q.pattern.ports['A'].offset, (-6, -2))
def test_pather_jog_requires_length_or_one_position_bound() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1, ptype='wire')
p = Pather(lib, tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
with pytest.raises(BuildError, match='requires either length'):
p.jog('A', 2)
with pytest.raises(BuildError, match='exactly one positional bound'):
p.jog('A', 2, x=-6, p=-6)
def test_pather_trace_to_rejects_conflicting_position_bounds() -> None:
tool = PathTool(layer='M1', width=1, ptype='wire')
for kwargs in ({'x': -5, 'y': 2}, {'y': 2, 'x': -5}, {'p': -7, 'x': -5}):
p = Pather(Library(), tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
with pytest.raises(BuildError, match='exactly one positional bound'):
p.trace_to('A', None, **kwargs)
p = Pather(Library(), tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
with pytest.raises(BuildError, match='length cannot be combined'):
p.trace_to('A', None, x=-5, length=3)
def test_pather_trace_rejects_length_with_bundle_bound() -> None:
p = Pather(Library(), tools=PathTool(layer='M1', width=1, ptype='wire'))
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
with pytest.raises(BuildError, match='length cannot be combined'):
p.trace('A', None, length=5, xmin=-100)
@pytest.mark.parametrize('kwargs', ({'xmin': -10, 'xmax': -20}, {'xmax': -20, 'xmin': -10}))
def test_pather_trace_rejects_multiple_bundle_bounds(kwargs: dict[str, int]) -> None:
p = Pather(Library(), tools=PathTool(layer='M1', width=1, ptype='wire'))
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.pattern.ports['B'] = Port((0, 5), rotation=0, ptype='wire')
with pytest.raises(BuildError, match='exactly one bundle bound'):
p.trace(['A', 'B'], None, **kwargs)
def test_pather_jog_rejects_length_with_position_bound() -> None:
p = Pather(Library(), tools=PathTool(layer='M1', width=1, ptype='wire'))
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
with pytest.raises(BuildError, match='length cannot be combined'):
p.jog('A', 2, length=5, x=-999)
@pytest.mark.parametrize('kwargs', ({'x': -999}, {'xmin': -10}))
def test_pather_uturn_rejects_routing_bounds(kwargs: dict[str, int]) -> None:
p = Pather(Library(), tools=PathTool(layer='M1', width=1, ptype='wire'))
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
with pytest.raises(BuildError, match='Unsupported routing bounds for uturn'):
p.uturn('A', 4, **kwargs)
def test_pather_uturn_none_length_defaults_to_zero() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1, ptype='wire')
p = Pather(lib, tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.uturn('A', 4)
assert numpy.allclose(p.pattern.ports['A'].offset, (0, -4))
assert p.pattern.ports['A'].rotation is not None
assert numpy.isclose(p.pattern.ports['A'].rotation, pi)
def test_pather_trace_into_failure_rolls_back_ports_and_paths() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1, ptype='wire')
p = Pather(lib, tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.pattern.ports['B'] = Port((-5, 5), rotation=pi / 2, ptype='wire')
with pytest.raises(BuildError, match='does not match path ptype'):
p.trace_into('A', 'B', plug_destination=False, out_ptype='other')
assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
assert numpy.allclose(p.pattern.ports['B'].offset, (-5, 5))
assert numpy.isclose(p.pattern.ports['B'].rotation, pi / 2)
assert len(p.paths['A']) == 0
def test_pather_trace_into_rename_failure_rolls_back_ports_and_paths() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1, ptype='wire')
p = Pather(lib, tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.pattern.ports['B'] = Port((-10, 0), rotation=pi, ptype='wire')
p.pattern.ports['other'] = Port((3, 4), rotation=0, ptype='wire')
with pytest.raises(PortError, match='overwritten'):
p.trace_into('A', 'B', plug_destination=False, thru='other')
assert set(p.pattern.ports) == {'A', 'B', 'other'}
assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
assert numpy.allclose(p.pattern.ports['B'].offset, (-10, 0))
assert numpy.allclose(p.pattern.ports['other'].offset, (3, 4))
assert len(p.paths['A']) == 0
@pytest.mark.parametrize(
('dst', 'kwargs', 'match'),
(
(Port((-5, 5), rotation=pi / 2, ptype='wire'), {'x': -99}, r'trace_to\(\) arguments: x'),
(Port((-10, 2), rotation=pi, ptype='wire'), {'length': 1}, r'jog\(\) arguments: length'),
(Port((-10, 2), rotation=0, ptype='wire'), {'length': 1}, r'uturn\(\) arguments: length'),
),
)
def test_pather_trace_into_rejects_reserved_route_kwargs(
dst: Port,
kwargs: dict[str, Any],
match: str,
) -> None:
lib = Library()
tool = PathTool(layer='M1', width=1, ptype='wire')
p = Pather(lib, tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.pattern.ports['B'] = dst
with pytest.raises(BuildError, match=match):
p.trace_into('A', 'B', plug_destination=False, **kwargs)
assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
assert numpy.allclose(p.pattern.ports['B'].offset, dst.offset)
assert dst.rotation is not None
assert p.pattern.ports['B'].rotation is not None
assert numpy.isclose(p.pattern.ports['B'].rotation, dst.rotation)
assert len(p.paths['A']) == 0
def test_pather_two_l_fallback_validation_rejects_out_ptype_sensitive_jog() -> None:
class OutPtypeSensitiveTool(Tool):
def planL(self, ccw, length, *, in_ptype=None, out_ptype=None, **kwargs):
radius = 1 if out_ptype is None else 2
if ccw is None:
rotation = pi
jog = 0
elif bool(ccw):
rotation = -pi / 2
jog = radius
else:
rotation = pi / 2
jog = -radius
ptype = out_ptype or in_ptype or 'wire'
return Port((length, jog), rotation=rotation, ptype=ptype), {'ccw': ccw, 'length': length}
p = Pather(Library(), tools=OutPtypeSensitiveTool())
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
with pytest.raises(BuildError, match='fallback via two planL'):
p.jog('A', 5, length=10, out_ptype='wide')
assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
assert len(p.paths['A']) == 0
def test_pather_two_l_fallback_validation_rejects_out_ptype_sensitive_uturn() -> None:
class OutPtypeSensitiveTool(Tool):
def planL(self, ccw, length, *, in_ptype=None, out_ptype=None, **kwargs):
radius = 1 if out_ptype is None else 2
if ccw is None:
rotation = pi
jog = 0
elif bool(ccw):
rotation = -pi / 2
jog = radius
else:
rotation = pi / 2
jog = -radius
ptype = out_ptype or in_ptype or 'wire'
return Port((length, jog), rotation=rotation, ptype=ptype), {'ccw': ccw, 'length': length}
p = Pather(Library(), tools=OutPtypeSensitiveTool())
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
with pytest.raises(BuildError, match='fallback via two planL'):
p.uturn('A', 5, length=10, out_ptype='wide')
assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
assert numpy.isclose(p.pattern.ports['A'].rotation, 0)
assert len(p.paths['A']) == 0
def test_tool_planL_fallback_accepts_custom_port_names() -> None:
class DummyTool(Tool):
def traceL(self, ccw, length, *, in_ptype=None, out_ptype=None, port_names=('A', 'B'), **kwargs) -> Library:
lib = Library()
pat = Pattern()
pat.ports[port_names[0]] = Port((0, 0), 0, ptype='wire')
pat.ports[port_names[1]] = Port((length, 0), pi, ptype='wire')
lib['top'] = pat
return lib
out_port, _ = DummyTool().planL(None, 5, port_names=('X', 'Y'))
assert numpy.allclose(out_port.offset, (5, 0))
assert numpy.isclose(out_port.rotation, pi)
def test_tool_planS_fallback_accepts_custom_port_names() -> None:
class DummyTool(Tool):
def traceS(self, length, jog, *, in_ptype=None, out_ptype=None, port_names=('A', 'B'), **kwargs) -> Library:
lib = Library()
pat = Pattern()
pat.ports[port_names[0]] = Port((0, 0), 0, ptype='wire')
pat.ports[port_names[1]] = Port((length, jog), pi, ptype='wire')
lib['top'] = pat
return lib
out_port, _ = DummyTool().planS(5, 2, port_names=('X', 'Y'))
assert numpy.allclose(out_port.offset, (5, 2))
assert numpy.isclose(out_port.rotation, pi)
def test_pather_uturn_failed_fallback_is_atomic() -> None:
lib = Library()
tool = PathTool(layer='M1', width=2, ptype='wire')
p = Pather(lib, tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
with pytest.raises(BuildError, match='shorter than required bend'):
p.uturn('A', 1.5, length=0)
assert numpy.allclose(p.pattern.ports['A'].offset, (0, 0))
assert p.pattern.ports['A'].rotation == 0
assert len(p.paths['A']) == 0
def test_pather_render_auto_renames_single_use_tool_children() -> None:
class FullTreeTool(Tool):
def planL(self, ccw, length, *, in_ptype=None, out_ptype=None, **kwargs): # noqa: ANN001,ANN202
ptype = out_ptype or in_ptype or 'wire'
return Port((length, 0), rotation=pi, ptype=ptype), {'length': length}
def render(self, batch, *, port_names=('A', 'B'), **kwargs) -> Library: # noqa: ANN001,ANN202
tree = Library()
top = Pattern(ports={
port_names[0]: Port((0, 0), 0, ptype='wire'),
port_names[1]: Port((1, 0), pi, ptype='wire'),
})
child = Pattern(annotations={'batch': [len(batch)]})
top.ref('_seg')
tree['_top'] = top
tree['_seg'] = child
return tree
lib = Library()
p = Pather(lib, tools=FullTreeTool(), auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.straight('A', 10)
p.render()
p.straight('A', 10)
p.render()
assert len(lib) == 2
assert set(lib.keys()) == set(p.pattern.refs.keys())
assert len(set(p.pattern.refs.keys())) == 2
assert all(name.startswith('_seg') for name in lib)
assert p.pattern.referenced_patterns() <= set(lib.keys())
def test_tool_render_fallback_preserves_segment_subtrees() -> None:
class TraceTreeTool(Tool):
def traceL(self, ccw, length, *, in_ptype=None, out_ptype=None, port_names=('A', 'B'), **kwargs) -> Library: # noqa: ANN001
tree = Library()
top = Pattern(ports={
port_names[0]: Port((0, 0), 0, ptype='wire'),
port_names[1]: Port((length, 0), pi, ptype='wire'),
})
child = Pattern(annotations={'length': [length]})
top.ref('_seg')
tree['_top'] = top
tree['_seg'] = child
return tree
lib = Library()
p = Pather(lib, tools=TraceTreeTool(), auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.straight('A', 10)
p.render()
assert '_seg' in lib
assert '_seg' in p.pattern.refs
assert p.pattern.referenced_patterns() <= set(lib.keys())
def test_pather_render_rejects_missing_single_use_tool_refs() -> None:
class MissingSingleUseTool(Tool):
def planL(self, ccw, length, *, in_ptype=None, out_ptype=None, **kwargs): # noqa: ANN001,ANN202
ptype = out_ptype or in_ptype or 'wire'
return Port((length, 0), rotation=pi, ptype=ptype), {'length': length}
def render(self, batch, *, port_names=('A', 'B'), **kwargs) -> Library: # noqa: ANN001,ANN202
tree = Library()
top = Pattern(ports={
port_names[0]: Port((0, 0), 0, ptype='wire'),
port_names[1]: Port((1, 0), pi, ptype='wire'),
})
top.ref('_seg')
tree['_top'] = top
return tree
lib = Library()
lib['_seg'] = Pattern(annotations={'stale': [1]})
p = Pather(lib, tools=MissingSingleUseTool(), auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.straight('A', 10)
with pytest.raises(BuildError, match='missing single-use refs'):
p.render()
assert list(lib.keys()) == ['_seg']
assert not p.pattern.refs
def test_pather_render_allows_missing_non_single_use_tool_refs() -> None:
class SharedRefTool(Tool):
def planL(self, ccw, length, *, in_ptype=None, out_ptype=None, **kwargs): # noqa: ANN001,ANN202
ptype = out_ptype or in_ptype or 'wire'
return Port((length, 0), rotation=pi, ptype=ptype), {'length': length}
def render(self, batch, *, port_names=('A', 'B'), **kwargs) -> Library: # noqa: ANN001,ANN202
tree = Library()
top = Pattern(ports={
port_names[0]: Port((0, 0), 0, ptype='wire'),
port_names[1]: Port((1, 0), pi, ptype='wire'),
})
top.ref('shared')
tree['_top'] = top
return tree
lib = Library()
lib['shared'] = Pattern(annotations={'shared': [1]})
p = Pather(lib, tools=SharedRefTool(), auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0, ptype='wire')
p.straight('A', 10)
p.render()
assert 'shared' in p.pattern.refs
assert p.pattern.referenced_patterns() <= set(lib.keys())
def test_renderpather_rename_to_none_keeps_pending_geometry_without_port() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
rp = Pather(lib, tools=tool, auto_render=False)
rp.pattern.ports['A'] = Port((0, 0), rotation=0)
rp.at('A').straight(5000)
rp.rename_ports({'A': None})
assert 'A' not in rp.pattern.ports
assert len(rp.paths['A']) == 1
rp.render()
assert rp.pattern.has_shapes()
assert 'A' not in rp.pattern.ports
def test_pather_place_treeview_resolves_once() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
p = Pather(lib, tools=tool)
tree = {'child': Pattern(ports={'B': Port((1, 0), pi)})}
p.place(tree)
assert len(lib) == 1
assert 'child' in lib
assert 'child' in p.pattern.refs
assert 'B' in p.pattern.ports
def test_pather_plug_treeview_resolves_once() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
p = Pather(lib, tools=tool)
p.pattern.ports['A'] = Port((0, 0), rotation=0)
tree = {'child': Pattern(ports={'B': Port((0, 0), pi)})}
p.plug(tree, {'A': 'B'})
assert len(lib) == 1
assert 'child' in lib
assert 'child' in p.pattern.refs
assert 'A' not in p.pattern.ports
def test_pather_failed_plug_does_not_add_break_marker() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
p = Pather(lib, tools=tool, auto_render=False)
p.pattern.annotations = {'k': [1]}
p.pattern.ports['A'] = Port((0, 0), rotation=0)
p.at('A').trace(None, 5000)
assert [step.opcode for step in p.paths['A']] == ['L']
other = Pattern(
annotations={'k': [2]},
ports={'X': Port((0, 0), pi), 'Y': Port((5, 0), 0)},
)
with pytest.raises(PatternError, match='Annotation keys overlap'):
p.plug(other, {'A': 'X'}, map_out={'Y': 'Z'}, append=True)
assert [step.opcode for step in p.paths['A']] == ['L']
assert set(p.pattern.ports) == {'A'}
def test_pather_place_reused_deleted_name_keeps_break_marker() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
p = Pather(lib, tools=tool, auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0)
p.at('A').straight(5000)
p.rename_ports({'A': None})
other = Pattern(ports={'X': Port((-5000, 0), rotation=0)})
p.place(other, port_map={'X': 'A'}, append=True)
p.at('A').straight(2000)
assert [step.opcode for step in p.paths['A']] == ['L', 'P', 'L']
p.render()
assert p.pattern.has_shapes()
assert 'A' in p.pattern.ports
assert numpy.allclose(p.pattern.ports['A'].offset, (-7000, 0))
def test_pather_plug_reused_deleted_name_keeps_break_marker() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
p = Pather(lib, tools=tool, auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0)
p.pattern.ports['B'] = Port((0, 0), rotation=0)
p.at('A').straight(5000)
p.rename_ports({'A': None})
other = Pattern(
ports={
'X': Port((0, 0), rotation=pi),
'Y': Port((-5000, 0), rotation=0),
},
)
p.plug(other, {'B': 'X'}, map_out={'Y': 'A'}, append=True)
p.at('A').straight(2000)
assert [step.opcode for step in p.paths['A']] == ['L', 'P', 'L']
p.render()
assert p.pattern.has_shapes()
assert 'A' in p.pattern.ports
assert 'B' not in p.pattern.ports
assert numpy.allclose(p.pattern.ports['A'].offset, (-7000, 0))
def test_pather_failed_plugged_does_not_add_break_marker() -> None:
lib = Library()
tool = PathTool(layer='M1', width=1000)
p = Pather(lib, tools=tool, auto_render=False)
p.pattern.ports['A'] = Port((0, 0), rotation=0)
p.at('A').straight(5000)
assert [step.opcode for step in p.paths['A']] == ['L']
with pytest.raises(PortError, match='Connection destination ports were not found'):
p.plugged({'A': 'missing'})
assert [step.opcode for step in p.paths['A']] == ['L']
assert set(p.paths) == {'A'}

masque/test/test_pattern.py (new file, +310 lines)
import pytest
import copy
from typing import cast
from numpy.testing import assert_equal, assert_allclose
from numpy import pi
from ..error import PatternError
from ..abstract import Abstract
from ..pattern import Pattern
from ..shapes import Polygon
from ..ref import Ref
from ..ports import Port, PortError
from ..label import Label
from ..repetition import Grid
def test_pattern_init() -> None:
pat = Pattern()
assert pat.is_empty()
assert not pat.has_shapes()
assert not pat.has_refs()
assert not pat.has_labels()
assert not pat.has_ports()
def test_pattern_with_elements() -> None:
poly = Polygon.square(10)
label = Label("test", offset=(5, 5))
ref = Ref(offset=(100, 100))
port = Port((0, 0), 0)
pat = Pattern(shapes={(1, 0): [poly]}, labels={(1, 2): [label]}, refs={"sub": [ref]}, ports={"P1": port})
assert pat.has_shapes()
assert pat.has_labels()
assert pat.has_refs()
assert pat.has_ports()
assert not pat.is_empty()
assert pat.shapes[(1, 0)] == [poly]
assert pat.labels[(1, 2)] == [label]
assert pat.refs["sub"] == [ref]
assert pat.ports["P1"] == port
def test_pattern_append() -> None:
pat1 = Pattern()
pat1.polygon((1, 0), vertices=[[0, 0], [1, 0], [1, 1]])
pat2 = Pattern()
pat2.polygon((2, 0), vertices=[[10, 10], [11, 10], [11, 11]])
pat1.append(pat2)
assert len(pat1.shapes[(1, 0)]) == 1
assert len(pat1.shapes[(2, 0)]) == 1
def test_pattern_translate() -> None:
pat = Pattern()
pat.polygon((1, 0), vertices=[[0, 0], [1, 0], [1, 1]])
pat.ports["P1"] = Port((5, 5), 0)
pat.translate_elements((10, 20))
# Polygon.translate adds to vertices, and offset is always (0,0)
assert_equal(cast("Polygon", pat.shapes[(1, 0)][0]).vertices[0], [10, 20])
assert_equal(pat.ports["P1"].offset, [15, 25])
def test_pattern_scale() -> None:
pat = Pattern()
# Polygon.rect sets an offset in its constructor which is immediately translated into vertices
pat.rect((1, 0), xmin=0, xmax=1, ymin=0, ymax=1)
pat.scale_by(2)
# Vertices should be scaled
assert_equal(cast("Polygon", pat.shapes[(1, 0)][0]).vertices, [[0, 0], [0, 2], [2, 2], [2, 0]])
def test_pattern_rotate() -> None:
pat = Pattern()
pat.polygon((1, 0), vertices=[[10, 0], [11, 0], [10, 1]])
# Rotate 90 degrees CCW around (0,0)
pat.rotate_around((0, 0), pi / 2)
# [10, 0] rotated 90 deg around (0,0) is [0, 10]
assert_allclose(cast("Polygon", pat.shapes[(1, 0)][0]).vertices[0], [0, 10], atol=1e-10)
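The expected vertex in the rotation test above follows from the standard 2D rotation matrix; a standalone numpy check (no masque dependency):

```python
import numpy

# Standard CCW rotation matrix for theta = 90 degrees.
theta = numpy.pi / 2
rot = numpy.array([
    [numpy.cos(theta), -numpy.sin(theta)],
    [numpy.sin(theta), numpy.cos(theta)],
])
vertex = numpy.array([10.0, 0.0])
# [10, 0] rotated 90 degrees CCW about the origin lands at [0, 10]
print(rot @ vertex)
```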
def test_pattern_mirror() -> None:
pat = Pattern()
pat.polygon((1, 0), vertices=[[10, 5], [11, 5], [10, 6]])
# Mirror across X axis (y -> -y)
pat.mirror(0)
assert_equal(cast("Polygon", pat.shapes[(1, 0)][0]).vertices[0], [10, -5])
def test_pattern_get_bounds() -> None:
pat = Pattern()
pat.polygon((1, 0), vertices=[[0, 0], [10, 0], [10, 10]])
pat.polygon((1, 0), vertices=[[-5, -5], [5, -5], [5, 5]])
bounds = pat.get_bounds()
assert_equal(bounds, [[-5, -5], [10, 10]])
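The expected bounds in the test above are just the elementwise min/max over all polygon vertices; a standalone numpy sketch of the same computation:

```python
import numpy

# All vertices from both polygons in the test, stacked together.
verts = numpy.array([
    [0, 0], [10, 0], [10, 10],    # first polygon
    [-5, -5], [5, -5], [5, 5],    # second polygon
])
# Bounds are [[xmin, ymin], [xmax, ymax]] over every vertex.
bounds = numpy.stack([verts.min(axis=0), verts.max(axis=0)])
print(bounds)
```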
def test_pattern_flatten_preserves_ports_only_child() -> None:
child = Pattern(ports={"P1": Port((1, 2), 0)})
parent = Pattern()
parent.ref("child", offset=(10, 10))
parent.flatten({"child": child}, flatten_ports=True)
assert set(parent.ports) == {"P1"}
assert parent.ports["P1"].rotation == 0
assert tuple(parent.ports["P1"].offset) == (11.0, 12.0)
def test_pattern_flatten_repeated_ref_with_ports_raises() -> None:
child = Pattern(ports={"P1": Port((1, 2), 0)})
child.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]])
parent = Pattern()
parent.ref("child", repetition=Grid(a_vector=(10, 0), a_count=2))
with pytest.raises(PatternError, match='Cannot flatten ports from repeated ref'):
parent.flatten({"child": child}, flatten_ports=True)
def test_pattern_place_requires_abstract_for_reference() -> None:
parent = Pattern()
child = Pattern()
with pytest.raises(PatternError, match='Must provide an `Abstract`'):
parent.place(child)
assert not parent.ports
def test_pattern_place_append_requires_pattern_atomically() -> None:
parent = Pattern()
child = Abstract("child", {"A": Port((1, 2), 0)})
with pytest.raises(PatternError, match='Must provide a full `Pattern`'):
parent.place(child, append=True)
assert not parent.ports
def test_pattern_place_append_annotation_conflict_is_atomic() -> None:
parent = Pattern(annotations={"k": [1]})
child = Pattern(annotations={"k": [2]}, ports={"A": Port((1, 2), 0)})
with pytest.raises(PatternError, match="Annotation keys overlap"):
parent.place(child, append=True)
assert not parent.ports
assert parent.annotations == {"k": [1]}
def test_pattern_place_skip_geometry_overwrites_colliding_ports_last_wins() -> None:
parent = Pattern(ports={
"A": Port((5, 5), 0),
"keep": Port((9, 9), 0),
})
child = Pattern(ports={
"X": Port((1, 0), 0),
"Y": Port((2, 0), pi / 2),
})
parent.place(child, port_map={"X": "A", "Y": "A"}, skip_geometry=True, append=True)
assert set(parent.ports) == {"A", "keep"}
assert_allclose(parent.ports["A"].offset, (2, 0))
assert parent.ports["A"].rotation is not None
assert_allclose(parent.ports["A"].rotation, pi / 2)
def test_pattern_interface() -> None:
source = Pattern()
source.ports["A"] = Port((10, 20), 0, ptype="test")
iface = Pattern.interface(source, in_prefix="in_", out_prefix="out_")
assert "in_A" in iface.ports
assert "out_A" in iface.ports
assert iface.ports["in_A"].rotation is not None
assert_allclose(iface.ports["in_A"].rotation, pi, atol=1e-10)
assert iface.ports["out_A"].rotation is not None
assert_allclose(iface.ports["out_A"].rotation, 0, atol=1e-10)
assert iface.ports["in_A"].ptype == "test"
assert iface.ports["out_A"].ptype == "test"
def test_pattern_interface_duplicate_port_map_targets_raise() -> None:
source = Pattern()
source.ports["A"] = Port((10, 20), 0)
source.ports["B"] = Port((30, 40), pi)
with pytest.raises(PortError, match='Duplicate targets in `port_map`'):
Pattern.interface(source, port_map={"A": "X", "B": "X"})
def test_pattern_interface_empty_port_map_copies_no_ports() -> None:
source = Pattern()
source.ports["A"] = Port((10, 20), 0)
source.ports["B"] = Port((30, 40), pi)
assert not Pattern.interface(source, port_map={}).ports
assert not Pattern.interface(source, port_map=[]).ports
def test_pattern_plug_requires_abstract_for_reference_atomically() -> None:
parent = Pattern(ports={"X": Port((0, 0), 0)})
child = Pattern(ports={"A": Port((0, 0), pi)})
with pytest.raises(PatternError, match='Must provide an `Abstract`'):
parent.plug(child, {"X": "A"})
assert set(parent.ports) == {"X"}
def test_pattern_plug_append_annotation_conflict_is_atomic() -> None:
parent = Pattern(
annotations={"k": [1]},
ports={"X": Port((0, 0), 0), "Q": Port((9, 9), 0)},
)
child = Pattern(
annotations={"k": [2]},
ports={"A": Port((0, 0), pi), "B": Port((5, 0), 0)},
)
with pytest.raises(PatternError, match="Annotation keys overlap"):
parent.plug(child, {"X": "A"}, map_out={"B": "Y"}, append=True)
assert set(parent.ports) == {"X", "Q"}
assert_allclose(parent.ports["X"].offset, (0, 0))
assert_allclose(parent.ports["Q"].offset, (9, 9))
assert parent.annotations == {"k": [1]}
def test_pattern_plug_skip_geometry_overwrites_colliding_ports_last_wins() -> None:
parent = Pattern(ports={
"A": Port((0, 0), 0, ptype="wire"),
"B": Port((99, 99), 0, ptype="wire"),
})
child = Pattern(ports={
"in": Port((0, 0), pi, ptype="wire"),
"X": Port((10, 0), 0, ptype="wire"),
"Y": Port((20, 0), 0, ptype="wire"),
})
parent.plug(child, {"A": "in"}, map_out={"X": "B", "Y": "B"}, skip_geometry=True, append=True)
assert "A" not in parent.ports
assert "B" in parent.ports
assert_allclose(parent.ports["B"].offset, (20, 0))
assert parent.ports["B"].rotation is not None
assert_allclose(parent.ports["B"].rotation, 0)
def test_pattern_append_port_conflict_is_atomic() -> None:
pat1 = Pattern()
pat1.ports["A"] = Port((0, 0), 0)
pat2 = Pattern()
pat2.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]])
pat2.ports["A"] = Port((1, 0), 0)
with pytest.raises(PatternError, match="Port names overlap"):
pat1.append(pat2)
assert not pat1.shapes
assert set(pat1.ports) == {"A"}
def test_pattern_append_annotation_conflict_is_atomic() -> None:
pat1 = Pattern(annotations={"k": [1]})
pat2 = Pattern(annotations={"k": [2]})
pat2.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]])
with pytest.raises(PatternError, match="Annotation keys overlap"):
pat1.append(pat2)
assert not pat1.shapes
assert pat1.annotations == {"k": [1]}
def test_pattern_deepcopy_does_not_share_shape_repetitions() -> None:
pat = Pattern()
pat.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]], repetition=Grid(a_vector=(10, 0), a_count=2))
pat2 = copy.deepcopy(pat)
pat2.scale_by(2)
assert_allclose(cast("Polygon", pat.shapes[(1, 0)][0]).repetition.a_vector, [10, 0])
assert_allclose(cast("Polygon", pat2.shapes[(1, 0)][0]).repetition.a_vector, [20, 0])
def test_pattern_flatten_does_not_mutate_child_repetitions() -> None:
child = Pattern()
child.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]], repetition=Grid(a_vector=(10, 0), a_count=2))
parent = Pattern()
parent.ref("child", scale=2)
parent.flatten({"child": child})
assert_allclose(cast("Polygon", child.shapes[(1, 0)][0]).repetition.a_vector, [10, 0])

masque/test/test_polygon.py (new file, 125 lines)
import pytest
import numpy
from numpy.testing import assert_equal
from ..shapes import Polygon
from ..utils import R90
from ..error import PatternError
@pytest.fixture
def polygon() -> Polygon:
return Polygon([[0, 0], [1, 0], [1, 1], [0, 1]])
def test_vertices(polygon: Polygon) -> None:
assert_equal(polygon.vertices, [[0, 0], [1, 0], [1, 1], [0, 1]])
def test_xs(polygon: Polygon) -> None:
assert_equal(polygon.xs, [0, 1, 1, 0])
def test_ys(polygon: Polygon) -> None:
assert_equal(polygon.ys, [0, 0, 1, 1])
def test_offset(polygon: Polygon) -> None:
assert_equal(polygon.offset, [0, 0])
def test_square() -> None:
square = Polygon.square(1)
assert_equal(square.vertices, [[-0.5, -0.5], [-0.5, 0.5], [0.5, 0.5], [0.5, -0.5]])
def test_rectangle() -> None:
rectangle = Polygon.rectangle(1, 2)
assert_equal(rectangle.vertices, [[-0.5, -1], [-0.5, 1], [0.5, 1], [0.5, -1]])
def test_rect() -> None:
rect1 = Polygon.rect(xmin=0, xmax=1, ymin=-1, ymax=1)
assert_equal(rect1.vertices, [[0, -1], [0, 1], [1, 1], [1, -1]])
rect2 = Polygon.rect(xmin=0, lx=1, ymin=-1, ly=2)
assert_equal(rect2.vertices, [[0, -1], [0, 1], [1, 1], [1, -1]])
rect3 = Polygon.rect(xctr=0, lx=1, yctr=-2, ly=2)
assert_equal(rect3.vertices, [[-0.5, -3], [-0.5, -1], [0.5, -1], [0.5, -3]])
rect4 = Polygon.rect(xctr=0, xmax=1, yctr=-2, ymax=0)
assert_equal(rect4.vertices, [[-1, -4], [-1, 0], [1, 0], [1, -4]])
with pytest.raises(PatternError):
Polygon.rect(xctr=0, yctr=-2, ymax=0)
with pytest.raises(PatternError):
Polygon.rect(xmin=0, yctr=-2, ymax=0)
with pytest.raises(PatternError):
Polygon.rect(xmax=0, yctr=-2, ymax=0)
with pytest.raises(PatternError):
Polygon.rect(lx=0, yctr=-2, ymax=0)
with pytest.raises(PatternError):
Polygon.rect(yctr=0, xctr=-2, xmax=0)
with pytest.raises(PatternError):
Polygon.rect(ymin=0, xctr=-2, xmax=0)
with pytest.raises(PatternError):
Polygon.rect(ymax=0, xctr=-2, xmax=0)
with pytest.raises(PatternError):
Polygon.rect(ly=0, xctr=-2, xmax=0)
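The center/extent arithmetic asserted for `rect4` can be checked by hand: given a center and a max on an axis, the min follows as `min = 2*ctr - max`. A quick sanity check of that arithmetic in plain Python (independent of masque):

```python
# For Polygon.rect(xctr=0, xmax=1, yctr=-2, ymax=0):
# a center and a max determine the min via min = 2*ctr - max.
xctr, xmax = 0.0, 1.0
yctr, ymax = -2.0, 0.0
xmin = 2 * xctr - xmax
ymin = 2 * yctr - ymax
# Matches the rect4 assertion: corners span x in [-1, 1], y in [-4, 0]
assert (xmin, xmax, ymin, ymax) == (-1.0, 1.0, -4.0, 0.0)
```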
def test_octagon() -> None:
octagon = Polygon.octagon(side_length=1) # regular=True
assert_equal(octagon.vertices.shape, (8, 2))
diff = octagon.vertices - numpy.roll(octagon.vertices, -1, axis=0)
side_len = numpy.sqrt((diff * diff).sum(axis=1))
assert numpy.allclose(side_len, 1)
def test_to_polygons(polygon: Polygon) -> None:
assert polygon.to_polygons() == [polygon]
def test_get_bounds_single(polygon: Polygon) -> None:
assert_equal(polygon.get_bounds_single(), [[0, 0], [1, 1]])
def test_rotate(polygon: Polygon) -> None:
rotated_polygon = polygon.rotate(R90)
assert_equal(rotated_polygon.vertices, [[0, 0], [0, 1], [-1, 1], [-1, 0]])
def test_mirror(polygon: Polygon) -> None:
mirrored_by_y = polygon.deepcopy().mirror(1)
assert_equal(mirrored_by_y.vertices, [[0, 0], [-1, 0], [-1, 1], [0, 1]])
mirrored_by_x = polygon.deepcopy().mirror(0)
assert_equal(mirrored_by_x.vertices, [[0, 0], [1, 0], [1, -1], [0, -1]])
def test_scale_by(polygon: Polygon) -> None:
scaled_polygon = polygon.scale_by(2)
assert_equal(scaled_polygon.vertices, [[0, 0], [2, 0], [2, 2], [0, 2]])
def test_clean_vertices(polygon: Polygon) -> None:
polygon = Polygon([[0, 0], [1, 1], [2, 2], [2, 2], [2, -4], [2, 0], [0, 0]]).clean_vertices()
assert_equal(polygon.vertices, [[0, 0], [2, 2], [2, 0]])
def test_remove_duplicate_vertices() -> None:
polygon = Polygon([[0, 0], [1, 1], [2, 2], [2, 2], [2, 0], [0, 0]]).remove_duplicate_vertices()
assert_equal(polygon.vertices, [[0, 0], [1, 1], [2, 2], [2, 0]])
def test_remove_colinear_vertices() -> None:
polygon = Polygon([[0, 0], [1, 1], [2, 2], [2, 2], [2, 0], [0, 0]]).remove_colinear_vertices()
assert_equal(polygon.vertices, [[0, 0], [2, 2], [2, 0]])
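A minimal sketch of the behavior these assertions imply (an assumption, not masque's actual implementation): drop consecutive duplicates and a closing vertex equal to the first, then drop any vertex whose cyclic neighbors are colinear with it:

```python
import numpy

def remove_colinear_sketch(vertices):
    """Sketch (assumption): dedupe consecutive vertices, drop a wrap-around
    duplicate of the first vertex, then remove colinear vertices."""
    pts: list[numpy.ndarray] = []
    for v in numpy.asarray(vertices, dtype=float):
        if not pts or not numpy.array_equal(v, pts[-1]):
            pts.append(v)
    if len(pts) > 1 and numpy.array_equal(pts[0], pts[-1]):
        pts.pop()
    out = []
    for i, b in enumerate(pts):
        a, c = pts[i - 1], pts[(i + 1) % len(pts)]
        # 2D cross product of (b - a) and (c - b); zero means colinear
        cross = (b[0] - a[0]) * (c[1] - b[1]) - (b[1] - a[1]) * (c[0] - b[0])
        if cross != 0:
            out.append(b)
    return numpy.array(out)

verts = remove_colinear_sketch([[0, 0], [1, 1], [2, 2], [2, 2], [2, 0], [0, 0]])
assert numpy.array_equal(verts, [[0, 0], [2, 2], [2, 0]])
```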
def test_vertices_dtype() -> None:
polygon = Polygon(numpy.array([[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]], dtype=numpy.int32))
polygon.scale_by(0.5)
assert_equal(polygon.vertices, [[0, 0], [0.5, 0], [0.5, 0.5], [0, 0.5], [0, 0]])

masque/test/test_ports.py (new file, 293 lines)
import pytest
from numpy.testing import assert_equal, assert_allclose
from numpy import pi
from ..ports import Port, PortList
from ..error import PortError
from ..pattern import Pattern
def test_port_init() -> None:
p = Port(offset=(10, 20), rotation=pi / 2, ptype="test")
assert_equal(p.offset, [10, 20])
assert p.rotation == pi / 2
assert p.ptype == "test"
def test_port_transform() -> None:
p = Port(offset=(10, 0), rotation=0)
p.rotate_around((0, 0), pi / 2)
assert_allclose(p.offset, [0, 10], atol=1e-10)
assert p.rotation is not None
assert_allclose(p.rotation, pi / 2, atol=1e-10)
p.mirror(0) # Mirror across x axis (axis 0): in-place relative to offset
assert_allclose(p.offset, [0, 10], atol=1e-10)
# rotation was pi/2 (90 deg), mirror across x (0 deg) -> -pi/2 == 3pi/2
assert p.rotation is not None
assert_allclose(p.rotation, 3 * pi / 2, atol=1e-10)
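The angle bookkeeping asserted above follows the usual reflection rule. A hypothetical helper (not part of masque) showing that arithmetic: reflecting across the x axis (axis 0) maps an angle phi to -phi, and across the y axis (axis 1) to pi - phi, normalized to [0, 2*pi):

```python
import numpy

def mirrored_rotation(rotation: float, axis: int) -> float:
    """Hypothetical helper: angle of a direction after reflection across a
    coordinate axis (axis=0 is the x axis, axis=1 is the y axis)."""
    reflected = -rotation if axis == 0 else numpy.pi - rotation
    return reflected % (2 * numpy.pi)

# pi/2 mirrored across the x axis -> 3*pi/2, matching the assertion above
assert numpy.isclose(mirrored_rotation(numpy.pi / 2, axis=0), 3 * numpy.pi / 2)
# 0 mirrored across the y axis -> pi, matching test_port_flip_across
assert numpy.isclose(mirrored_rotation(0.0, axis=1), numpy.pi)
```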
def test_port_flip_across() -> None:
p = Port(offset=(10, 0), rotation=0)
p.flip_across(axis=1) # Mirror across x=0: flips x-offset
assert_equal(p.offset, [-10, 0])
# rotation was 0, mirrored(1) -> pi
assert p.rotation is not None
assert_allclose(p.rotation, pi, atol=1e-10)
def test_port_measure_travel() -> None:
p1 = Port((0, 0), 0)
p2 = Port((10, 5), pi) # Facing each other
(travel, jog), rotation = p1.measure_travel(p2)
assert travel == 10
assert jog == 5
assert rotation == pi
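One decomposition consistent with these assertions (an assumption about the semantics, not masque's implementation): express the displacement in the first port's frame to get travel and jog, and report the angle between the ports:

```python
import numpy

def measure_travel_sketch(o1, r1, o2, r2):
    """Sketch (assumption): travel/jog are the displacement o2 - o1 rotated
    into the frame of the first port; rotation is the relative port angle."""
    c, s = numpy.cos(-r1), numpy.sin(-r1)
    dx, dy = numpy.asarray(o2, dtype=float) - numpy.asarray(o1, dtype=float)
    travel = c * dx - s * dy
    jog = s * dx + c * dy
    rotation = (r2 - r1) % (2 * numpy.pi)
    return (travel, jog), rotation

(travel, jog), rotation = measure_travel_sketch((0, 0), 0.0, (10, 5), numpy.pi)
assert numpy.isclose(travel, 10) and numpy.isclose(jog, 5)
assert numpy.isclose(rotation, numpy.pi)
```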
def test_port_list_measure_travel() -> None:
class MyPorts(PortList):
def __init__(self) -> None:
self._ports = {
"A": Port((0, 0), 0),
"B": Port((10, 5), pi),
}
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, val: dict[str, Port]) -> None:
self._ports = val
pl = MyPorts()
(travel, jog), rotation = pl.measure_travel("A", "B")
assert travel == 10
assert jog == 5
assert rotation == pi
def test_port_describe_any_rotation() -> None:
p = Port((0, 0), None)
assert p.describe() == "pos=(0, 0), rot=any"
def test_port_list_rename() -> None:
class MyPorts(PortList):
def __init__(self) -> None:
self._ports = {"A": Port((0, 0), 0)}
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, val: dict[str, Port]) -> None:
self._ports = val
pl = MyPorts()
pl.rename_ports({"A": "B"})
assert "A" not in pl.ports
assert "B" in pl.ports
def test_port_list_rename_missing_port_raises() -> None:
class MyPorts(PortList):
def __init__(self) -> None:
self._ports = {"A": Port((0, 0), 0)}
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, val: dict[str, Port]) -> None:
self._ports = val
pl = MyPorts()
with pytest.raises(PortError, match="Ports to rename were not found"):
pl.rename_ports({"missing": "B"})
assert set(pl.ports) == {"A"}
def test_port_list_rename_colliding_targets_raises() -> None:
class MyPorts(PortList):
def __init__(self) -> None:
self._ports = {"A": Port((0, 0), 0), "B": Port((1, 0), 0)}
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, val: dict[str, Port]) -> None:
self._ports = val
pl = MyPorts()
with pytest.raises(PortError, match="Renamed ports would collide"):
pl.rename_ports({"A": "C", "B": "C"})
assert set(pl.ports) == {"A", "B"}
def test_port_list_add_port_pair_requires_distinct_names() -> None:
class MyPorts(PortList):
def __init__(self) -> None:
self._ports: dict[str, Port] = {}
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, val: dict[str, Port]) -> None:
self._ports = val
pl = MyPorts()
with pytest.raises(PortError, match="Port names must be distinct"):
pl.add_port_pair(names=("A", "A"))
assert not pl.ports
def test_port_list_plugged() -> None:
class MyPorts(PortList):
def __init__(self) -> None:
self._ports = {"A": Port((10, 10), 0), "B": Port((10, 10), pi)}
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, val: dict[str, Port]) -> None:
self._ports = val
pl = MyPorts()
pl.plugged({"A": "B"})
assert not pl.ports # Both should be removed
def test_port_list_plugged_empty_raises() -> None:
class MyPorts(PortList):
def __init__(self) -> None:
self._ports = {"A": Port((10, 10), 0), "B": Port((10, 10), pi)}
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, val: dict[str, Port]) -> None:
self._ports = val
pl = MyPorts()
with pytest.raises(PortError, match="Must provide at least one port connection"):
pl.plugged({})
assert set(pl.ports) == {"A", "B"}
def test_port_list_plugged_missing_port_raises() -> None:
class MyPorts(PortList):
def __init__(self) -> None:
self._ports = {"A": Port((10, 10), 0), "B": Port((10, 10), pi)}
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, val: dict[str, Port]) -> None:
self._ports = val
pl = MyPorts()
with pytest.raises(PortError, match="Connection source ports were not found"):
pl.plugged({"missing": "B"})
assert set(pl.ports) == {"A", "B"}
def test_port_list_plugged_reused_port_raises_atomically() -> None:
class MyPorts(PortList):
def __init__(self) -> None:
self._ports = {"A": Port((0, 0), None), "B": Port((0, 0), None), "C": Port((0, 0), None)}
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, val: dict[str, Port]) -> None:
self._ports = val
for connections in ({"A": "A"}, {"A": "B", "C": "B"}):
pl = MyPorts()
with pytest.raises(PortError, match="Each port may appear in at most one connection"):
pl.plugged(connections)
assert set(pl.ports) == {"A", "B", "C"}
pl = MyPorts()
with pytest.raises(PortError, match="Connection destination ports were not found"):
pl.plugged({"A": "missing"})
assert set(pl.ports) == {"A", "B", "C"}
def test_port_list_plugged_mismatch() -> None:
class MyPorts(PortList):
def __init__(self) -> None:
self._ports = {
"A": Port((10, 10), 0),
"B": Port((11, 10), pi), # Offset mismatch
}
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, val: dict[str, Port]) -> None:
self._ports = val
pl = MyPorts()
with pytest.raises(PortError):
pl.plugged({"A": "B"})
def test_port_list_check_ports_duplicate_map_in_values_raise() -> None:
class MyPorts(PortList):
def __init__(self) -> None:
self._ports = {"A": Port((0, 0), 0), "B": Port((0, 0), 0)}
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, val: dict[str, Port]) -> None:
self._ports = val
pl = MyPorts()
with pytest.raises(PortError, match="Duplicate values in `map_in`"):
pl.check_ports({"X", "Y"}, map_in={"A": "X", "B": "X"})
assert set(pl.ports) == {"A", "B"}
def test_pattern_plug_rejects_map_out_on_connected_ports_atomically() -> None:
host = Pattern(ports={"A": Port((0, 0), 0)})
other = Pattern(ports={"X": Port((0, 0), pi), "Y": Port((5, 0), 0)})
with pytest.raises(PortError, match="`map_out` keys conflict with connected ports"):
host.plug(other, {"A": "X"}, map_out={"X": "renamed", "Y": "out"}, append=True)
assert set(host.ports) == {"A"}
def test_find_transform_requires_connection_map() -> None:
host = Pattern(ports={"A": Port((0, 0), 0)})
other = Pattern(ports={"X": Port((0, 0), pi)})
with pytest.raises(PortError, match="at least one port connection"):
host.find_transform(other, {})
with pytest.raises(PortError, match="at least one port connection"):
Pattern.find_port_transform({}, {}, {})

(new file, 132 lines)
import numpy
import pytest
from numpy.testing import assert_allclose
from ..utils.ports2data import ports_to_data, data_to_ports
from ..pattern import Pattern
from ..ports import Port
from ..library import Library
from ..error import PortError
from ..repetition import Grid
def test_ports2data_roundtrip() -> None:
pat = Pattern()
pat.ports["P1"] = Port((10, 20), numpy.pi / 2, ptype="test")
layer = (10, 0)
ports_to_data(pat, layer)
assert len(pat.labels[layer]) == 1
assert pat.labels[layer][0].string == "P1:test 90"
assert tuple(pat.labels[layer][0].offset) == (10.0, 20.0)
# New pattern, read ports back
pat2 = Pattern()
pat2.labels[layer] = pat.labels[layer]
data_to_ports([layer], {}, pat2)
assert "P1" in pat2.ports
assert_allclose(pat2.ports["P1"].offset, [10, 20], atol=1e-10)
assert pat2.ports["P1"].rotation is not None
assert_allclose(pat2.ports["P1"].rotation, numpy.pi / 2, atol=1e-10)
assert pat2.ports["P1"].ptype == "test"
def test_data_to_ports_hierarchical() -> None:
lib = Library()
# Child has port data in labels
child = Pattern()
layer = (10, 0)
child.label(layer=layer, string="A:type1 0", offset=(5, 0))
lib["child"] = child
# Parent references child
parent = Pattern()
parent.ref("child", offset=(100, 100), rotation=numpy.pi / 2)
# Read ports hierarchically (max_depth > 0)
data_to_ports([layer], lib, parent, max_depth=1)
# child port A (5,0) rot 0
# transformed by parent ref: rot pi/2, trans (100, 100)
# (5,0) rot pi/2 -> (0, 5)
# (0, 5) + (100, 100) = (100, 105)
# rot 0 + pi/2 = pi/2
assert "A" in parent.ports
assert_allclose(parent.ports["A"].offset, [100, 105], atol=1e-10)
assert parent.ports["A"].rotation is not None
assert_allclose(parent.ports["A"].rotation, numpy.pi / 2, atol=1e-10)
def test_data_to_ports_hierarchical_scaled_ref() -> None:
lib = Library()
child = Pattern()
layer = (10, 0)
child.label(layer=layer, string="A:type1 0", offset=(5, 0))
lib["child"] = child
parent = Pattern()
parent.ref("child", offset=(100, 100), rotation=numpy.pi / 2, scale=2)
data_to_ports([layer], lib, parent, max_depth=1)
assert "A" in parent.ports
assert_allclose(parent.ports["A"].offset, [100, 110], atol=1e-10)
assert parent.ports["A"].rotation is not None
assert_allclose(parent.ports["A"].rotation, numpy.pi / 2, atol=1e-10)
def test_data_to_ports_hierarchical_repeated_ref_warns_and_keeps_best_effort(
caplog: pytest.LogCaptureFixture,
) -> None:
lib = Library()
child = Pattern()
layer = (10, 0)
child.label(layer=layer, string="A:type1 0", offset=(5, 0))
lib["child"] = child
parent = Pattern()
parent.ref("child", repetition=Grid(a_vector=(100, 0), a_count=3))
caplog.set_level("WARNING")
data_to_ports([layer], lib, parent, max_depth=1)
assert "A" in parent.ports
assert_allclose(parent.ports["A"].offset, [5, 0], atol=1e-10)
assert any("importing only the base instance ports" in record.message for record in caplog.records)
def test_data_to_ports_hierarchical_collision_is_atomic() -> None:
lib = Library()
child = Pattern()
layer = (10, 0)
child.label(layer=layer, string="A:type1 0", offset=(5, 0))
lib["child"] = child
parent = Pattern()
parent.ref("child", offset=(0, 0))
parent.ref("child", offset=(10, 0))
with pytest.raises(PortError, match="Device ports conflict with existing ports"):
data_to_ports([layer], lib, parent, max_depth=1)
assert not parent.ports
def test_data_to_ports_flat_bad_angle_warns_and_skips(
caplog: pytest.LogCaptureFixture,
) -> None:
layer = (10, 0)
pat = Pattern()
pat.label(layer=layer, string="A:type1 nope", offset=(5, 0))
caplog.set_level("WARNING")
data_to_ports([layer], {}, pat)
assert not pat.ports
assert any('bad angle' in record.message for record in caplog.records)
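The label format these tests rely on appears to be "NAME:PTYPE DEG", with the rotation stored in degrees. A sketch of a parser for that format (inferred from the "P1:test 90" round trip above, not from the ports2data source):

```python
import re
import numpy

def parse_port_label(string: str):
    """Sketch (assumption): parse "NAME:PTYPE DEG" into (name, ptype, rotation
    in radians), returning None when the angle does not parse."""
    m = re.match(r'([^:]+):(\S+)\s+(\S+)', string)
    if m is None:
        return None
    name, ptype, angle_str = m.groups()
    try:
        rotation = float(angle_str) * numpy.pi / 180
    except ValueError:
        return None  # the "bad angle" case is skipped (with a warning)
    return name, ptype, rotation

parsed = parse_port_label('P1:test 90')
assert parsed is not None and parsed[:2] == ('P1', 'test')
assert numpy.isclose(parsed[2], numpy.pi / 2)
assert parse_port_label('A:type1 nope') is None
```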

(new file, 97 lines)
import numpy
from numpy import pi
from numpy.testing import assert_allclose
from ..shapes import Arc, Circle, Ellipse, Path, Text
def test_circle_raw_constructor_matches_public() -> None:
raw = Circle._from_raw(
radius=5.0,
offset=numpy.array([1.0, 2.0]),
annotations={'1': ['circle']},
)
public = Circle(
radius=5.0,
offset=(1.0, 2.0),
annotations={'1': ['circle']},
)
assert raw == public
def test_ellipse_raw_constructor_matches_public() -> None:
raw = Ellipse._from_raw(
radii=numpy.array([3.0, 5.0]),
offset=numpy.array([1.0, 2.0]),
rotation=5 * pi / 2,
annotations={'2': ['ellipse']},
)
public = Ellipse(
radii=(3.0, 5.0),
offset=(1.0, 2.0),
rotation=5 * pi / 2,
annotations={'2': ['ellipse']},
)
assert raw == public
def test_arc_raw_constructor_matches_public() -> None:
raw = Arc._from_raw(
radii=numpy.array([10.0, 6.0]),
angles=numpy.array([0.0, pi / 2]),
width=2.0,
offset=numpy.array([1.0, 2.0]),
rotation=5 * pi / 2,
annotations={'3': ['arc']},
)
public = Arc(
radii=(10.0, 6.0),
angles=(0.0, pi / 2),
width=2.0,
offset=(1.0, 2.0),
rotation=5 * pi / 2,
annotations={'3': ['arc']},
)
assert raw == public
def test_path_raw_constructor_matches_public() -> None:
raw = Path._from_raw(
vertices=numpy.array([[0.0, 0.0], [10.0, 0.0], [10.0, 5.0]]),
width=2.0,
cap=Path.Cap.SquareCustom,
cap_extensions=numpy.array([1.0, 3.0]),
annotations={'4': ['path']},
)
public = Path(
vertices=((0.0, 0.0), (10.0, 0.0), (10.0, 5.0)),
width=2.0,
cap=Path.Cap.SquareCustom,
cap_extensions=(1.0, 3.0),
annotations={'4': ['path']},
)
assert raw == public
assert raw.cap_extensions is not None
assert_allclose(raw.cap_extensions, [1.0, 3.0])
def test_text_raw_constructor_matches_public() -> None:
raw = Text._from_raw(
string='RAW',
height=12.0,
font_path='font.otf',
offset=numpy.array([1.0, 2.0]),
rotation=5 * pi / 2,
mirrored=True,
annotations={'5': ['text']},
)
public = Text(
string='RAW',
height=12.0,
font_path='font.otf',
offset=(1.0, 2.0),
rotation=5 * pi / 2,
mirrored=True,
annotations={'5': ['text']},
)
assert raw == public

(new file, 70 lines)
import copy
import numpy
import pytest
from numpy.testing import assert_allclose, assert_equal
from ..error import PatternError
from ..shapes import Polygon, RectCollection
def test_rect_collection_init_and_to_polygons() -> None:
rects = RectCollection([[10, 10, 12, 12], [0, 0, 5, 5]])
assert_equal(rects.rects, [[0, 0, 5, 5], [10, 10, 12, 12]])
polys = rects.to_polygons()
assert len(polys) == 2
assert all(isinstance(poly, Polygon) for poly in polys)
assert_equal(polys[0].vertices, [[0, 0], [0, 5], [5, 5], [5, 0]])
def test_rect_collection_rejects_invalid_rects() -> None:
with pytest.raises(PatternError):
RectCollection([[0, 0, 1]])
with pytest.raises(PatternError):
RectCollection([[5, 0, 1, 2]])
with pytest.raises(PatternError):
RectCollection([[0, 5, 1, 2]])
def test_rect_collection_raw_constructor_matches_public() -> None:
raw = RectCollection._from_raw(
rects=numpy.array([[10.0, 10.0, 12.0, 12.0], [0.0, 0.0, 5.0, 5.0]]),
annotations={'1': ['rects']},
)
public = RectCollection(
[[0, 0, 5, 5], [10, 10, 12, 12]],
annotations={'1': ['rects']},
)
assert raw == public
assert_equal(raw.get_bounds_single(), [[0, 0], [12, 12]])
def test_rect_collection_manhattan_transforms() -> None:
rects = RectCollection([[0, 0, 2, 4], [10, 20, 12, 22]])
mirrored = copy.deepcopy(rects).mirror(1)
assert_equal(mirrored.rects, [[-2, 0, 0, 4], [-12, 20, -10, 22]])
scaled = copy.deepcopy(rects).scale_by(-2)
assert_equal(scaled.rects, [[-4, -8, 0, 0], [-24, -44, -20, -40]])
rotated = copy.deepcopy(rects).rotate(numpy.pi / 2)
assert_equal(rotated.rects, [[-4, 0, 0, 2], [-22, 10, -20, 12]])
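The Manhattan-rotation bookkeeping this test exercises can be checked independently: rotating an `(xmin, ymin, xmax, ymax)` rect by +90 degrees maps each corner `(x, y)` to `(-y, x)`, after which the corners must be re-sorted back into min/max form. A plain-Python sketch of that step:

```python
def rotate_rect_90(rect):
    """Rotate an (xmin, ymin, xmax, ymax) rect by +90 degrees about the
    origin and re-normalize the corner order."""
    xmin, ymin, xmax, ymax = rect
    # (x, y) -> (-y, x) for each corner, then sort back into min/max form
    xs = sorted((-ymin, -ymax))
    ys = sorted((xmin, xmax))
    return [xs[0], ys[0], xs[1], ys[1]]

# Matches the rotated.rects assertion above
assert rotate_rect_90([0, 0, 2, 4]) == [-4, 0, 0, 2]
assert rotate_rect_90([10, 20, 12, 22]) == [-22, 10, -20, 12]
```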
def test_rect_collection_non_manhattan_rotation_raises() -> None:
rects = RectCollection([[0, 0, 2, 4]])
with pytest.raises(PatternError, match='Manhattan rotations'):
rects.rotate(numpy.pi / 4)
def test_rect_collection_normalized_form_rebuild_is_independent() -> None:
rects = RectCollection([[0, 0, 2, 4], [10, 20, 12, 22]])
_intrinsic, extrinsic, rebuild = rects.normalized_form(2)
clone = rebuild()
clone.rects[:] = [[1, 1, 2, 2], [3, 3, 4, 4]]
assert_allclose(extrinsic[0], [6, 11.5])
assert_equal(rects.rects, [[0, 0, 2, 4], [10, 20, 12, 22]])

masque/test/test_ref.py (new file, 111 lines)
from typing import cast, TYPE_CHECKING
import pytest
from numpy.testing import assert_equal, assert_allclose
from numpy import pi
from ..error import MasqueError
from ..pattern import Pattern
from ..ref import Ref
from ..repetition import Grid
if TYPE_CHECKING:
from ..shapes import Polygon
def test_ref_init() -> None:
ref = Ref(offset=(10, 20), rotation=pi / 4, mirrored=True, scale=2.0)
assert_equal(ref.offset, [10, 20])
assert ref.rotation == pi / 4
assert ref.mirrored is True
assert ref.scale == 2.0
def test_ref_as_pattern() -> None:
sub_pat = Pattern()
sub_pat.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]])
ref = Ref(offset=(10, 10), rotation=pi / 2, scale=2.0)
transformed_pat = ref.as_pattern(sub_pat)
# Check transformed shape
shape = cast("Polygon", transformed_pat.shapes[(1, 0)][0])
# ref.as_pattern deepcopies sub_pat then applies transformations:
# 1. pattern.scale_by(2) -> vertices [[0,0], [2,0], [0,2]]
# 2. pattern.rotate_around((0,0), pi/2) -> vertices [[0,0], [0,2], [-2,0]]
# 3. pattern.translate_elements((10,10)) -> vertices [[10,10], [10,12], [8,10]]
assert_allclose(shape.vertices, [[10, 10], [10, 12], [8, 10]], atol=1e-10)
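The three-step scale/rotate/translate arithmetic spelled out in the comments can be verified with plain numpy, without masque:

```python
import numpy

def rot2d(theta: float) -> numpy.ndarray:
    """Standard 2D rotation matrix."""
    return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
                        [numpy.sin(theta),  numpy.cos(theta)]])

verts = numpy.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
verts = verts * 2.0                        # scale_by(2)
verts = verts @ rot2d(numpy.pi / 2).T      # rotate_around((0, 0), pi/2)
verts = verts + numpy.array([10.0, 10.0])  # translate_elements((10, 10))
assert numpy.allclose(verts, [[10, 10], [10, 12], [8, 10]], atol=1e-10)
```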
def test_ref_with_repetition() -> None:
sub_pat = Pattern()
sub_pat.polygon((1, 0), vertices=[[0, 0], [1, 0], [0, 1]])
rep = Grid(a_vector=(10, 0), b_vector=(0, 10), a_count=2, b_count=2)
ref = Ref(repetition=rep)
repeated_pat = ref.as_pattern(sub_pat)
# Should have 4 shapes
assert len(repeated_pat.shapes[(1, 0)]) == 4
first_verts = sorted([tuple(cast("Polygon", s).vertices[0]) for s in repeated_pat.shapes[(1, 0)]])
assert first_verts == [(0.0, 0.0), (0.0, 10.0), (10.0, 0.0), (10.0, 10.0)]
def test_ref_get_bounds() -> None:
sub_pat = Pattern()
sub_pat.polygon((1, 0), vertices=[[0, 0], [5, 0], [0, 5]])
ref = Ref(offset=(10, 10), scale=2.0)
bounds = ref.get_bounds_single(sub_pat)
# sub_pat bounds [[0,0], [5,5]]
# scaled [[0,0], [10,10]]
# translated [[10,10], [20,20]]
assert_equal(bounds, [[10, 10], [20, 20]])
def test_ref_get_bounds_single_ignores_repetition_for_non_manhattan_rotation() -> None:
sub_pat = Pattern()
sub_pat.rect((1, 0), xmin=0, xmax=1, ymin=0, ymax=2)
rep = Grid(a_vector=(5, 0), b_vector=(0, 7), a_count=3, b_count=2)
ref = Ref(offset=(10, 20), rotation=pi / 4, repetition=rep)
bounds = ref.get_bounds_single(sub_pat)
repeated_bounds = ref.get_bounds(sub_pat)
assert bounds is not None
assert repeated_bounds is not None
assert repeated_bounds[1, 0] > bounds[1, 0]
assert repeated_bounds[1, 1] > bounds[1, 1]
def test_ref_copy() -> None:
ref1 = Ref(offset=(1, 2), rotation=0.5, annotations={"a": [1]})
ref2 = ref1.copy()
assert ref1 == ref2
assert ref1 is not ref2
ref2.offset[0] = 100
assert ref1.offset[0] == 1
def test_ref_rejects_nonpositive_scale() -> None:
with pytest.raises(MasqueError, match='Scale must be positive'):
Ref(scale=0)
with pytest.raises(MasqueError, match='Scale must be positive'):
Ref(scale=-1)
def test_ref_scale_by_rejects_nonpositive_scale() -> None:
ref = Ref(scale=2.0)
with pytest.raises(MasqueError, match='Scale must be positive'):
ref.scale_by(-1)
def test_ref_eq_unrelated_objects_is_false() -> None:
ref = Ref()
assert not (ref == None)
assert not (ref == object())

(new file, 199 lines)
import pytest
from typing import cast, TYPE_CHECKING
from numpy.testing import assert_allclose
from numpy import pi
from ..builder import Pather
from ..builder.tools import PathTool
from ..library import Library
from ..ports import Port
if TYPE_CHECKING:
from ..shapes import Path
@pytest.fixture
def rpather_setup() -> tuple[Pather, PathTool, Library]:
lib = Library()
tool = PathTool(layer=(1, 0), width=2, ptype="wire")
rp = Pather(lib, tools=tool, auto_render=False)
rp.ports["start"] = Port((0, 0), pi / 2, ptype="wire")
return rp, tool, lib
def test_renderpather_basic(rpather_setup: tuple[Pather, PathTool, Library]) -> None:
rp, tool, lib = rpather_setup
# Plan two segments
rp.at("start").straight(10).straight(10)
# Before rendering, no shapes in pattern
assert not rp.pattern.has_shapes()
assert len(rp.paths["start"]) == 2
# Render
rp.render()
assert rp.pattern.has_shapes()
assert len(rp.pattern.shapes[(1, 0)]) == 1
# Path vertices should be (0,0), (0,-10), (0,-20):
# PathTool.render for opcode L applies rotation_matrix_2d(port_rot + pi),
# so with start_port rot pi/2, pi/2 + pi = 3pi/2 (a 270 deg transform)
# and (10, 0) rotated by 3pi/2 -> (0, -10).
path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0])
assert len(path_shape.vertices) == 3
assert_allclose(path_shape.vertices, [[0, 0], [0, -10], [0, -20]], atol=1e-10)
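The rotation applied per segment can be reproduced directly. The helper below mirrors the `rotation_matrix_2d` named in the comments (an assumed signature; only the standard rotation-matrix definition is used):

```python
import numpy

def rotation_matrix_2d(theta: float) -> numpy.ndarray:
    """Standard 2D rotation matrix (assumed equivalent to masque's helper)."""
    return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
                        [numpy.sin(theta),  numpy.cos(theta)]])

port_rot = numpy.pi / 2
# A length-10 straight segment steps by (10, 0) rotated by port_rot + pi
step = rotation_matrix_2d(port_rot + numpy.pi) @ numpy.array([10.0, 0.0])
assert numpy.allclose(step, [0.0, -10.0], atol=1e-10)
```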
def test_renderpather_bend(rpather_setup: tuple[Pather, PathTool, Library]) -> None:
rp, tool, lib = rpather_setup
# Plan straight then bend
rp.at("start").straight(10).cw(10)
rp.render()
path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0])
# Path vertices:
# 1. Start (0,0)
# 2. Straight end: (0, -10)
# 3. Bend end: (-1, -20)
# PathTool.planL(ccw=False, length=10) returns data=[10, -1]
# start_port for 2nd segment is at (0, -10) with rotation pi/2
# dxy = rot(pi/2 + pi) @ (10, 0) = (0, -10). So vertex at (0, -20).
# and final end_port.offset is (-1, -20).
assert len(path_shape.vertices) == 4
assert_allclose(path_shape.vertices, [[0, 0], [0, -10], [0, -20], [-1, -20]], atol=1e-10)
def test_renderpather_jog_uses_native_pathtool_planS(rpather_setup: tuple[Pather, PathTool, Library]) -> None:
rp, tool, lib = rpather_setup
rp.at("start").jog(4, length=10)
assert len(rp.paths["start"]) == 1
assert rp.paths["start"][0].opcode == "S"
rp.render()
path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0])
# Native PathTool S-bends place the jog width/2 before the route end.
assert_allclose(path_shape.vertices, [[0, 0], [0, -9], [4, -9], [4, -10]], atol=1e-10)
assert_allclose(rp.ports["start"].offset, [4, -10], atol=1e-10)
def test_renderpather_mirror_preserves_planned_bend_geometry(rpather_setup: tuple[Pather, PathTool, Library]) -> None:
rp, tool, lib = rpather_setup
rp.at("start").straight(10).cw(10)
rp.mirror(0)
rp.render()
path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0])
assert_allclose(path_shape.vertices, [[0, 0], [0, 10], [0, 20], [-1, 20]], atol=1e-10)
def test_renderpather_retool(rpather_setup: tuple[Pather, PathTool, Library]) -> None:
rp, tool1, lib = rpather_setup
tool2 = PathTool(layer=(2, 0), width=4, ptype="wire")
rp.at("start").straight(10)
rp.retool(tool2, keys=["start"])
rp.at("start").straight(10)
rp.render()
# Different tools should cause different batches/shapes
assert len(rp.pattern.shapes[(1, 0)]) == 1
assert len(rp.pattern.shapes[(2, 0)]) == 1
def test_portpather_translate_only_affects_future_steps(rpather_setup: tuple[Pather, PathTool, Library]) -> None:
rp, tool, lib = rpather_setup
pp = rp.at("start")
pp.straight(10)
pp.translate((5, 0))
pp.straight(10)
rp.render()
shapes = rp.pattern.shapes[(1, 0)]
assert len(shapes) == 2
assert_allclose(cast("Path", shapes[0]).vertices, [[0, 0], [0, -10]], atol=1e-10)
assert_allclose(cast("Path", shapes[1]).vertices, [[5, -10], [5, -20]], atol=1e-10)
assert_allclose(rp.ports["start"].offset, [5, -20], atol=1e-10)
def test_renderpather_dead_ports() -> None:
lib = Library()
tool = PathTool(layer=(1, 0), width=1)
rp = Pather(lib, ports={"in": Port((0, 0), 0)}, tools=tool, auto_render=False)
rp.set_dead()
# Impossible path
rp.straight("in", -10)
# port_rot=0, forward is -x. path(-10) means moving -10 in -x direction -> +10 in x.
assert_allclose(rp.ports["in"].offset, [10, 0], atol=1e-10)
# Verify no render steps were added
assert len(rp.paths["in"]) == 0
# Verify no geometry
rp.render()
assert not rp.pattern.has_shapes()
def test_renderpather_rename_port(rpather_setup: tuple[Pather, PathTool, Library]) -> None:
rp, tool, lib = rpather_setup
rp.at("start").straight(10)
# Rename port while path is planned
rp.rename_ports({"start": "new_start"})
# Continue path on new name
rp.at("new_start").straight(10)
assert "start" not in rp.paths
assert len(rp.paths["new_start"]) == 2
rp.render()
assert rp.pattern.has_shapes()
assert len(rp.pattern.shapes[(1, 0)]) == 1
# Total length 20. start_port rot pi/2 -> 270 deg transform.
# Vertices (0,0), (0,-10), (0,-20)
path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0])
assert_allclose(path_shape.vertices, [[0, 0], [0, -10], [0, -20]], atol=1e-10)
assert "new_start" in rp.ports
assert_allclose(rp.ports["new_start"].offset, [0, -20], atol=1e-10)
def test_renderpather_drop_keeps_pending_geometry_without_port(rpather_setup: tuple[Pather, PathTool, Library]) -> None:
rp, tool, lib = rpather_setup
rp.at("start").straight(10).drop()
assert "start" not in rp.ports
assert len(rp.paths["start"]) == 1
rp.render()
assert rp.pattern.has_shapes()
assert "start" not in rp.ports
path_shape = cast("Path", rp.pattern.shapes[(1, 0)][0])
assert_allclose(path_shape.vertices, [[0, 0], [0, -10]], atol=1e-10)
def test_pathtool_traceL_bend_geometry_matches_ports() -> None:
tool = PathTool(layer=(1, 0), width=2, ptype="wire")
tree = tool.traceL(True, 10)
pat = tree.top_pattern()
path_shape = cast("Path", pat.shapes[(1, 0)][0])
assert_allclose(path_shape.vertices, [[0, 0], [10, 0], [10, 1]], atol=1e-10)
assert_allclose(pat.ports["B"].offset, [10, 1], atol=1e-10)
def test_pathtool_traceS_geometry_matches_ports() -> None:
tool = PathTool(layer=(1, 0), width=2, ptype="wire")
tree = tool.traceS(10, 4)
pat = tree.top_pattern()
path_shape = cast("Path", pat.shapes[(1, 0)][0])
assert_allclose(path_shape.vertices, [[0, 0], [9, 0], [9, 4], [10, 4]], atol=1e-10)
assert_allclose(pat.ports["B"].offset, [10, 4], atol=1e-10)
assert_allclose(pat.ports["B"].rotation, pi, atol=1e-10)

(new file, 91 lines)
import pytest
from numpy.testing import assert_equal, assert_allclose
from numpy import pi
from ..repetition import Grid, Arbitrary
from ..error import PatternError
def test_grid_displacements() -> None:
# 2x2 grid
grid = Grid(a_vector=(10, 0), b_vector=(0, 5), a_count=2, b_count=2)
disps = sorted([tuple(d) for d in grid.displacements])
assert disps == [(0.0, 0.0), (0.0, 5.0), (10.0, 0.0), (10.0, 5.0)]
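The expected displacement set is just the lattice `i*a + j*b` over the two counts; a plain-numpy check of that arithmetic:

```python
import numpy

a = numpy.array([10.0, 0.0])
b = numpy.array([0.0, 5.0])
# All lattice points i*a + j*b for a 2x2 grid
disps = sorted(tuple(i * a + j * b) for i in range(2) for j in range(2))
assert disps == [(0.0, 0.0), (0.0, 5.0), (10.0, 0.0), (10.0, 5.0)]
```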
def test_grid_1d() -> None:
grid = Grid(a_vector=(10, 0), a_count=3)
disps = sorted([tuple(d) for d in grid.displacements])
assert disps == [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
def test_grid_rotate() -> None:
grid = Grid(a_vector=(10, 0), a_count=2)
grid.rotate(pi / 2)
assert_allclose(grid.a_vector, [0, 10], atol=1e-10)
def test_grid_get_bounds() -> None:
grid = Grid(a_vector=(10, 0), b_vector=(0, 5), a_count=2, b_count=2)
bounds = grid.get_bounds()
assert_equal(bounds, [[0, 0], [10, 5]])
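The grid tests above imply that displacements are every combination `i * a_vector + j * b_vector`. A minimal sketch of that generation rule (a hypothetical reconstruction, not masque's actual `Grid` implementation):

```python
import numpy as np

# Hypothetical reconstruction: a Grid's displacements are every combination
# i * a_vector + j * b_vector for i < a_count, j < b_count.
a_vector = np.array([10.0, 0.0])
b_vector = np.array([0.0, 5.0])
a_count, b_count = 2, 2

ii, jj = np.meshgrid(np.arange(a_count), np.arange(b_count), indexing="ij")
displacements = (ii[..., None] * a_vector + jj[..., None] * b_vector).reshape(-1, 2)
```

Sorting the resulting rows reproduces the expectation in `test_grid_displacements`.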
def test_arbitrary_displacements() -> None:
pts = [[0, 0], [10, 20], [-5, 30]]
arb = Arbitrary(pts)
# displacements.setter may reorder entries, so check membership rather than order
disps = arb.displacements
assert len(disps) == 3
assert any((disps == [0, 0]).all(axis=1))
assert any((disps == [10, 20]).all(axis=1))
assert any((disps == [-5, 30]).all(axis=1))
def test_arbitrary_transform() -> None:
arb = Arbitrary([[10, 0]])
arb.rotate(pi / 2)
assert_allclose(arb.displacements, [[0, 10]], atol=1e-10)
arb.mirror(0)  # axis=0 flips the y coordinates: displacements[:, 1] *= -1
assert_allclose(arb.displacements, [[0, -10]], atol=1e-10)
def test_arbitrary_empty_repetition_is_allowed() -> None:
arb = Arbitrary([])
assert arb.displacements.shape == (0, 2)
assert arb.get_bounds() is None
def test_arbitrary_rejects_non_nx2_displacements() -> None:
for displacements in ([[1], [2]], [[1, 2, 3]], [1, 2, 3]):
with pytest.raises(PatternError, match='displacements must be convertible to an Nx2 ndarray'):
Arbitrary(displacements)
def test_grid_count_setters_reject_nonpositive_values() -> None:
for attr, value, message in (
('a_count', 0, 'a_count'),
('a_count', -1, 'a_count'),
('b_count', 0, 'b_count'),
('b_count', -1, 'b_count'),
):
grid = Grid(a_vector=(10, 0), b_vector=(0, 5), a_count=2, b_count=2)
with pytest.raises(PatternError, match=message):
setattr(grid, attr, value)
def test_repetition_less_equal_includes_equality() -> None:
grid_a = Grid(a_vector=(10, 0), a_count=2)
grid_b = Grid(a_vector=(10, 0), a_count=2)
assert grid_a == grid_b
assert grid_a <= grid_b
assert grid_a >= grid_b
arb_a = Arbitrary([[0, 0], [1, 0]])
arb_b = Arbitrary([[0, 0], [1, 0]])
assert arb_a == arb_b
assert arb_a <= arb_b
assert arb_a >= arb_b


@ -0,0 +1,133 @@
from typing import cast
import numpy as np
from numpy.testing import assert_allclose
from ..pattern import Pattern
from ..ref import Ref
from ..label import Label
from ..repetition import Grid
def test_ref_rotate_intrinsic() -> None:
# Intrinsic rotate() should NOT affect repetition
rep = Grid(a_vector=(10, 0), a_count=2)
ref = Ref(repetition=rep)
ref.rotate(np.pi/2)
assert_allclose(ref.rotation, np.pi/2, atol=1e-10)
# Grid vector should still be (10, 0)
assert ref.repetition is not None
assert_allclose(cast('Grid', ref.repetition).a_vector, [10, 0], atol=1e-10)
def test_ref_rotate_around_extrinsic() -> None:
# Extrinsic rotate_around() SHOULD affect repetition
rep = Grid(a_vector=(10, 0), a_count=2)
ref = Ref(repetition=rep)
ref.rotate_around((0, 0), np.pi/2)
assert_allclose(ref.rotation, np.pi/2, atol=1e-10)
# Grid vector should be rotated to (0, 10)
assert ref.repetition is not None
assert_allclose(cast('Grid', ref.repetition).a_vector, [0, 10], atol=1e-10)
def test_pattern_rotate_around_extrinsic() -> None:
# Pattern.rotate_around() SHOULD affect repetition of its elements
rep = Grid(a_vector=(10, 0), a_count=2)
ref = Ref(repetition=rep)
pat = Pattern()
pat.refs['cell'].append(ref)
pat.rotate_around((0, 0), np.pi/2)
# Check the ref inside the pattern
ref_in_pat = pat.refs['cell'][0]
assert_allclose(ref_in_pat.rotation, np.pi/2, atol=1e-10)
# Grid vector should be rotated to (0, 10)
assert ref_in_pat.repetition is not None
assert_allclose(cast('Grid', ref_in_pat.repetition).a_vector, [0, 10], atol=1e-10)
def test_label_rotate_around_extrinsic() -> None:
# Extrinsic rotate_around() SHOULD affect repetition of labels
rep = Grid(a_vector=(10, 0), a_count=2)
lbl = Label("test", repetition=rep, offset=(5, 0))
lbl.rotate_around((0, 0), np.pi/2)
# Label offset should be (0, 5)
assert_allclose(lbl.offset, [0, 5], atol=1e-10)
# Grid vector should be rotated to (0, 10)
assert lbl.repetition is not None
assert_allclose(cast('Grid', lbl.repetition).a_vector, [0, 10], atol=1e-10)
def test_pattern_rotate_elements_intrinsic() -> None:
# rotate_elements() should NOT affect repetition
rep = Grid(a_vector=(10, 0), a_count=2)
ref = Ref(repetition=rep)
pat = Pattern()
pat.refs['cell'].append(ref)
pat.rotate_elements(np.pi/2)
ref_in_pat = pat.refs['cell'][0]
assert_allclose(ref_in_pat.rotation, np.pi/2, atol=1e-10)
# Grid vector should still be (10, 0)
assert ref_in_pat.repetition is not None
assert_allclose(cast('Grid', ref_in_pat.repetition).a_vector, [10, 0], atol=1e-10)
def test_pattern_rotate_element_centers_extrinsic() -> None:
# rotate_element_centers() SHOULD affect repetition and offset
rep = Grid(a_vector=(10, 0), a_count=2)
ref = Ref(repetition=rep, offset=(5, 0))
pat = Pattern()
pat.refs['cell'].append(ref)
pat.rotate_element_centers(np.pi/2)
ref_in_pat = pat.refs['cell'][0]
# Offset should be (0, 5)
assert_allclose(ref_in_pat.offset, [0, 5], atol=1e-10)
# Grid vector should be rotated to (0, 10)
assert ref_in_pat.repetition is not None
assert_allclose(cast('Grid', ref_in_pat.repetition).a_vector, [0, 10], atol=1e-10)
# Ref rotation should NOT be changed
assert_allclose(ref_in_pat.rotation, 0, atol=1e-10)
def test_pattern_mirror_elements_intrinsic() -> None:
# mirror_elements() should NOT affect repetition or offset
rep = Grid(a_vector=(10, 5), a_count=2)
ref = Ref(repetition=rep, offset=(5, 2))
pat = Pattern()
pat.refs['cell'].append(ref)
pat.mirror_elements(axis=0) # Mirror across x (flip y)
ref_in_pat = pat.refs['cell'][0]
assert ref_in_pat.mirrored is True
# Repetition and offset should be unchanged
assert ref_in_pat.repetition is not None
assert_allclose(cast('Grid', ref_in_pat.repetition).a_vector, [10, 5], atol=1e-10)
assert_allclose(ref_in_pat.offset, [5, 2], atol=1e-10)
def test_pattern_mirror_element_centers_extrinsic() -> None:
# mirror_element_centers() SHOULD affect repetition and offset
rep = Grid(a_vector=(10, 5), a_count=2)
ref = Ref(repetition=rep, offset=(5, 2))
pat = Pattern()
pat.refs['cell'].append(ref)
pat.mirror_element_centers(axis=0) # Mirror across x (flip y)
ref_in_pat = pat.refs['cell'][0]
# Offset should be (5, -2)
assert_allclose(ref_in_pat.offset, [5, -2], atol=1e-10)
# Grid vector should be (10, -5)
assert ref_in_pat.repetition is not None
assert_allclose(cast('Grid', ref_in_pat.repetition).a_vector, [10, -5], atol=1e-10)
# Ref mirrored state should NOT be changed
assert ref_in_pat.mirrored is False
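The intrinsic/extrinsic distinction exercised above comes down to which quantities an axis flip touches. A minimal sketch of the extrinsic rule these tests assume (hypothetical helper, not masque's implementation), where `axis=0` mirrors across the x axis and negates y components of both the offset and the repetition vector:

```python
import numpy as np

def mirror_extrinsic(offset, a_vector, axis):
    # Flip both the element offset and its repetition vector across an axis.
    # axis=0 mirrors across the x axis (y coordinates are negated).
    offset = np.array(offset, dtype=float)
    a_vector = np.array(a_vector, dtype=float)
    offset[1 - axis] *= -1
    a_vector[1 - axis] *= -1
    return offset, a_vector

off, vec = mirror_extrinsic([5, 2], [10, 5], axis=0)
# matches the expectations in test_pattern_mirror_element_centers_extrinsic
```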


@ -0,0 +1,244 @@
from pathlib import Path
import pytest
import numpy
from numpy.testing import assert_equal, assert_allclose
from numpy import pi
from ..shapes import Arc, Ellipse, Circle, Polygon, Path as MPath, Text, PolyCollection
from ..error import PatternError
# 1. Text shape tests
def test_text_to_polygons() -> None:
pytest.importorskip("freetype")
font_path = "/usr/share/fonts/truetype/dejavu/DejaVuMathTeXGyre.ttf"
if not Path(font_path).exists():
pytest.skip("Font file not found")
t = Text("Hi", height=10, font_path=font_path)
polys = t.to_polygons()
assert len(polys) > 0
assert all(isinstance(p, Polygon) for p in polys)
# Check horizontal advance: the 'H' and 'i' glyphs should occupy different x ranges.
# Each character renders as one or more polygons; compare the mean x per polygon.
char_x_means = [p.vertices[:, 0].mean() for p in polys]
assert len(set(char_x_means)) >= 2
def test_text_bounds_and_normalized_form() -> None:
pytest.importorskip("freetype")
font_path = "/usr/share/fonts/truetype/dejavu/DejaVuMathTeXGyre.ttf"
if not Path(font_path).exists():
pytest.skip("Font file not found")
text = Text("Hi", height=10, font_path=font_path)
_intrinsic, extrinsic, ctor = text.normalized_form(5)
normalized = ctor()
assert extrinsic[1] == 2
assert normalized.height == 5
bounds = text.get_bounds_single()
assert bounds is not None
assert numpy.isfinite(bounds).all()
assert numpy.all(bounds[1] > bounds[0])
def test_text_mirroring_affects_comparison() -> None:
text = Text("A", height=10, font_path="dummy.ttf")
mirrored = Text("A", height=10, font_path="dummy.ttf", mirrored=True)
assert text != mirrored
assert (text < mirrored) != (mirrored < text)
# 2. Manhattanization tests
def test_manhattanize() -> None:
pytest.importorskip("float_raster")
pytest.importorskip("skimage.measure")
# Diamond shape
poly = Polygon([[0, 5], [5, 10], [10, 5], [5, 0]])
grid = numpy.arange(0, 11, 1)
manhattan_polys = poly.manhattanize(grid, grid)
assert len(manhattan_polys) >= 1
for mp in manhattan_polys:
# Check that all edges are axis-aligned
dv = numpy.diff(mp.vertices, axis=0)
# For each segment, either dx or dy must be zero
assert numpy.all((dv[:, 0] == 0) | (dv[:, 1] == 0))
# 3. Comparison and Sorting tests
def test_shape_comparisons() -> None:
c1 = Circle(radius=10)
c2 = Circle(radius=20)
assert c1 < c2
assert not (c2 < c1)
p1 = Polygon([[0, 0], [10, 0], [10, 10]])
p2 = Polygon([[0, 0], [10, 0], [10, 11]]) # Different vertex
assert p1 < p2
# Different types
assert c1 < p1 or p1 < c1
assert (c1 < p1) != (p1 < c1)
# 4. Arc/Path Edge Cases
def test_arc_edge_cases() -> None:
# Wrapped arc (> 360 deg)
a = Arc(radii=(10, 10), angles=(0, 3 * pi), width=2)
a.to_polygons(num_vertices=64)
# Should basically be a ring
bounds = a.get_bounds_single()
assert_allclose(bounds, [[-11, -11], [11, 11]], atol=1e-10)
def test_rotated_ellipse_bounds_match_polygonized_geometry() -> None:
ellipse = Ellipse(radii=(10, 20), rotation=pi / 4, offset=(100, 200))
bounds = ellipse.get_bounds_single()
poly_bounds = ellipse.to_polygons(num_vertices=8192)[0].get_bounds_single()
assert_allclose(bounds, poly_bounds, atol=1e-3)
def test_rotated_arc_bounds_match_polygonized_geometry() -> None:
arc = Arc(radii=(10, 20), angles=(0, pi), width=2, rotation=pi / 4, offset=(100, 200))
bounds = arc.get_bounds_single()
poly_bounds = arc.to_polygons(num_vertices=8192)[0].get_bounds_single()
assert_allclose(bounds, poly_bounds, atol=1e-3)
def test_curve_polygonizers_clamp_large_max_arclen() -> None:
for shape in (
Circle(radius=10),
Ellipse(radii=(10, 20)),
Arc(radii=(10, 20), angles=(0, 1), width=2),
):
polys = shape.to_polygons(num_vertices=None, max_arclen=1e9)
assert len(polys) == 1
assert len(polys[0].vertices) >= 3
def test_arc_polygonization_rejects_nan_implied_arclen() -> None:
arc = Arc(radii=(10, 20), angles=(0, numpy.nan), width=2)
with pytest.raises(PatternError, match='valid max_arclen'):
arc.to_polygons(num_vertices=24)
def test_ellipse_integer_radii_scale_cleanly() -> None:
ellipse = Ellipse(radii=(10, 20))
ellipse.scale_by(0.5)
assert_allclose(ellipse.radii, [5, 10])
def test_arc_rejects_zero_radii_up_front() -> None:
with pytest.raises(PatternError, match='Radii must be positive'):
Arc(radii=(0, 5), angles=(0, 1), width=1)
with pytest.raises(PatternError, match='Radii must be positive'):
Arc(radii=(5, 0), angles=(0, 1), width=1)
with pytest.raises(PatternError, match='Radii must be positive'):
Arc(radii=(0, 0), angles=(0, 1), width=1)
def test_path_edge_cases() -> None:
# Zero-length segments
p = MPath(vertices=[[0, 0], [0, 0], [10, 0]], width=2)
polys = p.to_polygons()
assert len(polys) == 1
assert_equal(polys[0].get_bounds_single(), [[0, -1], [10, 1]])
# 5. PolyCollection with holes
def test_poly_collection_holes() -> None:
# Outer square plus an inner square "hole".
# PolyCollection has no explicit hole support; masque.shapes.Polygon is an
# implicitly-closed boundary, so holes are represented with multiple polygons
# (winding/clipping, e.g. pyclipper in connectivity.py, is handled downstream).
# Here we just verify that two polygons round-trip through to_polygons().
verts = [
[0, 0],
[10, 0],
[10, 10],
[0, 10], # Poly 1
[2, 2],
[2, 8],
[8, 8],
[8, 2], # Poly 2
]
offsets = [0, 4]
pc = PolyCollection(verts, offsets)
polys = pc.to_polygons()
assert len(polys) == 2
assert_equal(polys[0].vertices, [[0, 0], [10, 0], [10, 10], [0, 10]])
assert_equal(polys[1].vertices, [[2, 2], [2, 8], [8, 8], [8, 2]])
def test_poly_collection_constituent_empty() -> None:
# One triangle, a zero-length slice, then a square.
# PolyCollection does not validate per-polygon vertex counts up front;
# vertex_slices pairs each offset with the next one (or len(verts)),
# so offsets = [0, 3, 3] yields slices [0:3], [3:3], [3:7].
verts = [
[0, 0],
[1, 0],
[0, 1], # Tri
[10, 10],
[11, 10],
[11, 11],
[10, 11], # Square
]
offsets = [0, 3, 3]
pc = PolyCollection(verts, offsets)
# to_polygons() builds Polygon(vertices=vv) per slice; the empty [3:3] slice
# trips Polygon's vertex-count check.
with pytest.raises(PatternError):
pc.to_polygons()
def test_poly_collection_valid() -> None:
verts = [[0, 0], [1, 0], [0, 1], [10, 10], [11, 10], [11, 11], [10, 11]]
offsets = [0, 3]
pc = PolyCollection(verts, offsets)
assert len(pc.to_polygons()) == 2
def test_shape_sorting_is_stable() -> None:
shapes = [Circle(radius=20), Circle(radius=10), Polygon([[0, 0], [10, 0], [10, 10]]), Ellipse(radii=(5, 5))]
sorted_shapes = sorted(shapes)
assert len(sorted_shapes) == 4
# Verify sorting doesn't crash and is idempotent (stable ordering)
assert sorted(sorted_shapes) == sorted_shapes
def test_poly_collection_normalized_form_reconstruction_is_independent() -> None:
pc = PolyCollection([[0, 0], [1, 0], [0, 1]], [0])
_intrinsic, _extrinsic, rebuild = pc.normalized_form(1)
clone = rebuild()
clone.vertex_offsets[:] = [5]
assert_equal(pc.vertex_offsets, [0])
assert_equal(clone.vertex_offsets, [5])
def test_poly_collection_normalized_form_rebuilds_independent_clones() -> None:
pc = PolyCollection([[0, 0], [1, 0], [0, 1]], [0])
_intrinsic, _extrinsic, rebuild = pc.normalized_form(1)
first = rebuild()
second = rebuild()
first.vertex_offsets[:] = [7]
assert_equal(first.vertex_offsets, [7])
assert_equal(second.vertex_offsets, [0])
assert_equal(pc.vertex_offsets, [0])

masque/test/test_shapes.py (new file, 142 lines)

@ -0,0 +1,142 @@
import numpy
from numpy.testing import assert_equal, assert_allclose
from numpy import pi
from ..shapes import Arc, Ellipse, Circle, Polygon, PolyCollection
def test_poly_collection_init() -> None:
# Two squares: [[0,0], [1,0], [1,1], [0,1]] and [[10,10], [11,10], [11,11], [10,11]]
verts = [[0, 0], [1, 0], [1, 1], [0, 1], [10, 10], [11, 10], [11, 11], [10, 11]]
offsets = [0, 4]
pc = PolyCollection(vertex_lists=verts, vertex_offsets=offsets)
assert len(list(pc.polygon_vertices)) == 2
assert_equal(pc.get_bounds_single(), [[0, 0], [11, 11]])
def test_poly_collection_to_polygons() -> None:
verts = [[0, 0], [1, 0], [1, 1], [0, 1], [10, 10], [11, 10], [11, 11], [10, 11]]
offsets = [0, 4]
pc = PolyCollection(vertex_lists=verts, vertex_offsets=offsets)
polys = pc.to_polygons()
assert len(polys) == 2
assert_equal(polys[0].vertices, [[0, 0], [1, 0], [1, 1], [0, 1]])
assert_equal(polys[1].vertices, [[10, 10], [11, 10], [11, 11], [10, 11]])
def test_circle_init() -> None:
c = Circle(radius=10, offset=(5, 5))
assert c.radius == 10
assert_equal(c.offset, [5, 5])
def test_circle_to_polygons() -> None:
c = Circle(radius=10)
polys = c.to_polygons(num_vertices=32)
assert len(polys) == 1
assert isinstance(polys[0], Polygon)
# A circle with 32 vertices should have vertices distributed around (0,0)
bounds = polys[0].get_bounds_single()
assert_allclose(bounds, [[-10, -10], [10, 10]], atol=1e-10)
def test_ellipse_init() -> None:
e = Ellipse(radii=(10, 5), offset=(1, 2), rotation=pi / 4)
assert_equal(e.radii, [10, 5])
assert_equal(e.offset, [1, 2])
assert e.rotation == pi / 4
def test_ellipse_to_polygons() -> None:
e = Ellipse(radii=(10, 5))
polys = e.to_polygons(num_vertices=64)
assert len(polys) == 1
bounds = polys[0].get_bounds_single()
assert_allclose(bounds, [[-10, -5], [10, 5]], atol=1e-10)
def test_arc_init() -> None:
a = Arc(radii=(10, 10), angles=(0, pi / 2), width=2, offset=(0, 0))
assert_equal(a.radii, [10, 10])
assert_equal(a.angles, [0, pi / 2])
assert a.width == 2
def test_arc_to_polygons() -> None:
# Quarter circle arc
a = Arc(radii=(10, 10), angles=(0, pi / 2), width=2)
polys = a.to_polygons(num_vertices=32)
assert len(polys) == 1
# Outer radius 11, inner radius 9
# Quarter circle from 0 to 90 deg
bounds = polys[0].get_bounds_single()
# The arc is centered at (0, 0): the outer edge (radius 11) runs from (11, 0)
# to (0, 11) and the inner edge (radius 9) from (9, 0) to (0, 9),
# so x and y each range from 0 to 11.
assert_allclose(bounds, [[0, 0], [11, 11]], atol=1e-10)
def test_shape_mirror() -> None:
e = Ellipse(radii=(10, 5), offset=(10, 20), rotation=pi / 4)
e.mirror(0) # Mirror across x axis (axis 0): in-place relative to offset
assert_equal(e.offset, [10, 20])
# rotation was pi/4, mirrored(0) -> -pi/4 == 3pi/4 (mod pi)
assert_allclose(e.rotation, 3 * pi / 4, atol=1e-10)
a = Arc(radii=(10, 10), angles=(0, pi / 4), width=2, offset=(10, 20))
a.mirror(0)
assert_equal(a.offset, [10, 20])
# For Arc, mirror(0) negates rotation and angles
assert_allclose(a.angles, [0, -pi / 4], atol=1e-10)
def test_shape_flip_across() -> None:
e = Ellipse(radii=(10, 5), offset=(10, 20), rotation=pi / 4)
e.flip_across(axis=0) # Mirror across y=0: flips y-offset
assert_equal(e.offset, [10, -20])
# rotation also flips: -pi/4 == 3pi/4 (mod pi)
assert_allclose(e.rotation, 3 * pi / 4, atol=1e-10)
# Mirror across specific y
e = Ellipse(radii=(10, 5), offset=(10, 20))
e.flip_across(y=10) # Mirror across y=10
# y=20 mirrored across y=10 -> y=0
assert_equal(e.offset, [10, 0])
def test_shape_scale() -> None:
e = Ellipse(radii=(10, 5))
e.scale_by(2)
assert_equal(e.radii, [20, 10])
a = Arc(radii=(10, 5), angles=(0, pi), width=2)
a.scale_by(0.5)
assert_equal(a.radii, [5, 2.5])
assert a.width == 1
def test_shape_arclen() -> None:
# Test that max_arclen correctly limits segment lengths
# Ellipse
e = Ellipse(radii=(10, 5))
# Approximate perimeter is ~48.4
# With max_arclen=5, should have > 10 segments
polys = e.to_polygons(max_arclen=5)
v = polys[0].vertices
dist = numpy.sqrt(numpy.sum(numpy.diff(v, axis=0, append=v[:1]) ** 2, axis=1))
assert numpy.all(dist <= 5.000001)
assert len(v) > 10
# Arc
a = Arc(radii=(10, 10), angles=(0, pi / 2), width=2)
# Outer perimeter is 11 * pi/2 ~ 17.27
# Inner perimeter is 9 * pi/2 ~ 14.14
# With max_arclen=2, should have > 8 segments on outer edge
polys = a.to_polygons(max_arclen=2)
v = polys[0].vertices
# Arc polygons are closed, but contain both inner and outer edges and caps
# Let's just check that all segment lengths are within limit
dist = numpy.sqrt(numpy.sum(numpy.diff(v, axis=0, append=v[:1]) ** 2, axis=1))
assert numpy.all(dist <= 2.000001)

masque/test/test_svg.py (new file, 100 lines)

@ -0,0 +1,100 @@
from pathlib import Path
import xml.etree.ElementTree as ET
import numpy
import pytest
from numpy.testing import assert_allclose
pytest.importorskip("svgwrite")
from ..library import Library
from ..pattern import Pattern
from ..file import svg
SVG_NS = "{http://www.w3.org/2000/svg}"
XLINK_HREF = "{http://www.w3.org/1999/xlink}href"
def _child_transform(svg_path: Path) -> tuple[float, ...]:
root = ET.fromstring(svg_path.read_text())
for use in root.iter(f"{SVG_NS}use"):
if use.attrib.get(XLINK_HREF) == "#child":
raw = use.attrib["transform"]
assert raw.startswith("matrix(") and raw.endswith(")")
return tuple(float(value) for value in raw[7:-1].split())
raise AssertionError("No child reference found in SVG output")
def test_svg_ref_rotation_uses_correct_affine_transform(tmp_path: Path) -> None:
lib = Library()
child = Pattern()
child.polygon("1", vertices=[[0, 0], [1, 0], [0, 1]])
lib["child"] = child
top = Pattern()
top.ref("child", offset=(3, 4), rotation=numpy.pi / 2, scale=2)
lib["top"] = top
svg_path = tmp_path / "rotation.svg"
svg.writefile(lib, "top", str(svg_path))
assert_allclose(_child_transform(svg_path), (0, 2, -2, 0, 3, 4), atol=1e-10)
def test_svg_ref_mirroring_changes_affine_transform(tmp_path: Path) -> None:
base = Library()
child = Pattern()
child.polygon("1", vertices=[[0, 0], [1, 0], [0, 1]])
base["child"] = child
top_plain = Pattern()
top_plain.ref("child", offset=(3, 4), rotation=numpy.pi / 2, scale=2, mirrored=False)
base["plain"] = top_plain
plain_path = tmp_path / "plain.svg"
svg.writefile(base, "plain", str(plain_path))
plain_transform = _child_transform(plain_path)
mirrored = Library()
mirrored["child"] = child.deepcopy()
top_mirrored = Pattern()
top_mirrored.ref("child", offset=(3, 4), rotation=numpy.pi / 2, scale=2, mirrored=True)
mirrored["mirrored"] = top_mirrored
mirrored_path = tmp_path / "mirrored.svg"
svg.writefile(mirrored, "mirrored", str(mirrored_path))
mirrored_transform = _child_transform(mirrored_path)
assert_allclose(plain_transform, (0, 2, -2, 0, 3, 4), atol=1e-10)
assert_allclose(mirrored_transform, (0, 2, 2, 0, 3, 4), atol=1e-10)
def test_svg_uses_unique_ids_for_colliding_mangled_names(tmp_path: Path) -> None:
lib = Library()
first = Pattern()
first.polygon("1", vertices=[[0, 0], [1, 0], [0, 1]])
lib["a b"] = first
second = Pattern()
second.polygon("1", vertices=[[0, 0], [2, 0], [0, 2]])
lib["a-b"] = second
top = Pattern()
top.ref("a b")
top.ref("a-b", offset=(5, 0))
lib["top"] = top
svg_path = tmp_path / "colliding_ids.svg"
svg.writefile(lib, "top", str(svg_path))
root = ET.fromstring(svg_path.read_text())
ids = [group.attrib["id"] for group in root.iter(f"{SVG_NS}g")]
top_group = next(group for group in root.iter(f"{SVG_NS}g") if group.attrib["id"] == "top")
hrefs = [use.attrib[XLINK_HREF] for use in top_group.iter(f"{SVG_NS}use")]
assert len(set(ids)) == len(ids)
assert len(hrefs) == 2
assert len(set(hrefs)) == 2
assert all(href.startswith("#") for href in hrefs)
assert all(href[1:] in ids for href in hrefs)

masque/test/test_utils.py (new file, 192 lines)

@ -0,0 +1,192 @@
from pathlib import Path
import numpy
from numpy.testing import assert_equal, assert_allclose
from numpy import pi
import pytest
from ..utils import (
remove_duplicate_vertices, remove_colinear_vertices, poly_contains_points,
rotation_matrix_2d, apply_transforms, normalize_mirror, DeferredDict,
)
from ..file.utils import tmpfile
from ..utils.curves import bezier
from ..error import PatternError
def test_remove_duplicate_vertices() -> None:
# Closed path (default)
v = [[0, 0], [1, 1], [1, 1], [2, 2], [0, 0]]
v_clean = remove_duplicate_vertices(v, closed_path=True)
# The last [0,0] is a duplicate of the first [0,0] if closed_path=True
assert_equal(v_clean, [[0, 0], [1, 1], [2, 2]])
# Open path
v_clean_open = remove_duplicate_vertices(v, closed_path=False)
assert_equal(v_clean_open, [[0, 0], [1, 1], [2, 2], [0, 0]])
def test_remove_colinear_vertices() -> None:
v = [[0, 0], [1, 0], [2, 0], [2, 1], [2, 2], [1, 1], [0, 0]]
v_clean = remove_colinear_vertices(v, closed_path=True)
# [1, 0] is between [0, 0] and [2, 0]
# [2, 1] is between [2, 0] and [2, 2]
# [1, 1] is between [2, 2] and [0, 0]
assert_equal(v_clean, [[0, 0], [2, 0], [2, 2]])
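remove_colinear_vertices hinges on a colinearity test; the standard check uses the 2D cross product of successive edge vectors, which vanishes when three points lie on a line (a self-contained sketch, not masque's internal implementation):

```python
def is_colinear(p, q, r, tol=1e-12):
    # q lies on the line through p and r iff the 2D cross product
    # of (q - p) and (r - q) is (near) zero.
    (px, py), (qx, qy), (rx, ry) = p, q, r
    cross = (qx - px) * (ry - qy) - (qy - py) * (rx - qx)
    return abs(cross) < tol
```

For the vertices in the test above, `[1, 0]`, `[2, 1]`, and `[1, 1]` each test colinear with their neighbors, which is why they are removed.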
def test_remove_colinear_vertices_exhaustive() -> None:
# U-turn
v = [[0, 0], [10, 0], [0, 0]]
v_clean = remove_colinear_vertices(v, closed_path=False, preserve_uturns=True)
# [10, 0] is colinear with the endpoints, but removing it would collapse a
# 180-degree U-turn; preserve_uturns=True keeps it (open paths keep their ends).
assert len(v_clean) == 3
v_collapsed = remove_colinear_vertices(v, closed_path=False, preserve_uturns=False)
# If not preserving u-turns, it should collapse to just the endpoints
assert len(v_collapsed) == 2
# 180 degree U-turn in closed path
v = [[0, 0], [10, 0], [5, 0]]
v_clean = remove_colinear_vertices(v, closed_path=True, preserve_uturns=False)
assert len(v_clean) == 2
def test_poly_contains_points() -> None:
v = [[0, 0], [10, 0], [10, 10], [0, 10]]
pts = [[5, 5], [-1, -1], [10, 10], [11, 5]]
inside = poly_contains_points(v, pts)
assert_equal(inside, [True, False, True, False])
def test_rotation_matrix_2d() -> None:
m = rotation_matrix_2d(pi / 2)
assert_allclose(m, [[0, -1], [1, 0]], atol=1e-10)
def test_rotation_matrix_non_manhattan() -> None:
# 45 degrees
m = rotation_matrix_2d(pi / 4)
s = numpy.sqrt(2) / 2
assert_allclose(m, [[s, -s], [s, s]], atol=1e-10)
def test_apply_transforms() -> None:
# cumulative [x_offset, y_offset, rotation (rad), mirror_x (0 or 1)]
t1 = [10, 20, 0, 0]
t2 = [[5, 0, 0, 0], [0, 5, 0, 0]]
combined = apply_transforms(t1, t2)
assert_equal(combined, [[15, 20, 0, 0, 1], [10, 25, 0, 0, 1]])
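The row format `[x, y, rotation, mirrored]` composes as: mirror the inner offset (and rotation) if the outer row is mirrored, rotate by the outer rotation, then translate by the outer offset. A hedged single-row re-derivation of that rule (hypothetical; the real apply_transforms is vectorized and its output also carries a scale column):

```python
import numpy as np

def compose(outer, inner):
    # Compose two [x, y, rotation, mirrored] transform rows (hypothetical
    # single-row version of the rule the tests above exercise).
    ox, oy, orot, omir = outer
    ix, iy, irot, imir = inner
    if omir:
        iy, irot = -iy, -irot       # outer mirror flips the inner frame
    c, s = np.cos(orot), np.sin(orot)
    x = ox + c * ix - s * iy        # rotate inner offset, then translate
    y = oy + s * ix + c * iy
    return [x, y, orot + irot, int(bool(omir) != bool(imir))]
```

This reproduces both cases checked above: a pure translation, and a mirrored outer row with a 90-degree rotation.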
def test_apply_transforms_advanced() -> None:
# Nx4 rows: (x, y, rot, mir)
# Outer: mirror x (axis 0), then rotate 90 deg CCW
# apply_transforms logic for mirror uses y *= -1 (which is axis 0 mirror)
outer = [0, 0, pi / 2, 1]
# Inner: (10, 0, 0, 0)
inner = [10, 0, 0, 0]
combined = apply_transforms(outer, inner)
# 1. mirror inner y if outer mirrored: (10, 0) -> (10, 0)
# 2. rotate by outer rotation (pi/2): (10, 0) -> (0, 10)
# 3. add outer offset (0, 0) -> (0, 10)
assert_allclose(combined[0], [0, 10, pi / 2, 1, 1], atol=1e-10)
def test_apply_transforms_empty_inputs() -> None:
empty_outer = apply_transforms(numpy.empty((0, 5)), [[1, 2, 0, 0, 1]])
assert empty_outer.shape == (0, 5)
empty_inner = apply_transforms([[1, 2, 0, 0, 1]], numpy.empty((0, 5)))
assert empty_inner.shape == (0, 5)
both_empty_tensor = apply_transforms(numpy.empty((0, 5)), numpy.empty((0, 5)), tensor=True)
assert both_empty_tensor.shape == (0, 0, 5)
def test_normalize_mirror_validates_length() -> None:
with pytest.raises(ValueError, match='2-item sequence'):
normalize_mirror(())
with pytest.raises(ValueError, match='2-item sequence'):
normalize_mirror((True,))
with pytest.raises(ValueError, match='2-item sequence'):
normalize_mirror((True, False, True))
def test_bezier_validates_weight_length() -> None:
with pytest.raises(PatternError, match='one entry per control point'):
bezier([[0, 0], [1, 1]], [0, 0.5, 1], weights=[1])
with pytest.raises(PatternError, match='one entry per control point'):
bezier([[0, 0], [1, 1]], [0, 0.5, 1], weights=[1, 2, 3])
def test_bezier_accepts_exact_weight_count() -> None:
samples = bezier([[0, 0], [1, 1]], [0, 0.5, 1], weights=[1, 2])
assert_allclose(samples, [[0, 0], [2 / 3, 2 / 3], [1, 1]], atol=1e-10)
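The weighted result above matches a rational Bézier evaluated via the Bernstein basis; a compact sketch, assuming this is the curve family masque's `bezier` implements:

```python
from math import comb
import numpy as np

def rational_bezier(controls, ts, weights):
    # Weighted Bernstein-basis evaluation of a rational Bezier curve.
    controls = np.asarray(controls, dtype=float)
    weights = np.asarray(weights, dtype=float)
    ts = np.asarray(ts, dtype=float)
    n = len(controls) - 1
    basis = np.stack(
        [comb(n, k) * ts**k * (1 - ts)**(n - k) for k in range(n + 1)],
        axis=-1,
    )
    weighted = basis * weights
    return (weighted @ controls) / weighted.sum(axis=-1, keepdims=True)

samples = rational_bezier([[0, 0], [1, 1]], [0, 0.5, 1], weights=[1, 2])
# the t=0.5 sample is pulled toward the heavier control point
```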
def test_deferred_dict_accessors_resolve_values_once() -> None:
calls = 0
def make_value() -> int:
nonlocal calls
calls += 1
return 7
deferred = DeferredDict[str, int]()
deferred["x"] = make_value
assert deferred.get("missing", 9) == 9
assert deferred.get("x") == 7
assert list(deferred.values()) == [7]
assert list(deferred.items()) == [("x", 7)]
assert calls == 1
def test_deferred_dict_mutating_accessors_preserve_value_semantics() -> None:
calls = 0
def make_value() -> int:
nonlocal calls
calls += 1
return 7
deferred = DeferredDict[str, int]()
assert deferred.setdefault("x", 5) == 5
assert deferred["x"] == 5
assert deferred.setdefault("y", make_value) == 7
assert deferred["y"] == 7
assert calls == 1
copy_deferred = deferred.copy()
assert isinstance(copy_deferred, DeferredDict)
assert copy_deferred["x"] == 5
assert copy_deferred["y"] == 7
assert calls == 1
assert deferred.pop("x") == 5
assert deferred.pop("missing", 9) == 9
assert deferred.popitem() == ("y", 7)
def test_tmpfile_cleans_up_on_exception(tmp_path: Path) -> None:
target = tmp_path / "out.txt"
temp_path = None
try:
with tmpfile(target) as stream:
temp_path = Path(stream.name)
stream.write(b"hello")
raise RuntimeError("boom")
except RuntimeError:
pass
assert temp_path is not None
assert not target.exists()
assert not temp_path.exists()


@ -0,0 +1,55 @@
import numpy as np
import pytest
from masque.pattern import Pattern
from masque.ports import Port
from masque.repetition import Grid
try:
import matplotlib
HAS_MATPLOTLIB = True
except ImportError:
HAS_MATPLOTLIB = False
@pytest.mark.skipif(not HAS_MATPLOTLIB, reason="matplotlib not installed")
def test_visualize_noninteractive(tmp_path) -> None:
"""
Test that visualize() runs and saves a file without error.
This covers the recursive transformation and collection logic.
"""
# Create a hierarchy
child = Pattern()
child.polygon('L1', [[0, 0], [1, 0], [1, 1], [0, 1]])
child.ports['P1'] = Port((0.5, 0.5), 0)
parent = Pattern()
# Add some refs with various transforms
parent.ref('child', offset=(10, 0), rotation=np.pi/4, mirrored=True, scale=2.0)
# Add a repetition
rep = Grid(a_vector=(5, 5), a_count=2)
parent.ref('child', offset=(0, 10), repetition=rep)
library = {'child': child}
output_file = tmp_path / "test_plot.png"
# Run visualize with filename to avoid showing window
parent.visualize(library=library, filename=str(output_file), ports=True)
assert output_file.exists()
assert output_file.stat().st_size > 0
@pytest.mark.skipif(not HAS_MATPLOTLIB, reason="matplotlib not installed")
def test_visualize_empty() -> None:
""" Test visualizing an empty pattern. """
pat = Pattern()
# Should not raise
pat.visualize(overdraw=True)
@pytest.mark.skipif(not HAS_MATPLOTLIB, reason="matplotlib not installed")
def test_visualize_no_refs() -> None:
""" Test visualizing a pattern with only local shapes (no library needed). """
pat = Pattern()
pat.polygon('L1', [[0, 0], [1, 0], [0, 1]])
# Should not raise even if library is None
pat.visualize(overdraw=True)


@ -26,7 +26,11 @@ from .scalable import (
Scalable as Scalable,
ScalableImpl as ScalableImpl,
)
from .mirrorable import Mirrorable as Mirrorable
from .mirrorable import (
Mirrorable as Mirrorable,
Flippable as Flippable,
FlippableImpl as FlippableImpl,
)
from .copyable import Copyable as Copyable
from .annotatable import (
Annotatable as Annotatable,


@ -45,6 +45,6 @@ class AnnotatableImpl(Annotatable, metaclass=ABCMeta):
@annotations.setter
def annotations(self, annotations: annotations_t) -> None:
if not isinstance(annotations, dict):
raise MasqueError(f'annotations expected dict, got {type(annotations)}')
if not isinstance(annotations, dict) and annotations is not None:
raise MasqueError(f'annotations expected dict or None, got {type(annotations)}')
self._annotations = annotations


@ -1,6 +1,13 @@
from typing import Self
from abc import ABCMeta, abstractmethod
import numpy
from numpy.typing import NDArray
from ..error import MasqueError
from .positionable import Positionable
from .repeatable import Repeatable
class Mirrorable(metaclass=ABCMeta):
"""
@ -11,11 +18,17 @@ class Mirrorable(metaclass=ABCMeta):
@abstractmethod
def mirror(self, axis: int = 0) -> Self:
"""
Mirror the entity across an axis.
Intrinsic transformation: Mirror the entity across an axis through its origin.
This does NOT affect the object's repetition grid.
This operation is performed relative to the object's internal origin (ignoring
its offset). For objects like `Polygon` and `Path` where the offset is forced
to (0, 0), this is equivalent to mirroring in the container's coordinate system.
Args:
axis: Axis to mirror across.
axis: Axis to mirror across:
0: X-axis (flip y coords),
1: Y-axis (flip x coords)
Returns:
self
"""
@ -23,10 +36,11 @@ class Mirrorable(metaclass=ABCMeta):
def mirror2d(self, across_x: bool = False, across_y: bool = False) -> Self:
"""
Optionally mirror the entity across both axes
Optionally mirror the entity across both axes through its origin.
Args:
axes: (mirror_across_x, mirror_across_y)
across_x: Mirror across the horizontal X-axis (flip Y coordinates).
across_y: Mirror across the vertical Y-axis (flip X coordinates).
Returns:
self
@ -38,30 +52,61 @@ class Mirrorable(metaclass=ABCMeta):
return self
#class MirrorableImpl(Mirrorable, metaclass=ABCMeta):
# """
# Simple implementation of `Mirrorable`
# """
# __slots__ = ()
#
# _mirrored: NDArray[numpy.bool]
# """ Whether to mirror the instance across the x and/or y axes. """
#
# #
# # Properties
# #
# # Mirrored property
# @property
# def mirrored(self) -> NDArray[numpy.bool]:
# """ Whether to mirror across the [x, y] axes, respectively """
# return self._mirrored
#
# @mirrored.setter
# def mirrored(self, val: Sequence[bool]) -> None:
# if is_scalar(val):
# raise MasqueError('Mirrored must be a 2-element list of booleans')
# self._mirrored = numpy.array(val, dtype=bool)
#
# #
# # Methods
# #
class Flippable(Positionable, metaclass=ABCMeta):
"""
Trait class for entities which can be mirrored relative to an external line.
"""
__slots__ = ()
@staticmethod
def _check_flip_args(axis: int | None = None, *, x: float | None = None, y: float | None = None) -> tuple[int, NDArray[numpy.float64]]:
pivot = numpy.zeros(2)
if axis is not None:
if x is not None or y is not None:
raise MasqueError('Cannot specify both axis and x or y')
return axis, pivot
if x is not None:
if y is not None:
raise MasqueError('Cannot specify both x and y')
return 1, pivot + (x, 0)
if y is not None:
return 0, pivot + (0, y)
raise MasqueError('Must specify one of axis, x, or y')
@abstractmethod
def flip_across(self, axis: int | None = None, *, x: float | None = None, y: float | None = None) -> Self:
"""
Extrinsic transformation: Mirror the object across a line in the container's
coordinate system. This affects both the object's offset and its repetition grid.
Unlike `mirror()`, this operation is performed relative to the container's origin
(e.g. the `Pattern` origin, in the case of shapes) and takes the object's offset
into account.
Args:
axis: Axis to mirror across. 0: x-axis (flip y coord), 1: y-axis (flip x coord).
x: Vertical line x=val to mirror across.
y: Horizontal line y=val to mirror across.
Returns:
self
"""
pass
class FlippableImpl(Flippable, Mirrorable, Repeatable, metaclass=ABCMeta):
"""
Implementation of `Flippable` for objects which are `Mirrorable`, `Positionable`,
and `Repeatable`.
"""
__slots__ = ()
def flip_across(self, axis: int | None = None, *, x: float | None = None, y: float | None = None) -> Self:
axis, pivot = self._check_flip_args(axis=axis, x=x, y=y)
self.translate(-pivot)
self.mirror(axis)
if self.repetition is not None:
self.repetition.mirror(axis)
self.offset[1 - axis] *= -1
self.translate(+pivot)
return self
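The `flip_across()` sequence above (translate to the pivot, mirror intrinsically, flip the offset component, translate back) can be sketched standalone. `flip_across_vertical` is a hypothetical helper operating on a toy object described by an offset plus local vertices, not part of masque:

```python
import numpy

def flip_across_vertical(offset, local_pts, x):
    """Mimic FlippableImpl.flip_across(x=...): translate so the mirror
    line x=val passes through the origin, mirror the local geometry
    (the intrinsic part), flip the offset's x component (the extrinsic
    part), then translate back."""
    pivot = numpy.array([x, 0.0])
    offset = numpy.asarray(offset, dtype=float) - pivot              # translate(-pivot)
    local_pts = numpy.asarray(local_pts, dtype=float) * [-1.0, 1.0]  # mirror(axis=1): flip local x
    offset[0] *= -1                                                  # offset[1 - axis] for axis=1
    return offset + pivot, local_pts                                 # translate(+pivot)
```

A point at global x = 3 + 1 = 4, flipped across the line x = 1, should land at 2·1 − 4 = −2, which the combined offset + local coordinate reproduces.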


@ -1,4 +1,4 @@
from typing import Self, Any
from typing import Self
from abc import ABCMeta, abstractmethod
import numpy
@ -73,7 +73,7 @@ class PositionableImpl(Positionable, metaclass=ABCMeta):
#
# offset property
@property
def offset(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
def offset(self) -> NDArray[numpy.float64]:
"""
[x, y] offset
"""
@ -95,7 +95,7 @@ class PositionableImpl(Positionable, metaclass=ABCMeta):
return self
def translate(self, offset: ArrayLike) -> Self:
self._offset += offset # type: ignore # NDArray += ArrayLike should be fine??
self._offset += numpy.asarray(offset)
return self


@ -76,7 +76,7 @@ class RepeatableImpl(Repeatable, Bounded, metaclass=ABCMeta):
@repetition.setter
def repetition(self, repetition: 'Repetition | None') -> None:
from ..repetition import Repetition
from ..repetition import Repetition #noqa: PLC0415
if repetition is not None and not isinstance(repetition, Repetition):
raise MasqueError(f'{repetition} is not a valid Repetition object!')
self._repetition = repetition


@ -1,4 +1,4 @@
from typing import Self, cast, Any, TYPE_CHECKING
from typing import Self
from abc import ABCMeta, abstractmethod
import numpy
@ -8,8 +8,7 @@ from numpy.typing import ArrayLike
from ..error import MasqueError
from ..utils import rotation_matrix_2d
if TYPE_CHECKING:
from .positionable import Positionable
from .positionable import Positionable
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass
@ -26,7 +25,8 @@ class Rotatable(metaclass=ABCMeta):
@abstractmethod
def rotate(self, val: float) -> Self:
"""
Rotate the shape around its origin (0, 0), ignoring its offset.
Intrinsic transformation: Rotate the shape around its origin (0, 0), ignoring its offset.
This does NOT affect the object's repetition grid.
Args:
val: Angle to rotate by (counterclockwise, radians)
@ -64,6 +64,10 @@ class RotatableImpl(Rotatable, metaclass=ABCMeta):
# Methods
#
def rotate(self, rotation: float) -> Self:
"""
Intrinsic transformation: Rotate the shape around its origin (0, 0), ignoring its offset.
This does NOT affect the object's repetition grid.
"""
self.rotation += rotation
return self
@ -81,9 +85,9 @@ class RotatableImpl(Rotatable, metaclass=ABCMeta):
return self
class Pivotable(metaclass=ABCMeta):
class Pivotable(Positionable, metaclass=ABCMeta):
"""
Trait class for entites which can be rotated around a point.
Trait class for entities which can be rotated around a point.
This requires that they are `Positionable` but not necessarily `Rotatable` themselves.
"""
__slots__ = ()
@ -91,7 +95,11 @@ class Pivotable(metaclass=ABCMeta):
@abstractmethod
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self:
"""
Rotate the object around a point.
Extrinsic transformation: Rotate the object around a point in the container's
coordinate system. This affects both the object's offset and its repetition grid.
For objects that are also `Rotatable`, this also performs an intrinsic
rotation of the object.
Args:
pivot: Point (x, y) to rotate around
@ -103,20 +111,21 @@ class Pivotable(metaclass=ABCMeta):
pass
class PivotableImpl(Pivotable, metaclass=ABCMeta):
class PivotableImpl(Pivotable, Rotatable, metaclass=ABCMeta):
"""
Implementation of `Pivotable` for objects which are `Rotatable`
and `Positionable`.
"""
__slots__ = ()
offset: Any # TODO see if we can get around defining `offset` in PivotableImpl
""" `[x_offset, y_offset]` """
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self:
from .repeatable import Repeatable #noqa: PLC0415
pivot = numpy.asarray(pivot, dtype=float)
cast('Positionable', self).translate(-pivot)
cast('Rotatable', self).rotate(rotation)
self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset) # type: ignore # mypy#3004
cast('Positionable', self).translate(+pivot)
self.translate(-pivot)
self.rotate(rotation)
if isinstance(self, Repeatable) and self.repetition is not None:
self.repetition.rotate(rotation)
self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset)
self.translate(+pivot)
return self
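The rewritten `rotate_around()` composes the same translate/rotate/translate steps on both the intrinsic rotation and the offset. A scalar sketch with toy state (hypothetical function, not the real trait method):

```python
import numpy

def rotation_matrix_2d(theta):
    return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
                        [numpy.sin(theta),  numpy.cos(theta)]])

def rotate_around(offset, rotation_state, pivot, rotation):
    """Mimic PivotableImpl.rotate_around(): shift so the pivot sits at
    the origin, apply the intrinsic rotation, rotate the offset about
    the origin, then shift back."""
    pivot = numpy.asarray(pivot, dtype=float)
    offset = numpy.asarray(offset, dtype=float) - pivot     # translate(-pivot)
    rotation_state += rotation                              # intrinsic rotate()
    offset = rotation_matrix_2d(rotation) @ offset          # extrinsic: rotate the offset
    return offset + pivot, rotation_state                   # translate(+pivot)
```

Rotating an object at (2, 0) by 90° around (1, 0) moves its offset to (1, 1) while its intrinsic rotation advances by π/2.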


@ -17,11 +17,12 @@ class AutoSlots(ABCMeta):
for base in bases:
parents |= set(base.mro())
slots = tuple(dctn.get('__slots__', ()))
slots = list(dctn.get('__slots__', ()))
for parent in parents:
if not hasattr(parent, '__annotations__'):
continue
slots += tuple(parent.__annotations__.keys())
slots.extend(parent.__annotations__.keys())
dctn['__slots__'] = slots
# Deduplicate (dict to preserve order)
dctn['__slots__'] = tuple(dict.fromkeys(slots))
return super().__new__(cls, name, bases, dctn)
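The deduplication matters when the same annotated attribute is reachable through more than one base. The metaclass above can be exercised with toy trait classes (the trait names here are made up for illustration):

```python
from abc import ABCMeta

class AutoSlots(ABCMeta):
    """Gather annotated attribute names from every class in the bases'
    MROs into __slots__, deduplicated while preserving order
    (dict.fromkeys keeps the first occurrence of each name)."""
    def __new__(cls, name, bases, dctn):
        parents = set()
        for base in bases:
            parents |= set(base.mro())
        slots = list(dctn.get('__slots__', ()))
        for parent in parents:
            slots.extend(getattr(parent, '__annotations__', {}).keys())
        dctn['__slots__'] = tuple(dict.fromkeys(slots))
        return super().__new__(cls, name, bases, dctn)

class TraitA:
    __slots__ = ()
    offset: tuple

class TraitB:
    __slots__ = ()
    offset: tuple       # same annotation as TraitA -- must not yield a duplicate slot
    rotation: float

class Impl(TraitA, TraitB, metaclass=AutoSlots):
    pass
```

Without the dedup step, `'offset'` would appear twice in `Impl.__slots__`, which Python rejects with "duplicate __slots__" at class creation time.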

masque/utils/boolean.py Normal file

@ -0,0 +1,196 @@
from typing import Any, Literal
from collections.abc import Iterable
import logging
import numpy
from numpy.typing import NDArray
from ..shapes.polygon import Polygon
from ..error import PatternError
logger = logging.getLogger(__name__)
def _bridge_holes(outer_path: NDArray[numpy.float64], holes: list[NDArray[numpy.float64]]) -> NDArray[numpy.float64]:
"""
Bridge multiple holes into an outer boundary using zero-width slits.
"""
current_outer = outer_path
# Sort holes by max X to potentially minimize bridge lengths or complexity
# (though not strictly necessary for correctness)
holes = sorted(holes, key=lambda h: numpy.max(h[:, 0]), reverse=True)
for hole in holes:
# Find max X vertex of hole
max_idx = numpy.argmax(hole[:, 0])
m = hole[max_idx]
# Find intersection of ray (m.x, m.y) + (t, 0) with current_outer edges
best_t = numpy.inf
best_pt = None
best_edge_idx = -1
n = len(current_outer)
for i in range(n):
p1 = current_outer[i]
p2 = current_outer[(i + 1) % n]
# Check if edge (p1, p2) spans m.y
if (p1[1] <= m[1] < p2[1]) or (p2[1] <= m[1] < p1[1]):
# Intersection x:
# x = p1.x + (m.y - p1.y) * (p2.x - p1.x) / (p2.y - p1.y)
t = (p1[0] + (m[1] - p1[1]) * (p2[0] - p1[0]) / (p2[1] - p1[1])) - m[0]
if 0 <= t < best_t:
best_t = t
best_pt = numpy.array([m[0] + t, m[1]])
best_edge_idx = i
if best_edge_idx == -1:
# Fallback: find nearest vertex if ray fails (shouldn't happen for valid hole)
dists = numpy.linalg.norm(current_outer - m, axis=1)
best_edge_idx = int(numpy.argmin(dists))
best_pt = current_outer[best_edge_idx]
# Adjust best_edge_idx to insert AFTER this vertex
# (treating it as a degenerate edge)
assert best_pt is not None
# Reorder hole vertices to start at m
hole_reordered = numpy.roll(hole, -max_idx, axis=0)
# Construct new outer:
# 1. Start of outer up to best_edge_idx
# 2. Intersection point
# 3. Hole vertices (starting and ending at m)
# 4. Intersection point (to close slit)
# 5. Rest of outer
new_outer: list[NDArray[numpy.float64]] = []
new_outer.extend(current_outer[:best_edge_idx + 1])
new_outer.append(best_pt)
new_outer.extend(hole_reordered)
new_outer.append(hole_reordered[0]) # close hole loop at m
new_outer.append(best_pt) # back to outer
new_outer.extend(current_outer[best_edge_idx + 1:])
current_outer = numpy.array(new_outer)
return current_outer
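The slit-bridging idea is easiest to see for a single hole. `bridge_one_hole` below is a trimmed, hypothetical variant of the loop body above: cast a ray in +x from the hole's rightmost vertex, find where it first crosses an outer edge, and splice the hole's vertex loop into the outer loop through that point, entering and leaving along the same degenerate (zero-width) edge:

```python
import numpy

def bridge_one_hole(outer, hole):
    """Merge one hole into an outer boundary via a zero-width slit."""
    outer = numpy.asarray(outer, dtype=float)
    hole = numpy.asarray(hole, dtype=float)
    max_idx = int(numpy.argmax(hole[:, 0]))
    m = hole[max_idx]                                 # rightmost hole vertex
    best_t, best_pt, best_edge = numpy.inf, None, -1
    n = len(outer)
    for i in range(n):
        p1, p2 = outer[i], outer[(i + 1) % n]
        # Only edges whose y-range spans m's y can be hit (also avoids /0)
        if (p1[1] <= m[1] < p2[1]) or (p2[1] <= m[1] < p1[1]):
            t = p1[0] + (m[1] - p1[1]) * (p2[0] - p1[0]) / (p2[1] - p1[1]) - m[0]
            if 0 <= t < best_t:
                best_t, best_pt, best_edge = t, numpy.array([m[0] + t, m[1]]), i
    hole_r = numpy.roll(hole, -max_idx, axis=0)       # hole loop starting at m
    return numpy.vstack([outer[:best_edge + 1], best_pt[None],
                         hole_r, hole_r[:1], best_pt[None],
                         outer[best_edge + 1:]])
```

For an outer square with one inner square hole, the result is a single loop with len(outer) + len(hole) + 3 vertices (bridge point, closing hole vertex, bridge point again).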
def boolean(
subjects: Iterable[Any],
clips: Iterable[Any] | None = None,
operation: Literal['union', 'intersection', 'difference', 'xor'] = 'union',
scale: float = 1e6,
) -> list[Polygon]:
"""
Perform a boolean operation on two sets of polygons.
Args:
subjects: List of subjects (Polygons or vertex arrays).
clips: List of clips (Polygons or vertex arrays).
operation: The boolean operation to perform.
scale: Scaling factor for integer conversion (pyclipper uses integers).
Returns:
A list of result Polygons.
"""
try:
import pyclipper #noqa: PLC0415
except ImportError:
raise ImportError(
"Boolean operations require 'pyclipper'. "
"Install it with 'pip install pyclipper' or 'pip install masque[boolean]'."
) from None
op_map = {
'union': pyclipper.CT_UNION,
'intersection': pyclipper.CT_INTERSECTION,
'difference': pyclipper.CT_DIFFERENCE,
'xor': pyclipper.CT_XOR,
}
def to_vertices(objs: Iterable[Any] | Any | None) -> list[NDArray]:
if objs is None:
return []
if hasattr(objs, 'to_polygons') or isinstance(objs, numpy.ndarray | Polygon):
objs = (objs,)
elif not isinstance(objs, Iterable):
raise PatternError(f"Unsupported type for boolean operation: {type(objs)}")
verts = []
for obj in objs:
if hasattr(obj, 'to_polygons'):
for p in obj.to_polygons():
verts.append(p.vertices)
elif isinstance(obj, numpy.ndarray):
verts.append(obj)
elif isinstance(obj, Polygon):
verts.append(obj.vertices)
else:
# Try to iterate if it's an iterable of shapes
try:
for sub in obj:
if hasattr(sub, 'to_polygons'):
for p in sub.to_polygons():
verts.append(p.vertices)
elif isinstance(sub, Polygon):
verts.append(sub.vertices)
except TypeError:
raise PatternError(f"Unsupported type for boolean operation: {type(obj)}") from None
return verts
subject_verts = to_vertices(subjects)
clip_verts = to_vertices(clips)
if not subject_verts:
if operation in ('union', 'xor'):
return [Polygon(vertices) for vertices in clip_verts]
return []
if not clip_verts:
if operation == 'intersection':
return []
return [Polygon(vertices) for vertices in subject_verts]
pc = pyclipper.Pyclipper()
pc.AddPaths(pyclipper.scale_to_clipper(subject_verts, scale), pyclipper.PT_SUBJECT, True)
if clip_verts:
pc.AddPaths(pyclipper.scale_to_clipper(clip_verts, scale), pyclipper.PT_CLIP, True)
# Use GetPolyTree to distinguish between outers and holes
polytree = pc.Execute2(op_map[operation.lower()], pyclipper.PFT_NONZERO, pyclipper.PFT_NONZERO)
result_polygons = []
def process_node(node: Any) -> None:
if not node.IsHole:
# This is an outer boundary
outer_path = numpy.array(pyclipper.scale_from_clipper(node.Contour, scale))
# Find immediate holes
holes = []
for child in node.Childs:
if child.IsHole:
holes.append(numpy.array(pyclipper.scale_from_clipper(child.Contour, scale)))
if holes:
combined_vertices = _bridge_holes(outer_path, holes)
result_polygons.append(Polygon(combined_vertices))
else:
result_polygons.append(Polygon(outer_path))
# Recursively process children of holes (which are nested outers)
for child in node.Childs:
if child.IsHole:
for grandchild in child.Childs:
process_node(grandchild)
else:
# Holes are processed as children of outers
pass
for top_node in polytree.Childs:
process_node(top_node)
return result_polygons


@ -9,7 +9,15 @@ def annotation2key(aaa: int | float | str) -> tuple[bool, Any]:
return (isinstance(aaa, str), aaa)
def _normalized_annotations(annotations: annotations_t) -> annotations_t:
if not annotations:
return None
return annotations
def annotations_lt(aa: annotations_t, bb: annotations_t) -> bool:
aa = _normalized_annotations(aa)
bb = _normalized_annotations(bb)
if aa is None:
return bb is not None
elif bb is None: # noqa: RET505
@ -36,6 +44,8 @@ def annotations_lt(aa: annotations_t, bb: annotations_t) -> bool:
def annotations_eq(aa: annotations_t, bb: annotations_t) -> bool:
aa = _normalized_annotations(aa)
bb = _normalized_annotations(bb)
if aa is None:
return bb is None
elif bb is None: # noqa: RET505
@ -47,7 +57,7 @@ def annotations_eq(aa: annotations_t, bb: annotations_t) -> bool:
keys_a = tuple(sorted(aa.keys()))
keys_b = tuple(sorted(bb.keys()))
if keys_a != keys_b:
return keys_a < keys_b
return False
for key in keys_a:
va = aa[key]


@ -2,10 +2,12 @@ import numpy
from numpy.typing import ArrayLike, NDArray
from numpy import pi
from ..error import PatternError
try:
from numpy import trapezoid
except ImportError:
from numpy import trapz as trapezoid
from numpy import trapz as trapezoid # type:ignore
def bezier(
@ -31,6 +33,11 @@ def bezier(
tt = numpy.asarray(tt)
nn = nodes.shape[0]
weights = numpy.ones(nn) if weights is None else numpy.asarray(weights)
if weights.ndim != 1 or weights.shape[0] != nn:
raise PatternError(
f'weights must be a 1D array with one entry per control point; '
f'got shape {weights.shape} for {nn} control points'
)
with numpy.errstate(divide='ignore'):
umul = (tt / (1 - tt)).clip(max=1)
@ -69,14 +76,25 @@ def euler_bend(
num_points_arc = num_points - 2 * num_points_spiral
def gen_spiral(ll_max: float) -> NDArray[numpy.float64]:
xx = []
yy = []
for ll in numpy.linspace(0, ll_max, num_points_spiral):
qq = numpy.linspace(0, ll, 1000) # integrate to current arclength
xx.append(trapezoid( numpy.cos(qq * qq / 2), qq))
yy.append(trapezoid(-numpy.sin(qq * qq / 2), qq))
xy_part = numpy.stack((xx, yy), axis=1)
return xy_part
if ll_max == 0:
return numpy.zeros((num_points_spiral, 2))
resolution = 100000
qq = numpy.linspace(0, ll_max, resolution)
dx = numpy.cos(qq * qq / 2)
dy = -numpy.sin(qq * qq / 2)
dq = ll_max / (resolution - 1)
ix = numpy.zeros(resolution)
iy = numpy.zeros(resolution)
ix[1:] = numpy.cumsum((dx[:-1] + dx[1:]) / 2) * dq
iy[1:] = numpy.cumsum((dy[:-1] + dy[1:]) / 2) * dq
ll_target = numpy.linspace(0, ll_max, num_points_spiral)
x_target = numpy.interp(ll_target, qq, ix)
y_target = numpy.interp(ll_target, qq, iy)
return numpy.stack((x_target, y_target), axis=1)
xy_spiral = gen_spiral(ll_max)
xy_parts = [xy_spiral]
@ -99,6 +117,6 @@ def euler_bend(
xy = numpy.concatenate(xy_parts)
# Remove any 2x-duplicate points
xy = xy[(numpy.roll(xy, 1, axis=0) != xy).any(axis=1)]
xy = xy[(numpy.abs(numpy.roll(xy, 1, axis=0) - xy) > 1e-12).any(axis=1)]
return xy
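The rewritten `gen_spiral` replaces a separate 1000-point quadrature per output sample with a single cumulative-trapezoid pass followed by interpolation. The same idea in isolation, under a hypothetical name (`fresnel_xy`), evaluating the Euler-spiral integrals x(l) = ∫cos(s²/2)ds and y(l) = −∫sin(s²/2)ds:

```python
import numpy

def fresnel_xy(ll_max, num_out, resolution=20_000):
    """One O(resolution) cumulative trapezoid pass, then interpolate to
    the requested arclengths, instead of re-integrating per point."""
    if ll_max == 0:
        return numpy.zeros((num_out, 2))
    qq = numpy.linspace(0, ll_max, resolution)
    dq = ll_max / (resolution - 1)
    dx = numpy.cos(qq * qq / 2)
    dy = -numpy.sin(qq * qq / 2)
    # Running trapezoid sums: ix[k] approximates the integral up to qq[k]
    ix = numpy.concatenate(([0.0], numpy.cumsum((dx[:-1] + dx[1:]) / 2) * dq))
    iy = numpy.concatenate(([0.0], numpy.cumsum((dy[:-1] + dy[1:]) / 2) * dq))
    ll = numpy.linspace(0, ll_max, num_out)
    return numpy.stack((numpy.interp(ll, qq, ix), numpy.interp(ll, qq, iy)), axis=1)
```

The endpoint values can be checked against the Taylor expansion of the integrands: x(1) ≈ 1 − 1/40 + 1/3456 ≈ 0.97529 and y(1) ≈ −(1/6 − 1/336) ≈ −0.16371.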


@ -1,10 +1,11 @@
from typing import TypeVar, Generic
from collections.abc import Callable
from collections.abc import Callable, Iterator
from functools import lru_cache
Key = TypeVar('Key')
Value = TypeVar('Value')
_MISSING = object()
class DeferredDict(dict, Generic[Key, Value]):
@ -25,18 +26,73 @@ class DeferredDict(dict, Generic[Key, Value]):
"""
def __init__(self, *args, **kwargs) -> None:
dict.__init__(self)
self.update(*args, **kwargs)
if args or kwargs:
self.update(*args, **kwargs)
def __setitem__(self, key: Key, value: Callable[[], Value]) -> None:
"""
Set a value, which must be a callable that returns the actual value.
The result of the callable is cached after the first access.
"""
if not callable(value):
raise TypeError(f"DeferredDict value must be callable, got {type(value)}")
cached_fn = lru_cache(maxsize=1)(value)
dict.__setitem__(self, key, cached_fn)
def __getitem__(self, key: Key) -> Value:
return dict.__getitem__(self, key)()
def get(self, key: Key, default: Value | None = None) -> Value | None:
if key not in self:
return default
return self[key]
def setdefault(self, key: Key, default: Value | Callable[[], Value] | None = None) -> Value | None:
if key in self:
return self[key]
if callable(default):
self[key] = default
else:
self.set_const(key, default)
return self[key]
def items(self) -> Iterator[tuple[Key, Value]]:
for key in self.keys():
yield key, self[key]
def values(self) -> Iterator[Value]:
for key in self.keys():
yield self[key]
def update(self, *args, **kwargs) -> None:
"""
Update the DeferredDict. If a value is callable, it is used as a generator.
Otherwise, it is wrapped as a constant.
"""
for k, v in dict(*args, **kwargs).items():
self[k] = v
if callable(v):
self[k] = v
else:
self.set_const(k, v)
def pop(self, key: Key, default: Value | object = _MISSING) -> Value:
if key in self:
value = self[key]
dict.pop(self, key)
return value
if default is _MISSING:
raise KeyError(key)
return default # type: ignore[return-value]
def popitem(self) -> tuple[Key, Value]:
key, value_func = dict.popitem(self)
return key, value_func()
def copy(self) -> 'DeferredDict[Key, Value]':
new = DeferredDict[Key, Value]()
for key in self.keys():
dict.__setitem__(new, key, dict.__getitem__(self, key))
return new
def __repr__(self) -> str:
return '<DeferredDict with keys ' + repr(set(self.keys())) + '>'
@ -46,4 +102,4 @@ class DeferredDict(dict, Generic[Key, Value]):
Convenience function to avoid having to manually wrap
constant values into callables.
"""
self[key] = lambda: value
self[key] = lambda v=value: v
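The core of the class above is small: values are stored as callables wrapped in a single-entry `lru_cache`, so the generator runs at most once, and `set_const` pins the value via a default argument. A trimmed standalone sketch (named `LazyDict` here to make clear it is not the real `DeferredDict`):

```python
from functools import lru_cache

class LazyDict(dict):
    """Values are zero-argument callables, evaluated (once) on lookup."""
    def __setitem__(self, key, value):
        if not callable(value):
            raise TypeError(f'value must be callable, got {type(value)}')
        # lru_cache(maxsize=1) memoizes the single no-arg call
        dict.__setitem__(self, key, lru_cache(maxsize=1)(value))
    def __getitem__(self, key):
        return dict.__getitem__(self, key)()
    def set_const(self, key, value):
        self[key] = lambda v=value: v   # default arg captures the value now
```

Repeated lookups of the same key return the cached result without re-running the generator.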


@ -60,6 +60,12 @@ def maxrects_bssf(
degenerate = (min_more & max_less).any(axis=0)
regions = regions[~degenerate]
if regions.shape[0] == 0:
if allow_rejects:
rejected_inds.add(rect_ind)
continue
raise MasqueError(f'Failed to find a suitable location for rectangle {rect_ind}')
''' Place the rect '''
# Best short-side fit (bssf) to pick a region
region_sizes = regions[:, 2:] - regions[:, :2]
@ -102,7 +108,7 @@ def maxrects_bssf(
if presort:
unsort_order = rect_order.argsort()
rect_locs = rect_locs[unsort_order]
rejected_inds = set(unsort_order[list(rejected_inds)])
rejected_inds = {int(rect_order[ii]) for ii in rejected_inds}
return rect_locs, rejected_inds
@ -187,7 +193,7 @@ def guillotine_bssf_sas(
if presort:
unsort_order = rect_order.argsort()
rect_locs = rect_locs[unsort_order]
rejected_inds = set(unsort_order[list(rejected_inds)])
rejected_inds = {int(rect_order[ii]) for ii in rejected_inds}
return rect_locs, rejected_inds
@ -236,7 +242,9 @@ def pack_patterns(
locations, reject_inds = packer(sizes, containers, presort=presort, allow_rejects=allow_rejects)
pat = Pattern()
for pp, oo, loc in zip(patterns, offsets, locations, strict=True):
for ii, (pp, oo, loc) in enumerate(zip(patterns, offsets, locations, strict=True)):
if ii in reject_inds:
continue
pat.ref(pp, offset=oo + loc)
rejects = [patterns[ii] for ii in reject_inds]
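The rejected-index fix above is an argsort-direction subtlety. With `presort`, rectangles are packed largest-first, so a rejected index refers to the *sorted* order; mapping back to the caller's order goes through `rect_order[ii]` directly, not through `argsort(rect_order)` (which inverts the permutation the other way). A sketch with made-up sizes:

```python
import numpy

sizes = numpy.array([[1, 1], [5, 5], [3, 3]])          # areas: 1, 25, 9
rect_order = numpy.argsort(-sizes.prod(axis=1))        # pack order: [1, 2, 0]
rejected_sorted = {2}                                  # the 3rd rect *placed* did not fit

# rect_order maps sorted position -> original index (the correct direction)
rejected_original = {int(rect_order[ii]) for ii in rejected_sorted}
# argsort(rect_order) maps original index -> sorted position (the old bug)
wrong = {int(rect_order.argsort()[ii]) for ii in rejected_sorted}
```

Here the rejected rectangle is the 1×1 at original index 0; the old mapping would have blamed index 1 (the 5×5) instead.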


@ -57,11 +57,9 @@ def data_to_ports(
name: str | None = None, # Note: name optional, but arg order different from read(postprocess=)
max_depth: int = 0,
skip_subcells: bool = True,
# TODO missing ok?
visited: set[int] | None = None,
) -> Pattern:
"""
# TODO fixup documentation in ports2data
# TODO move to utils.file?
Examine `pattern` for labels specifying port info, and use that info
to fill out its `ports` attribute.
@ -70,18 +68,30 @@ def data_to_ports(
Args:
layers: Search for labels on all the given layers.
library: Mapping from pattern names to patterns.
pattern: Pattern object to scan for labels.
max_depth: Maximum hierarcy depth to search. Default 999_999.
name: Name of the pattern object.
max_depth: Maximum hierarchy depth to search. Default 0.
Set this to 0 (the default) to avoid ever searching subcells.
skip_subcells: If port labels are found at a given hierarchy level,
do not continue searching at deeper levels. This allows subcells
to contain their own port info without interfering with supercells'
port data.
Default True.
visited: Set of object IDs which have already been processed.
Returns:
The updated `pattern`. Port labels are not removed.
"""
if visited is None:
visited = set()
# Note: visited uses id(pattern) to detect cycles and avoid redundant processing.
# This may not catch identical patterns if they are loaded as separate object instances.
if id(pattern) in visited:
return pattern
visited.add(id(pattern))
if pattern.ports:
logger.warning(f'Pattern {name if name else pattern} already had ports, skipping data_to_ports')
return pattern
@ -99,18 +109,20 @@ def data_to_ports(
if target is None:
continue
pp = data_to_ports(
layers=layers,
library=library,
pattern=library[target],
name=target,
max_depth=max_depth - 1,
skip_subcells=skip_subcells,
layers = layers,
library = library,
pattern = library[target],
name = target,
max_depth = max_depth - 1,
skip_subcells = skip_subcells,
visited = visited,
)
found_ports |= bool(pp.ports)
if not found_ports:
return pattern
imported_ports: dict[str, Port] = {}
for target, refs in pattern.refs.items():
if target is None:
continue
@ -122,9 +134,14 @@ def data_to_ports(
if not aa.ports:
break
if ref.repetition is not None:
logger.warning(f'Pattern {name if name else pattern} has repeated ref to {target!r}; '
'data_to_ports() is importing only the base instance ports')
aa.apply_ref_transform(ref)
pattern.check_ports(other_names=aa.ports.keys())
pattern.ports.update(aa.ports)
Pattern(ports={**pattern.ports, **imported_ports}).check_ports(other_names=aa.ports.keys())
imported_ports.update(aa.ports)
pattern.ports.update(imported_ports)
return pattern
@ -160,13 +177,24 @@ def data_to_ports_flat(
local_ports = {}
for label in labels:
name, property_string = label.string.split(':')
properties = property_string.split(' ')
ptype = properties[0]
angle_deg = float(properties[1]) if len(ptype) else 0
if ':' not in label.string:
logger.warning(f'Invalid port label "{label.string}" in pattern "{pstr}" (missing ":")')
continue
name, property_string = label.string.split(':', 1)
properties = property_string.split()
ptype = properties[0] if len(properties) > 0 else 'unk'
if len(properties) > 1:
try:
angle_deg = float(properties[1])
except ValueError:
logger.warning(f'Invalid port label "{label.string}" in pattern "{pstr}" (bad angle)')
continue
else:
angle_deg = numpy.inf
xy = label.offset
angle = numpy.deg2rad(angle_deg)
angle = numpy.deg2rad(angle_deg) if numpy.isfinite(angle_deg) else None
if name in local_ports:
logger.warning(f'Duplicate port "{name}" in pattern "{pstr}"')
@ -175,4 +203,3 @@ def data_to_ports_flat(
pattern.ports.update(local_ports)
return pattern
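The hardened label parsing above expects strings of the form `name:ptype angle` and now tolerates a missing angle or ptype while rejecting malformed labels instead of raising. The same logic extracted into a hypothetical standalone helper (`parse_port_label` is illustration only):

```python
def parse_port_label(string):
    """Return (name, ptype, angle_deg) for a port label, or None if the
    label is not a port label (no ':') or has a non-numeric angle.
    A missing ptype becomes 'unk'; a missing angle becomes None."""
    if ':' not in string:
        return None
    name, property_string = string.split(':', 1)   # split only on the first ':'
    properties = property_string.split()
    ptype = properties[0] if properties else 'unk'
    angle_deg = None
    if len(properties) > 1:
        try:
            angle_deg = float(properties[1])
        except ValueError:
            return None                            # malformed angle
    return name, ptype, angle_deg
```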


@ -28,8 +28,9 @@ def rotation_matrix_2d(theta: float) -> NDArray[numpy.float64]:
arr = numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
[numpy.sin(theta), +numpy.cos(theta)]])
# If this was a manhattan rotation, round to remove some inacuraccies in sin & cos
if numpy.isclose(theta % (pi / 2), 0):
# If this was a manhattan rotation, round to remove some inaccuracies in sin & cos
# cos(4*theta) is 1 for any multiple of pi/2.
if numpy.isclose(numpy.cos(4 * theta), 1, atol=1e-12):
arr = numpy.round(arr)
arr.flags.writeable = False
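The `cos(4*theta)` test is preferable to `theta % (pi / 2)` because it is symmetric around each multiple of π/2: an angle a hair *below* a multiple has `theta % (pi/2)` near π/2, which the old modulo check missed. A small sketch of the predicate (hypothetical helper name):

```python
import numpy
from numpy import pi

def is_manhattan(theta, atol=1e-12):
    """cos(4*theta) == 1 exactly when theta is an integer multiple of
    pi/2, regardless of sign or which side of the multiple the rounding
    error falls on."""
    return bool(numpy.isclose(numpy.cos(4 * theta), 1, atol=atol))
```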
@ -49,7 +50,10 @@ def normalize_mirror(mirrored: Sequence[bool]) -> tuple[bool, float]:
`angle_to_rotate` in radians
"""
mirrored_x, mirrored_y = mirrored
if len(mirrored) != 2:
raise ValueError(f'mirrored must be a 2-item sequence, got length {len(mirrored)}')
mirrored_x, mirrored_y = (bool(value) for value in mirrored)
mirror_x = (mirrored_x != mirrored_y) # XOR
angle = numpy.pi if mirrored_y else 0
return mirror_x, angle
@ -86,37 +90,55 @@ def apply_transforms(
Apply a set of transforms (`outer`) to a second set (`inner`).
This is used to find the "absolute" transform for nested `Ref`s.
The two transforms should be of shape Ox4 and Ix4.
Rows should be of the form `(x_offset, y_offset, rotation_ccw_rad, mirror_across_x)`.
The output will be of the form (O*I)x4 (if `tensor=False`) or OxIx4 (`tensor=True`).
The two transforms should be of shape Ox5 and Ix5.
Rows should be of the form `(x_offset, y_offset, rotation_ccw_rad, mirror_across_x, scale)`.
The output will be of the form (O*I)x5 (if `tensor=False`) or OxIx5 (`tensor=True`).
Args:
outer: Transforms for the container refs. Shape Ox4.
inner: Transforms for the contained refs. Shape Ix4.
tensor: If `True`, an OxIx4 array is returned, with `result[oo, ii, :]` corresponding
outer: Transforms for the container refs. Shape Ox5.
inner: Transforms for the contained refs. Shape Ix5.
tensor: If `True`, an OxIx5 array is returned, with `result[oo, ii, :]` corresponding
to the `oo`th `outer` transform applied to the `ii`th inner transform.
If `False` (default), this is concatenated into `(O*I)x4` to allow simple
If `False` (default), this is concatenated into `(O*I)x5` to allow simple
chaining into additional `apply_transforms()` calls.
Returns:
OxIx4 or (O*I)x4 array. Final dimension is
`(total_x, total_y, total_rotation_ccw_rad, net_mirrored_x)`.
OxIx5 or (O*I)x5 array. Final dimension is
`(total_x, total_y, total_rotation_ccw_rad, net_mirrored_x, total_scale)`.
"""
outer = numpy.atleast_2d(outer).astype(float, copy=False)
inner = numpy.atleast_2d(inner).astype(float, copy=False)
if outer.shape[1] == 4:
outer = numpy.pad(outer, ((0, 0), (0, 1)), constant_values=1.0)
if inner.shape[1] == 4:
inner = numpy.pad(inner, ((0, 0), (0, 1)), constant_values=1.0)
if outer.shape[0] == 0 or inner.shape[0] == 0:
if tensor:
return numpy.empty((outer.shape[0], inner.shape[0], 5))
return numpy.empty((0, 5))
# If mirrored, flip y's
xy_mir = numpy.tile(inner[:, :2], (outer.shape[0], 1, 1)) # dims are outer, inner, xyrm
xy_mir[outer[:, 3].astype(bool), :, 1] *= -1
# Apply outer scale to inner offset
xy_mir *= outer[:, None, 4, None]
rot_mats = [rotation_matrix_2d(angle) for angle in outer[:, 2]]
xy = numpy.einsum('ort,oit->oir', rot_mats, xy_mir)
tot = numpy.empty((outer.shape[0], inner.shape[0], 4))
tot = numpy.empty((outer.shape[0], inner.shape[0], 5))
tot[:, :, :2] = outer[:, None, :2] + xy
tot[:, :, 2:] = outer[:, None, 2:] + inner[None, :, 2:] # sum rotations and mirrored
tot[:, :, 2] %= 2 * pi # clamp rot
tot[:, :, 3] %= 2 # clamp mirrored
# If mirrored, flip inner rotation
mirrored_outer = outer[:, None, 3].astype(bool)
rotations = outer[:, None, 2] + numpy.where(mirrored_outer, -inner[None, :, 2], inner[None, :, 2])
tot[:, :, 2] = rotations % (2 * pi)
tot[:, :, 3] = (outer[:, None, 3] + inner[None, :, 3]) % 2 # net mirrored
tot[:, :, 4] = outer[:, None, 4] * inner[None, :, 4] # net scale
if tensor:
return tot
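The vectorized einsum above composes every outer/inner pair at once; the per-pair arithmetic is easier to follow scalar-by-scalar. `compose` below is a hypothetical scalar analogue of `apply_transforms()` for one `(x, y, rot_ccw, mirror_x, scale)` pair:

```python
import numpy

def rotation_matrix_2d(theta):
    return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
                        [numpy.sin(theta),  numpy.cos(theta)]])

def compose(outer, inner):
    """Apply one outer transform to one inner transform: mirror, scale,
    and rotate the inner offset into the outer frame, then combine
    rotations (negated under mirroring), mirror flags (mod 2), and
    scales (multiplied)."""
    ox, oy, orot, omir, oscale = outer
    ix, iy, irot, imir, iscale = inner
    if omir:
        iy = -iy                                      # mirrored outer flips inner y offset
    xy = rotation_matrix_2d(orot) @ (numpy.array([ix, iy]) * oscale)
    rot = (orot + (-irot if omir else irot)) % (2 * numpy.pi)
    return (ox + xy[0], oy + xy[1], rot, (omir + imir) % 2, oscale * iscale)
```

For example, an inner ref at (1, 0) placed inside an outer ref at (1, 0) rotated 90° lands at (1, 1) with a net 90° rotation.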
