Compare commits

...

301 Commits

Author SHA1 Message Date
jan
94a1b3d793 cleanup comment 2024-10-14 17:25:01 -07:00
jan
7c7a7e916c Fix offset handling in polygon normalized_form() 2024-10-14 17:24:49 -07:00
73193473df Fixup arclength calculation for wedges (or other thick arcs) 2024-10-05 11:24:40 -07:00
febaaeff0b add Library functions for finding instances and extracting hierarchy
added child_graph, parent_graph, child_order, find_refs_local and find_refs_global
2024-10-04 17:21:31 -07:00
a54ee5a26c bump klamath req 2024-08-01 00:41:01 -07:00
8d671ed709 bump version to v3.2
Highlights:
- Pather.path_into() for connecting into existing ports
- Pattern.plugged() for removing ports which were manually pathed into
  each other.
- Defined ordering/comparisons to enable sorting patterns and shapes
- numpy 2.0 compatibility
- Fix bounds calculation for arrays with manhattan rotations
- Bugfixes for DXF and OASIS
- Speed improvement for default Library.get_name() and GDS writing
2024-07-29 21:00:40 -07:00
a816a7db8e allow numpy v2.0 2024-07-29 18:24:31 -07:00
a8a42bba1d speed up b64suffix by using a simple array lookup instead of base64.b64encode 2024-07-29 11:30:31 -07:00
da7118f521 misc cleanup 2024-07-29 03:13:36 -07:00
ef6c5df386 be more consistent about when copies are made 2024-07-29 03:13:23 -07:00
ad0adec8e8 numpy.array(..., copy=False) -> numpy.asarray(...)
For numpy 2.0
2024-07-29 02:37:48 -07:00
8fd6896a71 set stacklevel=1 2024-07-28 20:41:17 -07:00
1ae3ffb9a2 linter cleanup 2024-07-28 20:35:37 -07:00
810a09f18b simplify comparison 2024-07-28 20:34:25 -07:00
97688ffae1 don't want to use context manager here 2024-07-28 20:32:55 -07:00
445c5690e1 use path.open 2024-07-28 20:32:55 -07:00
7e1f617274 fix bug where use_mmap was ignored 2024-07-28 20:32:53 -07:00
b10803efe9 pass along string arg 2024-07-28 20:31:41 -07:00
5f0a450ffa no need for string annotation 2024-07-28 20:31:41 -07:00
aa3636ebc6 flatten indent 2024-07-28 20:31:41 -07:00
48ffc9709e double quotes for docstrings 2024-07-28 20:31:41 -07:00
5cdafd580f don't need ABCMeta here 2024-07-28 20:31:41 -07:00
2cf187fdb8 def-in-loop needs assignments for vars 2024-07-28 20:31:41 -07:00
99e55f931c refactor to single-line conditional assignments 2024-07-28 20:31:41 -07:00
c48b427c77 mark some missing annotations as intentional 2024-07-28 20:31:41 -07:00
62fc64c344 iteration and collection simplifications 2024-07-28 20:31:41 -07:00
f304217d76 format is ok 2024-07-28 20:31:41 -07:00
ae21a2132e handle int-based cell references 2024-07-28 20:31:41 -07:00
e159c80b0c improve error generation and handling 2024-07-28 20:31:41 -07:00
38e9d5c250 use strict zip 2024-07-28 20:31:41 -07:00
5614eea3b4 Update DXF reading 2024-07-28 20:31:41 -07:00
8035daee7e mark intentionally unused args 2024-07-28 20:31:41 -07:00
4c69e773fd pass kwargs down into gen_straight() 2024-07-28 20:31:41 -07:00
39d9b88fa4 flatten indentation where it makes sense 2024-07-28 20:31:29 -07:00
9d5b1ef5e6 type annotation updates 2024-07-28 19:44:04 -07:00
3d50ff0070 add ruff config 2024-07-28 19:37:57 -07:00
01fe53dc79 fix final assignment and clarify what's going on 2024-07-28 19:37:20 -07:00
d5adf57bc6 fix repr outside of class 2024-07-28 19:35:44 -07:00
4c721feaec re-exports: import x as x 2024-07-28 19:34:17 -07:00
6ec94fb3c3 import Sequence et al from collections.abc not typing 2024-07-28 19:33:16 -07:00
b1d78b9acb mkdir examples/layouts/ 2024-07-28 19:28:26 -07:00
dca918e63f notes for more todos 2024-07-28 19:28:05 -07:00
cda895a7d3 remove Builder.path() to avoid confusion with Pather.path() 2024-06-03 17:09:43 -07:00
jan
6db4bb96db Create an ordering for everything
In order to make layouts more reproducible
Also add pattern.sort() and file.utils.preflight_check()

optionally don't sort elements
elements aren't re-ordered that often, sorting them is slow, and the
sort criteria are arbitrary, so we might want to only sort stuff by name
2024-06-03 17:00:20 -07:00
jan
94aa853a49 add plugged() for manually-aligned ports 2024-06-03 16:57:07 -07:00
bb054b9eee port .copy() should deepcopy 2024-06-03 16:54:25 -07:00
5fb736eb74 add a more descriptive error message 2024-06-03 16:54:15 -07:00
4334d0d50b fix bounds calculation for arrays with manhattan rotation 2024-06-03 16:54:02 -07:00
31863c9799 reduce compression level to improve speed 2024-06-03 16:53:14 -07:00
30982d742b make sure kwargs get passed into gen_straight() 2024-06-03 16:53:03 -07:00
447d4ba35b improve path_into docs and error messages 2024-06-03 16:52:34 -07:00
70a51ed8ef path_into should use destination port's ptype by default 2024-06-03 16:26:12 -07:00
jan
b33c632569 cache base64encode calls since it's actually fairly slow 2024-03-09 18:38:29 -08:00
c115780bc7 bump version to v3.1 2024-03-30 18:02:40 -07:00
66d9a4eff8 add note about github mirror 2024-03-30 18:01:14 -07:00
3a0c49174b improve variable naming 2024-03-30 18:01:14 -07:00
8d122cbd2e add path_into() 2024-03-30 18:01:08 -07:00
383b5a0bef add plug_into arg 2023-11-24 23:55:39 -08:00
jan
24c77fd3c3 remove custom __copy__
no longer necessary now that we're not locking anything
2023-11-18 12:29:36 -08:00
jan
33529f5ed3 pattern shouldn't have an offset 2023-11-18 12:28:51 -08:00
jan
2516f06e40 add missing returns 2023-11-18 12:28:33 -08:00
1f6d78386c pass kwargs down into tool's path() calls 2023-11-12 02:30:11 -08:00
41d670eef3 Add missing f for f-strings 2023-11-12 02:29:52 -08:00
7f927c46b3 another arc fix 2023-10-27 23:31:22 -07:00
55e3066485 Wrap Pattern functions for label, ref, polygon, etc. 2023-10-27 21:59:48 -07:00
c7736a18c3 add missing arc endpoints 2023-10-27 21:55:17 -07:00
aefd79fb5d Pattern should be a forward reference 2023-10-23 10:24:49 -07:00
jan
7353617878 add .x and .y aliases for .offset 2023-10-20 23:19:28 -07:00
jan
f28c31fe29 = should have been + 2023-10-20 23:16:39 -07:00
jan
8ef5e2e852 improve docs 2023-10-20 23:16:02 -07:00
jan
ed433861e3 make sure transform is float-typed 2023-10-20 23:15:38 -07:00
jan
e710fa44b5 improve type annotations 2023-10-20 23:15:13 -07:00
jan
9a7a5583ed Add Tree/TreeView and allow Builder to ingest them 2023-10-20 23:14:47 -07:00
jan
b4d31903c1 update required python version 2023-10-15 23:55:41 -07:00
jan
d6ab8a1f34 Bump version to v3.0. Note that MAJOR BREAKING CHANGES were introduced almost everywhere in this version -- see the readme to understand how everything works now. 2023-10-15 23:12:33 -07:00
jan
83e82db5da doc typo 2023-10-15 23:10:58 -07:00
jan
73ce794fec import pack2d by default 2023-10-15 23:07:37 -07:00
jan
3a6807707b Add more docs 2023-10-15 23:07:28 -07:00
jan
1bdb998085 Generalize underscore into SINGLE_USE_PREFIX 2023-10-15 23:01:47 -07:00
jan
668d4b5d8b docstring updates 2023-10-15 18:31:58 -07:00
jan
2229ee5d25 surface BasicTool and PathTool at top level 2023-10-15 16:24:20 -07:00
jan
6ba44e375b remove todo 2023-10-15 16:21:51 -07:00
jan
f12f14e087 Add RenderPather tutorial, tutorial README, and some minor doc updates 2023-10-15 16:18:34 -07:00
jan
ef3bec01ce Replicate routing using paths 2023-10-15 16:18:34 -07:00
jan
5f5c78455b Add missing final vertex when the path ends in a bend 2023-10-15 16:18:34 -07:00
jan
1c7b0ce5e1 Start working on a pather tutorial 2023-10-15 16:18:34 -07:00
jan
8c14401788 add Library.map_layers 2023-10-15 16:18:34 -07:00
jan
4de82ab2ba fix transition calculation 2023-10-15 16:18:34 -07:00
jan
5a6826f8e5 stop taking in base_name -- tools can set their own cell names 2023-10-15 16:18:34 -07:00
jan
bfd81f777c Cleanup based on flake8 lint 2023-10-15 16:18:34 -07:00
jan
dec084818a some further work on Tool interface 2023-10-15 16:18:34 -07:00
jan
590b6b36bd No need for Builder 2023-10-15 16:18:34 -07:00
jan
80e0c5daa8 path() should return a tree 2023-10-15 16:18:34 -07:00
jan
5001664547 doc updates 2023-10-15 16:18:34 -07:00
jan
333b21ecf4 more design pattern docs 2023-10-15 16:18:34 -07:00
jan
0aa4a6ee7a doc updates 2023-10-15 16:18:34 -07:00
jan
fa7a850ec3 Add some notes on shorthand 2023-10-15 16:18:34 -07:00
jan
621f8420f8 comment grammar 2023-10-15 16:18:34 -07:00
jan
a3b356ac14 save new name on a separate line, for debugging convenience 2023-10-15 16:18:34 -07:00
jan
2f9c7e61ee add <= operator for library (returns an Abstract) 2023-10-15 16:18:34 -07:00
jan
3245de99b3 Add NoReturn __contains__ with a more descriptive error message 2023-10-15 16:18:34 -07:00
jan
c02c2f90ef add mkport() for safely making ports 2023-10-15 16:18:34 -07:00
jan
772e42ebf1 references to Pattern should be forward references 2023-10-15 16:18:34 -07:00
jan
8d2d1ffd50 Allow Pattern.ref() to take an Abstract 2023-10-15 16:18:34 -07:00
jan
ceaa4923ef fix broken import 2023-10-15 16:18:34 -07:00
jan
f40c74adb5 improve docs and variable names 2023-10-15 16:18:34 -07:00
jan
9de382b856 Fix major bugs in presort 2023-10-15 16:18:34 -07:00
169e5a1f12 Lots of doc updates 2023-10-15 16:18:34 -07:00
d79a0a6388 get rid of Pather.mk() 2023-10-15 16:18:34 -07:00
6975787717 remove unused import 2023-10-15 16:18:34 -07:00
c4ff53a0ba fix isinstance call arg order 2023-10-15 16:18:34 -07:00
3415a16cd1 Give a more explicit error message 2023-10-15 16:18:34 -07:00
0ea3b6625f add missing end condition 2023-10-15 16:18:34 -07:00
272cfb7e48 fix arclength calculations giving invalid values or non-integral steps 2023-10-15 16:18:34 -07:00
8fe7b14f4b repr updates 2023-10-15 16:18:34 -07:00
086d07a82d Add the option to use explicit x= or y= in path_to 2023-10-15 16:18:33 -07:00
d02ea400a0 Move plug/place/interface to Pattern
Since Pattern has ports already, these should live in Pattern and get
wrapped elsewhere. Builder becomes a context-holder (holding .library
and .dead) and some code duplication goes away.
2023-10-15 16:18:33 -07:00
4bca0e2638 clean some old code 2023-10-15 16:18:33 -07:00
33377df883 add notes about ports 2023-10-15 16:18:33 -07:00
jan
63e8f0b10e fix old variable name 2023-10-15 16:18:33 -07:00
jan
99f3b0871a missing import 2023-10-15 16:18:33 -07:00
jan
d5608786ea Remove more mentions of AutoSlots 2023-10-15 16:18:33 -07:00
jan
6866d44021 simplify imports and use new approach 2023-10-15 16:18:33 -07:00
jan
a2cc94794e don't need to deepcopy twice 2023-10-15 16:18:33 -07:00
jan
c2008f2719 Improve arc arclength estimation (untested) 2023-10-15 16:18:33 -07:00
jan
e2c7f8c8cc various doc updates 2023-10-15 16:18:33 -07:00
04e15f7c85 use retstep instead of subtracting 2023-10-15 16:18:33 -07:00
a5ddfc76ca speed up get_bounds when called on a manhattan ref 2023-10-15 16:18:33 -07:00
0c0012def0 find_ptransform -> find_port_transform 2023-10-15 16:18:33 -07:00
468322ceb9 add has_ports() 2023-10-15 16:18:33 -07:00
d4bb466ad9 add mutate_other arg 2023-10-15 16:18:33 -07:00
e6ff6daa32 move __repr__ higher 2023-10-15 16:18:33 -07:00
f7f5a62f54 Update comments 2023-10-15 16:18:33 -07:00
e47f9b76b1 remove TODO labels from mypy #3004 comments 2023-10-15 16:18:33 -07:00
b872e19dec Improve arclength calculation for elliptical arcs 2023-10-15 16:18:33 -07:00
jan
efac8efa90 update some examples 2023-10-15 16:18:33 -07:00
31d97d8df0 add retool() 2023-10-15 16:18:33 -07:00
3b2be804e2 Only remove existing ports 2023-10-15 16:18:33 -07:00
b443a2a41e add prune_layers and prune_refs 2023-10-15 16:18:33 -07:00
064c3803ed fix comment 2023-10-15 16:18:33 -07:00
jan
0618be91d4 delete some old code 2023-10-15 16:18:33 -07:00
jan
c55d95505c improve accuracy of manhattan rotations 2023-10-15 16:18:33 -07:00
jan
97ccd8c303 fix missing tools prop 2023-10-15 16:18:33 -07:00
jan
df4c867e5c fix bounds 2023-10-15 16:18:33 -07:00
jan
24fc97e7f5 update readme 2023-10-15 16:18:33 -07:00
jan
91465b7175 don't keep track of y-mirroring separately from x 2023-10-15 16:18:33 -07:00
jan
9bc8d29b85 renderbuilder fixes 2023-10-15 16:18:33 -07:00
jan
9a28e1617c renderpather, get_bounds includes repetitions, Boundable 2023-10-15 16:18:33 -07:00
jan
22e1c6ae1d fix bounds 2023-10-15 16:18:33 -07:00
jan
87be06dcbe pattern copy should be deep 2023-10-15 16:18:33 -07:00
jan
bbc61a2fcd wrong func name 2023-10-15 16:18:33 -07:00
jan
e3c7150e18 missing import 2023-10-15 16:18:33 -07:00
jan
976ca0a2da missing parens 2023-10-15 16:18:33 -07:00
jan
723d856915 repetitions affect bounds 2023-10-15 16:18:33 -07:00
jan
079250e665 wip get_bounds 2023-10-15 16:18:33 -07:00
jan
8959101162 faster get_bounds for manhattan refs 2023-10-15 16:18:33 -07:00
jan
234264c0af Make rotation matrix immutable and cache the value 2023-10-15 16:18:33 -07:00
jan
93ab0a942d misc fixes 2023-10-15 16:18:33 -07:00
jan
9a077ea2df move to dicty layers and targets 2023-10-15 16:18:33 -07:00
jan
6b240de268 delete FlatBuilder (Builder subsumes it) 2023-10-15 16:18:33 -07:00
jan
3028ea0941 pather fixes / type updates 2023-10-15 16:18:33 -07:00
jan
5f24ceb13f add RenderPather 2023-10-15 16:18:33 -07:00
jan
75821c4ff9 comment 2023-10-15 16:18:33 -07:00
jan
2ed868ec25 split out find_ptransform (static version, only need ports) 2023-10-15 16:18:33 -07:00
jan
cbe5c07f8f add todo about underscore 2023-10-15 16:18:33 -07:00
jan
b13d7286e5 shorten labels 2023-10-15 16:18:33 -07:00
jan
de0d35d3d7 cleanup 2023-10-15 16:18:33 -07:00
jan
1008b6aabd split pather into its own file 2023-10-15 16:18:33 -07:00
jan
bb3caf1ad7 comment updates 2023-10-15 16:18:33 -07:00
jan
c5c31a5f0f only mutable variant should have rename_top 2023-10-15 16:18:33 -07:00
jan
08291da167 fixes 2023-10-15 16:18:33 -07:00
jan
68318a1382 add functions for dealing with the topcell and its name 2023-10-15 16:18:33 -07:00
jan
31cf0047e7 add mktree 2023-10-15 16:18:33 -07:00
jan
f0a71bfb8b redo library class naming 2023-10-15 16:18:33 -07:00
jan
a07446808a should be union; we want to exclude dangling refs 2023-10-15 16:18:33 -07:00
jan
340fe7f656 fixes to subtree and lshift, as well as some cast() improvements 2023-10-15 16:18:33 -07:00
jan
45265faec4 oneshot available at toplevel 2023-10-15 16:18:33 -07:00
jan
46a7f60460 add @oneshot decorator 2023-10-15 16:18:33 -07:00
jan
d7e89ef5c8 lshift operator shouldn't special-case trees
Instead, just call .tops() if there are multiple cells, and fail if
there are multiple tops
2023-10-15 16:18:33 -07:00
jan
0efd9afd16 find_toplevel -> tops 2023-10-15 16:18:33 -07:00
jan
64413f69d4 create no longer exists. Make mk() give similar ordering as mkpat() 2023-10-15 16:18:33 -07:00
jan
37e4c03547 fix return value 2023-10-15 16:18:33 -07:00
jan
94691dac85 top is always a string 2023-10-15 16:18:33 -07:00
jan
4eee4d19e9 cleanup 2023-10-15 16:18:33 -07:00
jan
cbfbdf66a1 get rid of NamedPattern in favor of just returning a tuple 2023-10-15 16:18:33 -07:00
9115371b19 Drop ports when repeating 2023-10-15 16:18:33 -07:00
cf634f1c16 port translation is already handled in Pattern 2023-10-15 16:18:33 -07:00
09291e58f7 drop ability to use python-gdsii 2023-10-15 16:18:33 -07:00
ea2eaa4603 fix rounding 2023-10-15 16:18:33 -07:00
b744a11e8e str(namedpattern) should just return its name 2023-10-15 16:18:33 -07:00
f54193edf0 updates to Pattern.polygonize() 2023-10-15 16:18:33 -07:00
59c8f47f4d update to newer ezdxf 2023-10-15 16:18:33 -07:00
e5ed28a854 Need to check against self, since we may add new conflicts as we go 2023-10-15 16:18:33 -07:00
b4f36417fd Pipe-operator does not support forward references 2023-10-15 16:18:33 -07:00
45081c2d31 add polygon() and label() convenience methods 2023-10-15 16:18:33 -07:00
4482ede3a7 use Self type 2023-10-15 16:18:33 -07:00
1463535676 modernize type annotations 2023-10-15 16:18:33 -07:00
ada8c591a0 fix error message 2023-10-15 16:18:33 -07:00
9d42df831e remove per-shape polygonization state 2023-10-15 16:18:33 -07:00
7befe89af3 fixes based on mypy 2023-10-15 16:18:33 -07:00
f766a3ad64 add prune_empty and delete() 2023-10-15 16:18:33 -07:00
85a2eb6acc fixes/updates 2023-10-15 16:18:33 -07:00
069dde3648 Drop ports by default 2023-10-15 16:18:33 -07:00
e0939049dd force 'wb' mode for gzipfile 2023-10-15 16:18:33 -07:00
88adc08259 data_to_ports max_depth default to 0
Makes it more compatible with LazyLibrary -- with recursive approach, we
have to load all the subcells to run ports2data, but those subcells may
or may not exist (e.g. partial library, or maybe we've removed some
duplicates-to-be prior to merging with a different lib)
2023-10-15 16:18:33 -07:00
4ab718d578 pass along library for bounds 2023-10-15 16:18:33 -07:00
f834aaee47 fix precache 2023-10-15 16:18:33 -07:00
27d87a988d redo library merging 2023-10-15 16:18:33 -07:00
6f97f7e6db pass along tools 2023-10-15 16:18:33 -07:00
d0f76d150f Make default quiet for underscores 2023-10-15 16:18:33 -07:00
5ffcadb362 always apply postprocess 2023-10-15 16:18:33 -07:00
2ccef554db misc fixes 2023-10-15 16:18:33 -07:00
d349aa3366 Revert "allow ports2data to take a tree"
This reverts commit 44f823c736.
LazyLibrary can't take Trees anymore, so no need for it.
2023-10-15 16:18:33 -07:00
680da46f5c LazyLibrary should not contain Trees
altering itself during iteration is not a good idea
2023-10-15 16:18:33 -07:00
59a986546c missing import 2023-10-15 16:18:33 -07:00
19ac45a4f4 fix type for __contains__ 2023-10-15 16:18:33 -07:00
db7a98bb0f allow ports2data to take a tree 2023-10-15 16:18:33 -07:00
8687badac5 misc fixes 2023-10-15 16:18:33 -07:00
4a6584a60a Only allow 1-sized Libraries 2023-10-15 16:18:33 -07:00
4a94259249 Allow lshift to operate on any library. If only one name, return it, else None 2023-10-15 16:18:33 -07:00
7cc732248e add missing functions to tree 2023-10-15 16:18:33 -07:00
98728521fd add Pather.mk() 2023-10-15 16:18:33 -07:00
460222ce6e add name arg 2023-10-15 16:18:33 -07:00
f1a380b170 pather reorganization/cleanup 2023-10-15 16:18:33 -07:00
38585e5a9e add lshift operator to MutableLibrary 2023-10-15 16:18:33 -07:00
2449486a28 set default for library to None 2023-10-15 16:18:33 -07:00
4fc2e67b62 Turn Builder into a subset of Pather 2023-10-15 16:18:32 -07:00
039320d180 fix add_tree operator 2023-10-15 16:18:32 -07:00
853c20e8df Allow LazyLibrary to store Trees as well? 2023-10-15 16:18:32 -07:00
f642c226c7 Use lshift for tree combination 2023-10-15 16:18:32 -07:00
103eb4f1f8 stringy type 2023-10-15 16:18:32 -07:00
abc721cf67 ergonomics 2023-10-15 16:18:32 -07:00
d8e789f179 Add Tree as a possible way to allow construction of whole subtrees at once 2023-10-15 16:18:32 -07:00
234557dc93 Add move_references() and auto-move references during add()-with-rename
Also remove enable_cache, since we now rely on the cache.
2023-10-15 16:18:32 -07:00
439d5914e0 implement auto-renaming during merge, and change _merge() to support it 2023-10-15 16:18:32 -07:00
jan
ac9776628a remove some trailing underscores 2023-10-15 16:18:32 -07:00
ab8fd9b351 add NamedPattern 2023-10-15 16:18:32 -07:00
1a9116cdbe add .create() 2023-10-15 16:18:32 -07:00
e348267a3d notes on organization 2023-10-15 16:18:32 -07:00
7a8a3ef3c7 note in comments 2023-10-15 16:18:32 -07:00
f8b5cec340 Add recurse arg to get_bounds 2023-10-15 16:18:32 -07:00
1598582865 remove log messages 2023-10-15 16:18:32 -07:00
42ee4db989 Return WrapLibrary from read() and readfile() 2023-10-15 16:18:32 -07:00
a35bf9770a Default to adding ports at the origin 2023-10-15 16:18:32 -07:00
5c48a28661 some cleanup 2023-10-15 16:18:32 -07:00
a8da0fc429 add FlatBuilder 2023-10-15 16:18:32 -07:00
cb87543e0c import ports2data at top level 2023-10-15 16:18:32 -07:00
e5029ae21d add library .rename(...) 2023-10-15 16:18:32 -07:00
0172b7488e missing comma 2023-10-15 16:18:32 -07:00
d44374bf1f writefile should write to a temporary file first 2023-10-15 16:18:32 -07:00
5fcd31a719 add name_and_set 2023-10-15 16:18:32 -07:00
2940316c48 add missing comments 2023-10-15 16:18:32 -07:00
c0e4ee1b6b Allow library __setitem__ to take in either Pattern or Callable
No longer need it to be Generic!
2023-10-15 16:18:32 -07:00
963918d1d9 various fixes and cleanup
mainly involving ports_to_data and data_to_ports
2023-10-15 16:18:32 -07:00
16567c8a66 move builder.port_utils into utils.ports2data
and rename functions
2023-10-15 16:18:32 -07:00
5452bc5608 more fixes and improvements 2023-10-15 16:18:32 -07:00
d9fe295f4f get things working with a LazyLibrary hack while we think about cycles 2023-10-15 16:18:32 -07:00
f4537a0feb Lots of progress on tutorials 2023-10-15 16:18:32 -07:00
c31d7dfa2c Add note about reproducibility for DXF 2023-10-15 16:18:32 -07:00
61b381cfaa remove dead code 2023-10-15 16:18:32 -07:00
cca7cbaae1 formatting 2023-10-15 16:18:32 -07:00
b75c8de0c4 lots of fixes to get test_rep running 2023-10-15 16:18:32 -07:00
92f7fce6ff improve gzipped file reproducibility
Mostly avoid writing the old filename and modification time to the gzip
header
2023-10-15 16:18:32 -07:00
ea87418bf5 clarify comment 2023-10-15 16:18:32 -07:00
e812c69bfb get rid of Mapping stuff on PortsList 2023-10-15 16:18:32 -07:00
71db073a54 add todos 2023-10-15 16:18:32 -07:00
a6cb276468 add AbstractView 2023-10-15 16:18:32 -07:00
090e86644a Move Abstract into its own file 2023-10-15 16:18:32 -07:00
a1073eca6b handle library=None 2023-10-15 16:18:32 -07:00
0368cf7a00 library can generate abstracts 2023-10-15 16:18:32 -07:00
e288c3f5e0 B becomes BB for searchability 2023-10-15 16:18:32 -07:00
3b8866732b PortsRef -> Abstract 2023-10-15 16:18:32 -07:00
6b01b43559 flake8-aided fixes 2023-10-15 16:18:32 -07:00
db9b39dbc0 fix more type issues 2023-10-15 16:18:32 -07:00
557c6c98dc more wip -- most central stuff is first pass done 2023-10-15 16:18:32 -07:00
6549faddbb wip -- more fixes 2023-10-15 16:18:32 -07:00
jan
9efb6f0eeb wip 2023-10-15 16:18:32 -07:00
jan
d9ae8dd6e3 wip 2023-10-15 16:18:32 -07:00
f7902fa517 busL -> mpath 2023-10-15 16:18:32 -07:00
fbbc1d5cc7 comment out some ipython commands 2023-10-15 16:18:32 -07:00
2635c6c20c some type updates 2023-10-15 16:18:32 -07:00
c7f3e7ee52 Remove support for dose
Since there isn't GDS/OASIS level support for dose, this can be mostly
handled by using arbitrary layers/dtypes directly. Dose scaling isn't
handled as nicely that way, but it corresponds more directly to what
gets written to file.
2023-10-15 16:18:32 -07:00
f7a2edfe23 fix some type-related issues 2023-10-15 16:18:32 -07:00
a0ca53f57a get rid of "identifier" 2023-10-15 16:18:32 -07:00
jan
7ca017d993 wip again 2023-10-15 16:18:32 -07:00
jan
db9a6269a1 delete duplicate utils submodule 2023-10-15 16:18:32 -07:00
6f696bfc71 partial work on device libraries 2023-10-15 16:18:32 -07:00
f7b8f2db0c various fixes 2023-10-15 16:18:32 -07:00
e3511ed852 remove duplicate __delitem__ 2023-10-15 16:18:32 -07:00
a4f89e6f48 improve docs 2023-10-15 16:18:32 -07:00
5f35e8c8e3 indirect type spec for Pattern 2023-10-15 16:18:32 -07:00
jan
52f0b4aa93 Add lib types 2023-10-15 16:18:32 -07:00
jan
c95b2f4c0d bifurcate Device into DeviceRef 2023-10-15 16:18:32 -07:00
7e1371c14d add notes about what is hard 2023-10-15 16:18:32 -07:00
e932687210 make error message prettier 2023-10-15 16:18:32 -07:00
jan
7aaf73cb37 WIP: make libraries and names first-class! 2023-10-15 16:18:32 -07:00
f834ec6be5 Avoid generating a container if only a single port is passed 2023-10-15 16:18:32 -07:00
885b259fb7 allow bounds to be passed as args 2023-10-15 16:18:32 -07:00
3f986957ac allow passing a single Tool to be used as the default 2023-10-15 16:18:32 -07:00
1c3c032434 Add functionality for building paths (single use wires/waveguides/etc) 2023-10-15 16:18:32 -07:00
afcbd315ae Fix extra vertex added during OASIS loading 2023-01-24 14:14:10 -08:00
71 changed files with 9663 additions and 5903 deletions

230
README.md

@@ -3,43 +3,226 @@
Masque is a Python module for designing lithography masks.
The general idea is to implement something resembling the GDSII file-format, but
with some vectorized element types (eg. circles, not just polygons), better support for
E-beam doses, and the ability to output to multiple formats.
with some vectorized element types (eg. circles, not just polygons) and the ability
to output to multiple formats.
- [Source repository](https://mpxd.net/code/jan/masque)
- [PyPI](https://pypi.org/project/masque)
- [Github mirror](https://github.com/anewusername/masque)
## Installation
Requirements:
* python >= 3.8
* python >= 3.11
* numpy
* klamath (used for `gdsii` i/o and library management)
* matplotlib (optional, used for `visualization` functions and `text`)
* ezdxf (optional, used for `dxf` i/o)
* fatamorgana (optional, used for `oasis` i/o)
* svgwrite (optional, used for `svg` output)
* freetype (optional, used for `text`)
* klamath (used for GDSII i/o)
Optional requirements:
* `ezdxf` (DXF i/o): ezdxf
* `oasis` (OASIS i/o): fatamorgana
* `svg` (SVG output): svgwrite
* `visualization` (shape plotting): matplotlib
* `text` (`Text` shape): matplotlib, freetype
Install with pip:
```bash
pip3 install 'masque[visualization,oasis,dxf,svg,text]'
pip install 'masque[oasis,dxf,svg,visualization,text]'
```
Alternatively, install from git
```bash
pip3 install git+https://mpxd.net/code/jan/masque.git@release
```
## Overview
A layout consists of a hierarchy of `Pattern`s stored in a single `Library`.
Each `Pattern` can contain `Ref`s pointing at other patterns, `Shape`s, `Label`s, and `Port`s.
`masque` departs from several "classic" GDSII paradigms:
- A `Pattern` object does not store its own name. A name is only assigned when the pattern is placed
into a `Library`, which is effectively a name->`Pattern` mapping.
- Layer info for `Shape`s and `Label`s is not stored in the individual shape and label objects.
Instead, the layer is determined by the key for the container dict (e.g. `pattern.shapes[layer]`); see the sketch after this list.
* This simplifies many common tasks: filtering `Shape`s by layer, remapping layers, and checking if
a layer is empty.
* Technically, this allows reusing the same shape or label object across multiple layers. This isn't
part of the standard workflow since a mixture of single-use and multi-use shapes could be confusing.
* This is similar to the approach used in [KLayout](https://www.klayout.de)
- `Ref` target names are also determined by the key of the container dict (e.g. `pattern.refs[target_name]`).
* This similarly simplifies filtering `Ref`s by target name, updating to a new target, and checking
if a given `Pattern` is referenced.
- `Pattern` names are set by their containing `Library` and are not stored in the `Pattern` objects.
* This guarantees that there are no duplicate pattern names within any given `Library`.
* Likewise, enumerating all the names (and all the `Pattern`s) in a `Library` is straightforward.
- Each `Ref`, `Shape`, or `Label` can be repeated multiple times by attaching a `repetition` object to it.
* This is similar to how OASIS repetitions are handled, and provides extra flexibility over the GDSII
approach of only allowing arrays through AREF (`Ref` + `repetition`).
- `Label`s do not have an orientation or presentation
* This is in line with how they are used in practice, and how they are represented in OASIS.
- Non-polygonal `Shape`s are allowed. For example, elliptical arcs are a basic shape type.
* This enables compatibility with OASIS (e.g. circles) and other formats.
* `Shape`s provide a `.to_polygons()` method for GDSII compatibility.
- Most coordinate values are stored as 64-bit floats internally.
* One Earth radius in nanometers (6e15) is still represented without approximation (53-bit mantissa -> 2^53 > 9e15)
* Values that would otherwise clip or round are still represented approximately.
* Memory usage is usually dominated by other Python overhead.
- `Pattern` objects also contain `Port` information, which can be used to "snap" together
multiple sub-components by matching up the requested port offsets and rotations.
* Port rotations are defined as counter-clockwise angles from the +x axis.
* Ports point into the interior of their associated device.
* Port rotations may be `None` in the case of non-oriented ports.
* Ports have a `ptype` string which is compared in order to catch mismatched connections at build time.
* Ports can be exported into/imported from `Label`s stored directly in the layout,
editable from standard tools (e.g. KLayout). A default format is provided.
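As a minimal sketch of the dict-keyed storage described in the list above (constructor signatures follow the examples shown later in this diff; treat exact call details as approximate):
```python3
from masque import Library, Pattern, Ref, Polygon

lib = Library()      # effectively a name -> Pattern mapping
square = Pattern()   # note: the Pattern itself never stores a name

# The layer is the dict key, not an attribute of the shape:
square.shapes[(1, 0)].append(
    Polygon(vertices=[(0, 0), (0, 10), (10, 10), (10, 0)])
)
lib['square'] = square   # the name is assigned here, by the Library

top = Pattern()
top.refs['square'].append(Ref(offset=(20, 0)))   # refs are keyed by target *name*
lib['top'] = top

# Filtering by layer or by target is plain dict access:
shapes_on_1_0 = lib['square'].shapes[(1, 0)]
refs_to_square = lib['top'].refs['square']
```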
In one important way, `masque` stays very orthodox:
References are accomplished by listing the target's name, not its `Pattern` object.
- The main downside of this is that any operations that traverse the hierarchy require
both the `Pattern` and the `Library` which contains its reference targets.
- This guarantees that names within a `Library` remain unique at all times.
* Since this can be tedious in cases where you don't actually care about the name of a
pattern, patterns whose names start with `SINGLE_USE_PREFIX` (default: an underscore)
may be silently renamed in order to maintain uniqueness.
See `masque.library.SINGLE_USE_PREFIX`, `masque.library._rename_patterns()`,
and `ILibrary.add()` for more details.
- Having all patterns accessible through the `Library` avoids having to perform a
tree traversal for every operation which needs to touch all `Pattern` objects
(e.g. deleting a layer everywhere or scaling all patterns).
- Since `Pattern` doesn't know its own name, you can't create a reference by passing in
a `Pattern` object -- you need to know its name.
- You *can* reference a `Pattern` before it is created, so long as you have already decided
on its name (see the sketch below).
- Functions like `Pattern.place()` and `Pattern.plug()` need to receive a pattern's name
in order to create a reference, but they also need to access the pattern's ports.
* One way to provide this data is through an `Abstract`, generated via
`Library.abstract()` or through a `Library.abstract_view()`.
* Another way is to use `Builder.place()` or `Builder.plug()`, which automatically creates
an `Abstract` from its internally-referenced `Library`.
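A short sketch of the name-based referencing (and the silent renaming of `SINGLE_USE_PREFIX` names) described above; it reuses the `<<` and `.ref()` shorthand introduced in the sections below, and the auto-rename outcome is assumed from the description rather than verified:
```python3
from masque import Library, Pattern, Ref

library = Library()
top = Pattern()
library['top'] = top

# A Ref stores only the target's *name*, so 'pad' can be referenced before it exists...
top.refs['pad'].append(Ref(offset=(0, 0)))

# ...as long as a Pattern named 'pad' is added before the layout is written out.
library['pad'] = Pattern()

# Names starting with SINGLE_USE_PREFIX (default '_') may be silently renamed on add,
# so both of these succeed and end up with distinct names:
name_a = library << {'_fill': Pattern()}
name_b = library << {'_fill': Pattern()}
assert name_a != name_b
top.ref(name_a, offset=(50, 0))   # reference whichever name was actually assigned
```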
## Glossary
- `Library`: A collection of named cells. OASIS or GDS "library" or file.
- `Tree`: Any `{name: pattern}` mapping which has only one topcell.
- `Pattern`: A collection of geometry, text labels, and references to other patterns.
OASIS or GDS "Cell", DXF "Block".
- `Ref`: A reference to another pattern. GDS "AREF/SREF", OASIS "Placement".
- `Shape`: Individual geometric entity. OASIS or GDS "Geometry element", DXF "LWPolyline" or "Polyline".
- `repetition`: Repetition operation. OASIS "repetition".
GDS "AREF" is a `Ref` combined with a `Grid` repetition.
- `Label`: Text label. Not rendered into geometry. OASIS, GDS, DXF "Text".
- `annotation`: Additional metadata. OASIS or GDS "property".
## Syntax, shorthand, and design patterns
Most syntax and behavior should follow normal python conventions.
There are a few exceptions, either meant to catch common mistakes or to provide a shorthand for common operations:
### `Library` objects don't allow overwriting already-existing patterns
```python3
library['mycell'] = pattern0
library['mycell'] = pattern1 # Error! 'mycell' already exists and can't be overwritten
del library['mycell'] # We can explicitly delete it
library['mycell'] = pattern1 # And now it's ok to assign a new value
library.delete('mycell') # This also deletes all refs pointing to 'mycell' by default
```
## Translation
- `Pattern`: OASIS or GDS "Cell", DXF "Block"
- `SubPattern`: GDS "AREF/SREF", OASIS "Placement"
- `Shape`: OASIS or GDS "Geometry element", DXF "LWPolyline" or "Polyline"
- `repetition`: OASIS "repetition". GDS "AREF" is a `SubPattern` combined with a `Grid` repetition.
- `Label`: OASIS, GDS, DXF "Text".
- `annotation`: OASIS or GDS "property"
### Insert a newly-made hierarchical pattern (with children) into a layout
```python3
# Let's say we have a function which returns a new library containing one topcell (and possibly children)
tree = make_tree(...)
# To reference this cell in our layout, we have to add all its children to our `library` first:
top_name = tree.top() # get the name of the topcell
name_mapping = library.add(tree) # add all patterns from `tree`, renaming eligible conflicting patterns
new_name = name_mapping.get(top_name, top_name) # get the new name for the cell (in case it was auto-renamed)
my_pattern.ref(new_name, ...) # instantiate the cell
# This can be accomplished as follows
new_name = library << tree # Add `tree` into `library` and return the top cell's new name
my_pattern.ref(new_name, ...) # instantiate the cell
# In practice, you may do lots of
my_pattern.ref(lib << make_tree(...), ...)
# With a `Builder` and `place()`/`plug()` the `lib <<` portion can be implicit:
my_builder = Builder(library=lib, ...)
...
my_builder.place(make_tree(...))
```
We can also use this shorthand to quickly add and reference a single flat (as yet un-named) pattern:
```python3
anonymous_pattern = Pattern(...)
my_pattern.ref(lib << {'_tentative_name': anonymous_pattern}, ...)
```
### Place a hierarchical pattern into a layout, preserving its port info
```python3
# As above, we have a function that makes a new library containing one topcell (and possibly children)
tree = make_tree(...)
# We need to go get its port info to `place()` it into our existing layout,
new_name = library << tree # Add the tree to the library and return its name (see `<<` above)
abstract = library.abstract(tree) # An `Abstract` stores a pattern's name and its ports (but no geometry)
my_pattern.place(abstract, ...)
# With shorthand,
abstract = library <= tree
my_pattern.place(abstract, ...)
# or
my_pattern.place(library << make_tree(...), ...)
```
### Quickly add geometry, labels, or refs:
The long form for adding elements can be overly verbose:
```python3
my_pattern.shapes[layer].append(Polygon(vertices, ...))
my_pattern.labels[layer] += [Label('my text')]
my_pattern.refs[target_name].append(Ref(offset=..., ...))
```
There is shorthand for the most common elements:
```python3
my_pattern.polygon(layer=layer, vertices=vertices, ...)
my_pattern.rect(layer=layer, xctr=..., xmin=..., ymax=..., ly=...) # rectangle; pick 4 of 6 constraints
my_pattern.rect(layer=layer, ymin=..., ymax=..., xctr=..., lx=...)
my_pattern.path(...)
my_pattern.label(layer, 'my_text')
my_pattern.ref(target_name, offset=..., ...)
```
### Accessing ports
```python3
# Square brackets pull from the underlying `.ports` dict:
assert pattern['input'] is pattern.ports['input']
# And you can use them to read multiple ports at once:
assert pattern[('input', 'output')] == {
'input': pattern.ports['input'],
'output': pattern.ports['output'],
}
# But you shouldn't use them for anything except reading
pattern['input'] = Port(...) # Error!
has_input = ('input' in pattern) # Error!
```
### Building patterns
```python3
library = Library(...)
my_pattern_name, my_pattern = library.mkpat(some_name_generator())
...
def _make_my_subpattern() -> str:
# This function can draw from the outer scope (e.g. `library`) but will not pollute the outer scope
# (e.g. the variable `subpattern` will not be accessible from outside the function; you must load it
# from within `library`).
subpattern_name, subpattern = library.mkpat(...)
subpattern.rect(...)
...
return subpattern_name
my_pattern.ref(_make_my_subpattern(), offset=..., ...)
```
## TODO
@@ -47,5 +230,8 @@ pip3 install git+https://mpxd.net/code/jan/masque.git@release
* Better interface for polygon operations (e.g. with `pyclipper`)
- de-embedding
- boolean ops
* Construct polygons from bitmap using `skimage.find_contours`
* Deal with shape repetitions for dxf, svg
* Tests tests tests
* check renderpather
* pather and renderpather examples
* context manager for retool
* allow a specific mismatch when connecting ports


@@ -2,29 +2,33 @@
import numpy
import masque
import masque.file.klamath
from masque import shapes
from masque.file import gdsii
from masque import Arc, Pattern
def main():
pat = masque.Pattern(name='ellip_grating')
for rmin in numpy.arange(10, 15, 0.5):
pat.shapes.append(shapes.Arc(
pat = Pattern()
layer = (0, 0)
pat.shapes[layer].extend([
Arc(
radii=(rmin, rmin),
width=0.1,
angles=(-numpy.pi/4, numpy.pi/4),
layer=(0, 0),
))
)
for rmin in numpy.arange(10, 15, 0.5)]
)
pat.labels.append(masque.Label(string='grating centerline', offset=(1, 0), layer=(1, 2)))
pat.label(string='grating centerline', offset=(1, 0), layer=(1, 2))
pat.scale_by(1000)
pat.visualize()
pat2 = pat.copy()
pat2.name = 'grating2'
masque.file.klamath.writefile((pat, pat2), 'out.gds.gz', 1e-9, 1e-3)
lib = {
'ellip_grating': pat,
'grating2': pat.copy(),
}
gdsii.writefile(lib, 'out.gds.gz', meters_per_unit=1e-9, logical_units_per_unit=1e-3)
if __name__ == '__main__':


@@ -0,0 +1,29 @@
import numpy
from pyclipper import (
Pyclipper, PT_CLIP, PT_SUBJECT, CT_UNION, CT_INTERSECTION, PFT_NONZERO,
scale_to_clipper, scale_from_clipper,
)
p = Pyclipper()
p.AddPaths([
[(-10, -10), (-10, 10), (-9, 10), (-9, -10)],
[(-10, 10), (10, 10), (10, 9), (-10, 9)],
[(10, 10), (10, -10), (9, -10), (9, 10)],
[(10, -10), (-10, -10), (-10, -9), (10, -9)],
], PT_SUBJECT, closed=True)
#p.Execute2?
#p.Execute?
p.Execute(PT_UNION, PT_NONZERO, PT_NONZERO)
p.Execute(CT_UNION, PT_NONZERO, PT_NONZERO)
p.Execute(CT_UNION, PFT_NONZERO, PFT_NONZERO)
p = Pyclipper()
p.AddPaths([
[(-10, -10), (-10, 10), (-9, 10), (-9, -10)],
[(-10, 10), (10, 10), (10, 9), (-10, 9)],
[(10, 10), (10, -10), (9, -10), (9, 10)],
[(10, -10), (-10, -10), (-10, -9), (10, -9)],
], PT_SUBJECT, closed=True)
r = p.Execute2(CT_UNION, PFT_NONZERO, PFT_NONZERO)
#r.Childs

43
examples/pic2mask.py Normal file

@@ -0,0 +1,43 @@
# pip install pillow scikit-image
# or
# sudo apt install python3-pil python3-skimage
from PIL import Image
from skimage.measure import find_contours
from matplotlib import pyplot
import numpy
from masque import Pattern, Polygon
from masque.file.gdsii import writefile
#
# Read the image into a numpy array
#
im = Image.open('./Desktop/Camera/IMG_20220626_091101.jpg')
aa = numpy.array(im.convert(mode='L').getdata()).reshape(im.height, im.width)
threshold = (aa.max() - aa.min()) / 2
#
# Find edge contours and plot them
#
contours = find_contours(aa, threshold)
pyplot.imshow(aa)
for contour in contours:
pyplot.plot(contour[:, 1], contour[:, 0], linewidth=2)
pyplot.show(block=False)
#
# Create the layout from the contours
#
pat = Pattern()
pat.shapes[(0, 0)].extend([
Polygon(vertices=vv) for vv in contours if len(vv) < 1_000
])
lib = {}
lib['my_mask_name'] = pat
writefile(lib, 'test_contours.gds', meters_per_unit=1e-9)


@@ -1,103 +1,138 @@
from pprint import pprint
from pathlib import Path
import numpy
from numpy import pi
import masque
import masque.file.gdsii
import masque.file.klamath
import masque.file.dxf
import masque.file.oasis
from masque import shapes, Pattern, SubPattern
from masque import Pattern, Ref, Arc, Library
from masque.repetition import Grid
from masque.file import gdsii, dxf, oasis
from pprint import pprint
def main():
pat = masque.Pattern(name='ellip_grating')
lib = Library()
cell_name = 'ellip_grating'
pat = masque.Pattern()
layer = (0, 0)
for rmin in numpy.arange(10, 15, 0.5):
pat.shapes.append(shapes.Arc(
pat.shapes[layer].append(Arc(
radii=(rmin, rmin),
width=0.1,
angles=(0*-numpy.pi/4, numpy.pi/4),
angles=(0 * -pi/4, pi/4),
annotations={'1': ['blah']},
))
pat.scale_by(1000)
# pat.visualize()
pat2 = pat.copy()
pat2.name = 'grating2'
pat3 = Pattern('sref_test')
pat3.subpatterns = [
SubPattern(pat, offset=(1e5, 3e5), annotations={'4': ['Hello I am the base subpattern']}),
SubPattern(pat, offset=(2e5, 3e5), rotation=pi/3),
SubPattern(pat, offset=(3e5, 3e5), rotation=pi/2),
SubPattern(pat, offset=(4e5, 3e5), rotation=pi),
SubPattern(pat, offset=(5e5, 3e5), rotation=3*pi/2),
SubPattern(pat, mirrored=(True, False), offset=(1e5, 4e5)),
SubPattern(pat, mirrored=(True, False), offset=(2e5, 4e5), rotation=pi/3),
SubPattern(pat, mirrored=(True, False), offset=(3e5, 4e5), rotation=pi/2),
SubPattern(pat, mirrored=(True, False), offset=(4e5, 4e5), rotation=pi),
SubPattern(pat, mirrored=(True, False), offset=(5e5, 4e5), rotation=3*pi/2),
SubPattern(pat, mirrored=(False, True), offset=(1e5, 5e5)),
SubPattern(pat, mirrored=(False, True), offset=(2e5, 5e5), rotation=pi/3),
SubPattern(pat, mirrored=(False, True), offset=(3e5, 5e5), rotation=pi/2),
SubPattern(pat, mirrored=(False, True), offset=(4e5, 5e5), rotation=pi),
SubPattern(pat, mirrored=(False, True), offset=(5e5, 5e5), rotation=3*pi/2),
SubPattern(pat, mirrored=(True, True), offset=(1e5, 6e5)),
SubPattern(pat, mirrored=(True, True), offset=(2e5, 6e5), rotation=pi/3),
SubPattern(pat, mirrored=(True, True), offset=(3e5, 6e5), rotation=pi/2),
SubPattern(pat, mirrored=(True, True), offset=(4e5, 6e5), rotation=pi),
SubPattern(pat, mirrored=(True, True), offset=(5e5, 6e5), rotation=3*pi/2),
]
pprint(pat3)
pprint(pat3.subpatterns)
lib[cell_name] = pat
print(f'\nAdded {cell_name}:')
pprint(pat.shapes)
rep = Grid(a_vector=[1e4, 0],
b_vector=[0, 1.5e4],
a_count=3,
b_count=2,)
pat4 = Pattern('aref_test')
pat4.subpatterns = [
SubPattern(pat, repetition=rep, offset=(1e5, 3e5)),
SubPattern(pat, repetition=rep, offset=(2e5, 3e5), rotation=pi/3),
SubPattern(pat, repetition=rep, offset=(3e5, 3e5), rotation=pi/2),
SubPattern(pat, repetition=rep, offset=(4e5, 3e5), rotation=pi),
SubPattern(pat, repetition=rep, offset=(5e5, 3e5), rotation=3*pi/2),
SubPattern(pat, repetition=rep, mirrored=(True, False), offset=(1e5, 4e5)),
SubPattern(pat, repetition=rep, mirrored=(True, False), offset=(2e5, 4e5), rotation=pi/3),
SubPattern(pat, repetition=rep, mirrored=(True, False), offset=(3e5, 4e5), rotation=pi/2),
SubPattern(pat, repetition=rep, mirrored=(True, False), offset=(4e5, 4e5), rotation=pi),
SubPattern(pat, repetition=rep, mirrored=(True, False), offset=(5e5, 4e5), rotation=3*pi/2),
SubPattern(pat, repetition=rep, mirrored=(False, True), offset=(1e5, 5e5)),
SubPattern(pat, repetition=rep, mirrored=(False, True), offset=(2e5, 5e5), rotation=pi/3),
SubPattern(pat, repetition=rep, mirrored=(False, True), offset=(3e5, 5e5), rotation=pi/2),
SubPattern(pat, repetition=rep, mirrored=(False, True), offset=(4e5, 5e5), rotation=pi),
SubPattern(pat, repetition=rep, mirrored=(False, True), offset=(5e5, 5e5), rotation=3*pi/2),
SubPattern(pat, repetition=rep, mirrored=(True, True), offset=(1e5, 6e5)),
SubPattern(pat, repetition=rep, mirrored=(True, True), offset=(2e5, 6e5), rotation=pi/3),
SubPattern(pat, repetition=rep, mirrored=(True, True), offset=(3e5, 6e5), rotation=pi/2),
SubPattern(pat, repetition=rep, mirrored=(True, True), offset=(4e5, 6e5), rotation=pi),
SubPattern(pat, repetition=rep, mirrored=(True, True), offset=(5e5, 6e5), rotation=3*pi/2),
new_name = lib.get_name(cell_name)
lib[new_name] = pat.copy()
print(f'\nAdded a copy of {cell_name} as {new_name}')
pat3 = Pattern()
pat3.refs[cell_name] = [
Ref(offset=(1e5, 3e5), annotations={'4': ['Hello I am the base Ref']}),
Ref(offset=(2e5, 3e5), rotation=pi/3),
Ref(offset=(3e5, 3e5), rotation=pi/2),
Ref(offset=(4e5, 3e5), rotation=pi),
Ref(offset=(5e5, 3e5), rotation=3*pi/2),
Ref(mirrored=True, offset=(1e5, 4e5)),
Ref(mirrored=True, offset=(2e5, 4e5), rotation=pi/3),
Ref(mirrored=True, offset=(3e5, 4e5), rotation=pi/2),
Ref(mirrored=True, offset=(4e5, 4e5), rotation=pi),
Ref(mirrored=True, offset=(5e5, 4e5), rotation=3*pi/2),
Ref(offset=(1e5, 5e5)).mirror_target(1),
Ref(offset=(2e5, 5e5), rotation=pi/3).mirror_target(1),
Ref(offset=(3e5, 5e5), rotation=pi/2).mirror_target(1),
Ref(offset=(4e5, 5e5), rotation=pi).mirror_target(1),
Ref(offset=(5e5, 5e5), rotation=3*pi/2).mirror_target(1),
Ref(offset=(1e5, 6e5)).mirror2d_target(True, True),
Ref(offset=(2e5, 6e5), rotation=pi/3).mirror2d_target(True, True),
Ref(offset=(3e5, 6e5), rotation=pi/2).mirror2d_target(True, True),
Ref(offset=(4e5, 6e5), rotation=pi).mirror2d_target(True, True),
Ref(offset=(5e5, 6e5), rotation=3*pi/2).mirror2d_target(True, True),
]
folder = 'layouts/'
masque.file.klamath.writefile((pat, pat2, pat3, pat4), folder + 'rep.gds.gz', 1e-9, 1e-3)
lib['sref_test'] = pat3
print('\nAdded sref_test:')
pprint(pat3)
pprint(pat3.refs)
cells = list(masque.file.klamath.readfile(folder + 'rep.gds.gz')[0].values())
masque.file.klamath.writefile(cells, folder + 'rerep.gds.gz', 1e-9, 1e-3)
rep = Grid(
a_vector=[1e4, 0],
b_vector=[0, 1.5e4],
a_count=3,
b_count=2,
)
pat4 = Pattern()
pat4.refs[cell_name] = [
Ref(repetition=rep, offset=(1e5, 3e5)),
Ref(repetition=rep, offset=(2e5, 3e5), rotation=pi/3),
Ref(repetition=rep, offset=(3e5, 3e5), rotation=pi/2),
Ref(repetition=rep, offset=(4e5, 3e5), rotation=pi),
Ref(repetition=rep, offset=(5e5, 3e5), rotation=3*pi/2),
Ref(repetition=rep, mirrored=True, offset=(1e5, 4e5)),
Ref(repetition=rep, mirrored=True, offset=(2e5, 4e5), rotation=pi/3),
Ref(repetition=rep, mirrored=True, offset=(3e5, 4e5), rotation=pi/2),
Ref(repetition=rep, mirrored=True, offset=(4e5, 4e5), rotation=pi),
Ref(repetition=rep, mirrored=True, offset=(5e5, 4e5), rotation=3*pi/2),
Ref(repetition=rep, offset=(1e5, 5e5)).mirror_target(1),
Ref(repetition=rep, offset=(2e5, 5e5), rotation=pi/3).mirror_target(1),
Ref(repetition=rep, offset=(3e5, 5e5), rotation=pi/2).mirror_target(1),
Ref(repetition=rep, offset=(4e5, 5e5), rotation=pi).mirror_target(1),
Ref(repetition=rep, offset=(5e5, 5e5), rotation=3*pi/2).mirror_target(1),
Ref(repetition=rep, offset=(1e5, 6e5)).mirror2d_target(True, True),
Ref(repetition=rep, offset=(2e5, 6e5), rotation=pi/3).mirror2d_target(True, True),
Ref(repetition=rep, offset=(3e5, 6e5), rotation=pi/2).mirror2d_target(True, True),
Ref(repetition=rep, offset=(4e5, 6e5), rotation=pi).mirror2d_target(True, True),
Ref(repetition=rep, offset=(5e5, 6e5), rotation=3*pi/2).mirror2d_target(True, True),
]
masque.file.dxf.writefile(pat4, folder + 'rep.dxf.gz')
dxf, info = masque.file.dxf.readfile(folder + 'rep.dxf.gz')
masque.file.dxf.writefile(dxf, folder + 'rerep.dxf.gz')
lib['aref_test'] = pat4
print('\nAdded aref_test')
folder = Path('./layouts/')
folder.mkdir(exist_ok=True)
print(f'...writing files to {folder}...')
gds1 = folder / 'rep.gds.gz'
gds2 = folder / 'rerep.gds.gz'
print(f'Initial write to {gds1}')
gdsii.writefile(lib, gds1, 1e-9, 1e-3)
print(f'Read back and rewrite to {gds2}')
readback_lib, _info = gdsii.readfile(gds1)
gdsii.writefile(readback_lib, gds2, 1e-9, 1e-3)
dxf1 = folder / 'rep.dxf.gz'
dxf2 = folder / 'rerep.dxf.gz'
print(f'Write aref_test to {dxf1}')
dxf.writefile(lib, 'aref_test', dxf1)
print(f'Read back and rewrite to {dxf2}')
dxf_lib, _info = dxf.readfile(dxf1)
print(Library(dxf_lib))
dxf.writefile(dxf_lib, 'Model', dxf2)
layer_map = {'base': (0,0), 'mylabel': (1,2)}
masque.file.oasis.writefile((pat, pat2, pat3, pat4), folder + 'rep.oas.gz', 1000, layer_map=layer_map)
oas, info = masque.file.oasis.readfile(folder + 'rep.oas.gz')
masque.file.oasis.writefile(list(oas.values()), folder + 'rerep.oas.gz', 1000, layer_map=layer_map)
print(info)
oas1 = folder / 'rep.oas'
oas2 = folder / 'rerep.oas'
print(f'Write lib to {oas1}')
oasis.writefile(lib, oas1, 1000, layer_map=layer_map)
print(f'Read back and rewrite to {oas2}')
oas_lib, oas_info = oasis.readfile(oas1)
oasis.writefile(oas_lib, oas2, 1000, layer_map=layer_map)
print('OASIS info:')
pprint(oas_info)
if __name__ == '__main__':


@@ -0,0 +1,39 @@
masque Tutorial
===============
Contents
--------
- [basic_shapes](basic_shapes.py):
* Draw basic geometry
* Export to GDS
- [devices](devices.py)
* Reference other patterns
* Add ports to a pattern
* Snap ports together to build a circuit
* Check for dangling references
- [library](library.py)
* Create a `LazyLibrary`, which loads / generates patterns only when they are first used
* Explore alternate ways of specifying a pattern for `.plug()` and `.place()`
* Design a pattern which is meant to plug into an existing pattern (via `.interface()`)
- [pather](pather.py)
* Use `Pather` to route individual wires and wire bundles
* Use `BasicTool` to generate paths
* Use `BasicTool` to automatically transition between path types
- [renderpather](rendpather.py)
* Use `RenderPather` and `PathTool` to build a layout similar to the one in [pather](pather.py),
but using `Path` shapes instead of `Polygon`s.
Additionally, [pcgen](pcgen.py) is a utility module for generating photonic crystal lattices.
Running
-------
Run from inside the examples directory:
```bash
cd examples/tutorial
python3 basic_shapes.py
klayout -e basic_shapes.gds
```


@@ -1,21 +1,21 @@
from typing import Tuple, Sequence
from collections.abc import Sequence
import numpy
from numpy import pi
from masque import layer_t, Pattern, SubPattern, Label
from masque.shapes import Circle, Arc, Polygon
from masque.builder import Device, Port
from masque.library import Library, DeviceLibrary
from masque import (
layer_t, Pattern, Label, Port,
Circle, Arc, Polygon,
)
import masque.file.gdsii
# Note that masque units are arbitrary, and are only given
# physical significance when writing to a file.
GDS_OPTS = {
'meters_per_unit': 1e-9, # GDS database unit, 1 nanometer
'logical_units_per_unit': 1e-3, # GDS display unit, 1 micron
}
GDS_OPTS = dict(
meters_per_unit = 1e-9, # GDS database unit, 1 nanometer
logical_units_per_unit = 1e-3, # GDS display unit, 1 micron
)
def hole(
@@ -30,11 +30,12 @@ def hole(
layer: Layer to draw the circle on.
Returns:
Pattern, named `'hole'`
Pattern containing a circle.
"""
pat = Pattern('hole', shapes=[
Circle(radius=radius, offset=(0, 0), layer=layer)
])
pat = Pattern()
pat.shapes[layer].append(
Circle(radius=radius, offset=(0, 0))
)
return pat
@@ -50,7 +51,7 @@ def triangle(
layer: Layer to draw the circle on.
Returns:
Pattern, named `'triangle'`
Pattern containing a triangle
"""
vertices = numpy.array([
(numpy.cos( pi / 2), numpy.sin( pi / 2)),
@@ -58,8 +59,9 @@
(numpy.cos( - pi / 6), numpy.sin( - pi / 6)),
]) * radius
pat = Pattern('triangle', shapes=[
Polygon(offset=(0, 0), layer=layer, vertices=vertices),
pat = Pattern()
pat.shapes[layer].extend([
Polygon(offset=(0, 0), vertices=vertices),
])
return pat
@@ -78,37 +80,40 @@ def smile(
secondary_layer: Layer to draw eyes and smile on.
Returns:
Pattern, named `'smile'`
Pattern containing a smiley face
"""
# Make an empty pattern
pat = Pattern('smile')
pat = Pattern()
# Add all the shapes we want
pat.shapes += [
Circle(radius=radius, offset=(0, 0), layer=layer), # Outer circle
Circle(radius=radius / 10, offset=(radius / 3, radius / 3), layer=secondary_layer),
Circle(radius=radius / 10, offset=(-radius / 3, radius / 3), layer=secondary_layer),
Arc(radii=(radius * 2 / 3, radius * 2 / 3), # Underlying ellipse radii
pat.shapes[layer] += [
Circle(radius=radius, offset=(0, 0)), # Outer circle
]
pat.shapes[secondary_layer] += [
Circle(radius=radius / 10, offset=(radius / 3, radius / 3)),
Circle(radius=radius / 10, offset=(-radius / 3, radius / 3)),
Arc(
radii=(radius * 2 / 3, radius * 2 / 3), # Underlying ellipse radii
angles=(7 / 6 * pi, 11 / 6 * pi), # Angles limiting the arc
width=radius / 10,
offset=(0, 0),
layer=secondary_layer),
),
]
return pat
def main() -> None:
hole_pat = hole(1000)
smile_pat = smile(1000)
tri_pat = triangle(1000)
lib = {}
units_per_meter = 1e-9
units_per_display_unit = 1e-3
lib['hole'] = hole(1000)
lib['smile'] = smile(1000)
lib['triangle'] = triangle(1000)
masque.file.gdsii.writefile([hole_pat, tri_pat, smile_pat], 'basic_shapes.gds', **GDS_OPTS)
masque.file.gdsii.writefile(lib, 'basic_shapes.gds', **GDS_OPTS)
smile_pat.visualize()
lib['triangle'].visualize()
if __name__ == '__main__':


@@ -1,12 +1,14 @@
from typing import Tuple, Sequence, Dict
from collections.abc import Sequence, Mapping
import numpy
from numpy import pi
from masque import layer_t, Pattern, SubPattern, Label
from masque.shapes import Polygon
from masque.builder import Device, Port, port_utils
from masque.file.gdsii import writefile
from masque import (
layer_t, Pattern, Ref, Label, Builder, Port, Polygon,
Library, ILibraryView,
)
from masque.utils import ports2data
from masque.file.gdsii import writefile, check_valid_names
import pcgen
import basic_shapes
@@ -17,40 +19,41 @@ LATTICE_CONSTANT = 512
RADIUS = LATTICE_CONSTANT / 2 * 0.75
def dev2pat(dev: Device) -> Pattern:
def ports_to_data(pat: Pattern) -> Pattern:
"""
Bake port information into the device.
Bake port information into the pattern.
This places a label at each port location on layer (3, 0) with text content
'name:ptype angle_deg'
"""
return port_utils.dev2pat(dev, layer=(3, 0))
return ports2data.ports_to_data(pat, layer=(3, 0))
def pat2dev(pat: Pattern) -> Device:
def data_to_ports(lib: Mapping[str, Pattern], name: str, pat: Pattern) -> Pattern:
"""
Scans the Pattern to determine port locations. Same format as `dev2pat`
Scan the Pattern to determine port locations. Same port format as `ports_to_data`
"""
return port_utils.pat2dev(pat, layers=[(3, 0)])
return ports2data.data_to_ports(layers=[(3, 0)], library=lib, pattern=pat, name=name)
def perturbed_l3(
lattice_constant: float,
hole: Pattern,
trench_dose: float = 1.0,
hole: str,
hole_lib: Mapping[str, Pattern],
trench_layer: layer_t = (1, 0),
shifts_a: Sequence[float] = (0.15, 0, 0.075),
shifts_r: Sequence[float] = (1.0, 1.0, 1.0),
xy_size: Tuple[int, int] = (10, 10),
xy_size: tuple[int, int] = (10, 10),
perturbed_radius: float = 1.1,
trench_width: float = 1200,
) -> Device:
) -> Pattern:
"""
Generate a `Device` representing a perturbed L3 cavity.
Generate a `Pattern` representing a perturbed L3 cavity.
Args:
lattice_constant: Distance between nearest neighbor holes
hole: `Pattern` object containing a single hole
trench_dose: Dose for the trenches. Default 1.0. (Hole dose is 1.0.)
hole: name of a `Pattern` containing a single hole
hole_lib: Library which contains the `Pattern` object for hole.
Necessary because we need to know how big it is...
trench_layer: Layer for the trenches, default `(1, 0)`.
shifts_a: passed to `pcgen.l3_shift`; specifies lattice constant
(1 - multiplicative factor) for shifting holes adjacent to
@@ -66,8 +69,10 @@ def perturbed_l3(
trench width: Width of the undercut trenches. Default 1200.
Returns:
`Device` object representing the L3 design.
`Pattern` object representing the L3 design.
"""
print('Generating perturbed L3...')
# Get hole positions and radii
xyr = pcgen.l3_shift_perturbed_defect(mirror_dims=xy_size,
perturbed_radius=perturbed_radius,
@@ -75,188 +80,206 @@ def perturbed_l3(
shifts_r=shifts_r)
# Build L3 cavity, using references to the provided hole pattern
pat = Pattern(f'L3p-a{lattice_constant:g}rp{perturbed_radius:g}')
pat.subpatterns += [
SubPattern(hole, scale=r,
offset=(lattice_constant * x,
pat = Pattern()
pat.refs[hole] += [
Ref(scale=r, offset=(lattice_constant * x,
lattice_constant * y))
for x, y, r in xyr]
# Add rectangular undercut aids
min_xy, max_xy = pat.get_bounds_nonempty()
min_xy, max_xy = pat.get_bounds_nonempty(hole_lib)
trench_dx = max_xy[0] - min_xy[0]
pat.shapes += [
Polygon.rect(ymin=max_xy[1], xmin=min_xy[0], lx=trench_dx, ly=trench_width,
layer=trench_layer, dose=trench_dose),
Polygon.rect(ymax=min_xy[1], xmin=min_xy[0], lx=trench_dx, ly=trench_width,
layer=trench_layer, dose=trench_dose),
pat.shapes[trench_layer] += [
Polygon.rect(ymin=max_xy[1], xmin=min_xy[0], lx=trench_dx, ly=trench_width),
Polygon.rect(ymax=min_xy[1], xmin=min_xy[0], lx=trench_dx, ly=trench_width),
]
# Ports are at outer extents of the device (with y=0)
extent = lattice_constant * xy_size[0]
ports = {
'input': Port((-extent, 0), rotation=0, ptype='pcwg'),
'output': Port((extent, 0), rotation=pi, ptype='pcwg'),
}
pat.ports = dict(
input=Port((-extent, 0), rotation=0, ptype='pcwg'),
output=Port((extent, 0), rotation=pi, ptype='pcwg'),
)
return Device(pat, ports)
ports_to_data(pat)
return pat
def waveguide(
lattice_constant: float,
hole: Pattern,
hole: str,
length: int,
mirror_periods: int,
) -> Device:
) -> Pattern:
"""
Generate a `Device` representing a photonic crystal line-defect waveguide.
Generate a `Pattern` representing a photonic crystal line-defect waveguide.
Args:
lattice_constant: Distance between nearest neighbor holes
hole: `Pattern` object containing a single hole
hole: name of a `Pattern` containing a single hole
length: Distance (number of mirror periods) between the input and output ports.
Ports are placed at lattice sites.
mirror_periods: Number of hole rows on each side of the line defect
Returns:
`Device` object representing the waveguide.
`Pattern` object representing the waveguide.
"""
# Generate hole locations
xy = pcgen.waveguide(length=length, num_mirror=mirror_periods)
# Build the pattern
pat = Pattern(f'_wg-a{lattice_constant:g}l{length}')
pat.subpatterns += [SubPattern(hole, offset=(lattice_constant * x,
pat = Pattern()
pat.refs[hole] += [
Ref(offset=(lattice_constant * x,
lattice_constant * y))
for x, y in xy]
# Ports are at outer edges, with y=0
extent = lattice_constant * length / 2
ports = {
'left': Port((-extent, 0), rotation=0, ptype='pcwg'),
'right': Port((extent, 0), rotation=pi, ptype='pcwg'),
}
return Device(pat, ports)
pat.ports = dict(
left=Port((-extent, 0), rotation=0, ptype='pcwg'),
right=Port((extent, 0), rotation=pi, ptype='pcwg'),
)
ports_to_data(pat)
return pat
def bend(
lattice_constant: float,
hole: Pattern,
hole: str,
mirror_periods: int,
) -> Device:
) -> Pattern:
"""
Generate a `Device` representing a 60-degree counterclockwise bend in a photonic crystal
Generate a `Pattern` representing a 60-degree counterclockwise bend in a photonic crystal
line-defect waveguide.
Args:
lattice_constant: Distance between nearest neighbor holes
hole: `Pattern` object containing a single hole
hole: name of a `Pattern` containing a single hole
mirror_periods: Minimum number of mirror periods on each side of the line defect.
Returns:
`Device` object representing the waveguide bend.
`Pattern` object representing the waveguide bend.
Ports are named 'left' (input) and 'right' (output).
"""
# Generate hole locations
xy = pcgen.wgbend(num_mirror=mirror_periods)
# Build the pattern
pat= Pattern(f'_wgbend-a{lattice_constant:g}l{mirror_periods}')
pat.subpatterns += [
SubPattern(hole, offset=(lattice_constant * x,
pat = Pattern()
pat.refs[hole] += [
Ref(offset=(lattice_constant * x,
lattice_constant * y))
for x, y in xy]
# Figure out port locations.
extent = lattice_constant * mirror_periods
ports = {
'left': Port((-extent, 0), rotation=0, ptype='pcwg'),
'right': Port((extent / 2,
pat.ports = dict(
left=Port((-extent, 0), rotation=0, ptype='pcwg'),
right=Port((extent / 2,
extent * numpy.sqrt(3) / 2),
rotation=pi * 4 / 3, ptype='pcwg'),
}
return Device(pat, ports)
)
ports_to_data(pat)
return pat
def y_splitter(
lattice_constant: float,
hole: Pattern,
hole: str,
mirror_periods: int,
) -> Device:
) -> Pattern:
"""
Generate a `Device` representing a photonic crystal line-defect waveguide y-splitter.
Generate a `Pattern` representing a photonic crystal line-defect waveguide y-splitter.
Args:
lattice_constant: Distance between nearest neighbor holes
hole: `Pattern` object containing a single hole
hole: name of a `Pattern` containing a single hole
mirror_periods: Minimum number of mirror periods on each side of the line defect.
Returns:
`Device` object representing the y-splitter.
`Pattern` object representing the y-splitter.
Ports are named 'in', 'top', and 'bottom'.
"""
# Generate hole locations
xy = pcgen.y_splitter(num_mirror=mirror_periods)
# Build pattern
pat = Pattern(f'_wgsplit_half-a{lattice_constant:g}l{mirror_periods}')
pat.subpatterns += [
SubPattern(hole, offset=(lattice_constant * x,
pat = Pattern()
pat.refs[hole] += [
Ref(offset=(lattice_constant * x,
lattice_constant * y))
for x, y in xy]
# Determine port locations
extent = lattice_constant * mirror_periods
ports = {
pat.ports = {
'in': Port((-extent, 0), rotation=0, ptype='pcwg'),
'top': Port((extent / 2, extent * numpy.sqrt(3) / 2), rotation=pi * 4 / 3, ptype='pcwg'),
'bot': Port((extent / 2, -extent * numpy.sqrt(3) / 2), rotation=pi * 2 / 3, ptype='pcwg'),
}
return Device(pat, ports)
ports_to_data(pat)
return pat
def main(interactive: bool = True):
def main(interactive: bool = True) -> None:
# Generate some basic hole patterns
smile = basic_shapes.smile(RADIUS)
hole = basic_shapes.hole(RADIUS)
shape_lib = {
'smile': basic_shapes.smile(RADIUS),
'hole': basic_shapes.hole(RADIUS),
}
# Build some devices
a = LATTICE_CONSTANT
wg10 = waveguide(lattice_constant=a, hole=hole, length=10, mirror_periods=5).rename('wg10')
wg05 = waveguide(lattice_constant=a, hole=hole, length=5, mirror_periods=5).rename('wg05')
wg28 = waveguide(lattice_constant=a, hole=hole, length=28, mirror_periods=5).rename('wg28')
bend0 = bend(lattice_constant=a, hole=hole, mirror_periods=5).rename('bend0')
ysplit = y_splitter(lattice_constant=a, hole=hole, mirror_periods=5).rename('ysplit')
l3cav = perturbed_l3(lattice_constant=a, hole=smile, xy_size=(4, 10)).rename('l3cav') # uses smile :)
# Autogenerate port labels so that GDS will also contain port data
for device in [wg10, wg05, wg28, l3cav, ysplit, bend0]:
dev2pat(device)
devices = {}
devices['wg05'] = waveguide(lattice_constant=a, hole='hole', length=5, mirror_periods=5)
devices['wg10'] = waveguide(lattice_constant=a, hole='hole', length=10, mirror_periods=5)
devices['wg28'] = waveguide(lattice_constant=a, hole='hole', length=28, mirror_periods=5)
devices['wg90'] = waveguide(lattice_constant=a, hole='hole', length=90, mirror_periods=5)
devices['bend0'] = bend(lattice_constant=a, hole='hole', mirror_periods=5)
devices['ysplit'] = y_splitter(lattice_constant=a, hole='hole', mirror_periods=5)
devices['l3cav'] = perturbed_l3(lattice_constant=a, hole='smile', hole_lib=shape_lib, xy_size=(4, 10)) # uses smile :)
# Turn our dict of devices into a Library.
# This provides some convenience functions later on!
lib = Library(devices)
#
# Build a circuit
#
circ = Device(name='my_circuit', ports={})
# Create a `Builder`, and add the circuit to our library as "my_circuit".
circ = Builder(library=lib, name='my_circuit')
# Start by placing a waveguide. Call its ports "in" and "signal".
circ.place(wg10, offset=(0, 0), port_map={'left': 'in', 'right': 'signal'})
circ.place('wg10', offset=(0, 0), port_map={'left': 'in', 'right': 'signal'})
# Extend the signal path by attaching the "left" port of a waveguide.
# Since there is only one other port ("right") on the waveguide we
# are attaching (wg10), it automatically inherits the name "signal".
circ.plug(wg10, {'signal': 'left'})
circ.plug('wg10', {'signal': 'left'})
# We could have done the following instead:
# circ_pat = Pattern()
# lib['my_circuit'] = circ_pat
# circ_pat.place(lib.abstract('wg10'), ...)
# circ_pat.plug(lib.abstract('wg10'), ...)
# but `Builder` lets us omit some of the repetition of `lib.abstract(...)`, and uses similar
# syntax to `Pather` and `RenderPather`, which add wire/waveguide routing functionality.
# Attach a y-splitter to the signal path.
# Since the y-splitter has 3 ports total, we can't auto-inherit the
# port name, so we have to specify what we want to name the unattached
# ports. We can call them "signal1" and "signal2".
circ.plug(ysplit, {'signal': 'in'}, {'top': 'signal1', 'bot': 'signal2'})
circ.plug('ysplit', {'signal': 'in'}, {'top': 'signal1', 'bot': 'signal2'})
# Add a waveguide to both signal ports, inheriting their names.
circ.plug(wg05, {'signal1': 'left'})
circ.plug(wg05, {'signal2': 'left'})
circ.plug('wg05', {'signal1': 'left'})
circ.plug('wg05', {'signal2': 'left'})
# Add a bend to both ports.
# Our bend's ports "left" and "right" refer to the original counterclockwise
@ -265,22 +288,22 @@ def main(interactive: bool = True):
# to "signal2" to bend counterclockwise.
# We could also use `mirrored=(True, False)` to mirror one of the devices
# and then use same device port on both paths.
circ.plug(bend0, {'signal1': 'right'})
circ.plug(bend0, {'signal2': 'left'})
circ.plug('bend0', {'signal1': 'right'})
circ.plug('bend0', {'signal2': 'left'})
# We add some waveguides and a cavity to "signal1".
circ.plug(wg10, {'signal1': 'left'})
circ.plug(l3cav, {'signal1': 'input'})
circ.plug(wg10, {'signal1': 'left'})
circ.plug('wg10', {'signal1': 'left'})
circ.plug('l3cav', {'signal1': 'input'})
circ.plug('wg10', {'signal1': 'left'})
# "signal2" just gets a single waveguide of equivalent length
circ.plug(wg28, {'signal2': 'left'})
circ.plug('wg28', {'signal2': 'left'})
# Now we bend both waveguides back towards each other
circ.plug(bend0, {'signal1': 'right'})
circ.plug(bend0, {'signal2': 'left'})
circ.plug(wg05, {'signal1': 'left'})
circ.plug(wg05, {'signal2': 'left'})
circ.plug('bend0', {'signal1': 'right'})
circ.plug('bend0', {'signal2': 'left'})
circ.plug('wg05', {'signal1': 'left'})
circ.plug('wg05', {'signal2': 'left'})
# To join the waveguides, we attach a second y-junction.
# We plug "signal1" into the "bot" port, and "signal2" into the "top" port.
@ -288,23 +311,34 @@ def main(interactive: bool = True):
# This operation would raise an exception if the ports did not line up
# correctly (i.e. they required different rotations or translations of the
# y-junction device).
circ.plug(ysplit, {'signal1': 'bot', 'signal2': 'top'}, {'in': 'signal_out'})
circ.plug('ysplit', {'signal1': 'bot', 'signal2': 'top'}, {'in': 'signal_out'})
# Finally, add some more waveguide to "signal_out".
circ.plug(wg10, {'signal_out': 'left'})
# We can visualize the design. Usually it's easier to just view the GDS.
if interactive:
print('Visualizing... this step may be slow')
circ.pattern.visualize()
circ.plug('wg10', {'signal_out': 'left'})
# We can also add text labels for our circuit's ports.
# They will appear at the uppermost hierarchy level, while the individual
# device ports will appear further down, in their respective cells.
dev2pat(circ)
ports_to_data(circ.pattern)
# Write out to GDS
writefile(circ.pattern, 'circuit.gds', **GDS_OPTS)
# Check if we forgot to include any patterns... ooops!
if dangling := lib.dangling_refs():
print('Warning: The following patterns are referenced, but not present in the'
f' library! {dangling}')
print('We\'ll solve this by merging in shape_lib, which contains those shapes...')
lib.add(shape_lib)
assert not lib.dangling_refs()
# We can visualize the design. Usually it's easier to just view the GDS.
if interactive:
print('Visualizing... this step may be slow')
circ.pattern.visualize(lib)
# Write out to GDS, only keeping patterns referenced by our circuit (including itself)
subtree = lib.subtree('my_circuit') # don't include wg90, which we don't use
check_valid_names(subtree.keys())
writefile(subtree, 'circuit.gds', **GDS_OPTS)
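# (For reference: `subtree` is expected to contain 'my_circuit' plus everything it references,
#  directly or indirectly: the wg05/wg10/wg28/bend0/ysplit/l3cav patterns and the 'hole'/'smile'
#  shapes merged in above, but not the unused 'wg90'.)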
if __name__ == '__main__':


@ -1,81 +1,83 @@
from typing import Tuple, Sequence, Callable
from typing import Any
from collections.abc import Sequence, Callable
from pprint import pformat
import numpy
from numpy import pi
from masque.builder import Device
from masque.library import Library, LibDeviceLibrary
from masque import Pattern, Builder, LazyLibrary
from masque.file.gdsii import writefile, load_libraryfile
import pcgen
import basic_shapes
import devices
from devices import pat2dev, dev2pat
from devices import ports_to_data, data_to_ports
from basic_shapes import GDS_OPTS
def main() -> None:
# Define a `Library`-backed `DeviceLibrary`, which provides lazy evaluation
# for device generation code and lazy-loading of GDS contents.
device_lib = LibDeviceLibrary()
# Define a `LazyLibrary`, which provides lazy evaluation for generating
# patterns and lazy-loading of GDS contents.
lib = LazyLibrary()
#
# Load some devices from a GDS file
#
# Scan circuit.gds and prepare to lazy-load its contents
pattern_lib, _properties = load_libraryfile('circuit.gds', tag='mycirc01')
gds_lib, _properties = load_libraryfile('circuit.gds', postprocess=data_to_ports)
# Add it into the device library by providing a way to read port info
# This maintains the lazy evaluation from above, so no patterns
# are actually read yet.
device_lib.add_library(pattern_lib, pat2dev=pat2dev)
print('Devices loaded from GDS into library:\n' + pformat(list(device_lib.keys())))
lib.add(gds_lib)
print('Patterns loaded from GDS into library:\n' + pformat(list(lib.keys())))
#
# Add some new devices to the library, this time from python code rather than GDS
#
a = devices.LATTICE_CONSTANT
tri = basic_shapes.triangle(devices.RADIUS)
# Convenience function for adding devices
# This is roughly equivalent to
# `device_lib[name] = lambda: dev2pat(fn())`
# but it also guarantees that the resulting pattern is named `name`.
def add(name: str, fn: Callable[[], Device]) -> None:
device_lib.add_device(name=name, fn=fn, dev2pat=dev2pat)
lib['triangle'] = lambda: basic_shapes.triangle(devices.RADIUS)
opts: dict[str, Any] = dict(
lattice_constant = devices.LATTICE_CONSTANT,
hole = 'triangle',
)
# Triangle-based variants. These are defined here, but they won't run until they're
# retrieved from the library.
add('tri_wg10', lambda: devices.waveguide(lattice_constant=a, hole=tri, length=10, mirror_periods=5))
add('tri_wg05', lambda: devices.waveguide(lattice_constant=a, hole=tri, length=5, mirror_periods=5))
add('tri_wg28', lambda: devices.waveguide(lattice_constant=a, hole=tri, length=28, mirror_periods=5))
add('tri_bend0', lambda: devices.bend(lattice_constant=a, hole=tri, mirror_periods=5))
add('tri_ysplit', lambda: devices.y_splitter(lattice_constant=a, hole=tri, mirror_periods=5))
add('tri_l3cav', lambda: devices.perturbed_l3(lattice_constant=a, hole=tri, xy_size=(4, 10)))
lib['tri_wg10'] = lambda: devices.waveguide(length=10, mirror_periods=5, **opts)
lib['tri_wg05'] = lambda: devices.waveguide(length=5, mirror_periods=5, **opts)
lib['tri_wg28'] = lambda: devices.waveguide(length=28, mirror_periods=5, **opts)
lib['tri_bend0'] = lambda: devices.bend(mirror_periods=5, **opts)
lib['tri_ysplit'] = lambda: devices.y_splitter(mirror_periods=5, **opts)
lib['tri_l3cav'] = lambda: devices.perturbed_l3(xy_size=(4, 10), **opts, hole_lib=lib)
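# (Hedged note, illustrating the lazy behavior described above: nothing has been generated
#  or read from disk yet. A lookup such as `lib['tri_wg10']` runs its lambda, and `lib['wg10']`
#  triggers the GDS load, only when that name is first accessed.)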
#
# Build a mixed waveguide with an L3 cavity in the middle
#
# Immediately start building from an instance of the L3 cavity
circ2 = device_lib['tri_l3cav'].build('mixed_wg_cav')
circ2 = Builder(library=lib, ports='tri_l3cav')
print(device_lib['wg10'].ports)
circ2.plug(device_lib['wg10'], {'input': 'right'})
circ2.plug(device_lib['wg10'], {'output': 'left'})
circ2.plug(device_lib['tri_wg10'], {'input': 'right'})
circ2.plug(device_lib['tri_wg10'], {'output': 'left'})
# First way to get abstracts is `lib.abstract(name)`
# We can use this syntax directly with `Pattern.plug()` and `Pattern.place()` as well as through `Builder`.
circ2.plug(lib.abstract('wg10'), {'input': 'right'})
# Second way to get abstracts is to use an AbstractView
# This also works directly with `Pattern.plug()` / `Pattern.place()`.
abstracts = lib.abstract_view()
circ2.plug(abstracts['wg10'], {'output': 'left'})
# Third way to specify an abstract works by automatically getting
# it from the library already within the Builder object.
# This wouldn't work if we only had a `Pattern` (not a `Builder`).
# Just pass the pattern name!
circ2.plug('tri_wg10', {'input': 'right'})
circ2.plug('tri_wg10', {'output': 'left'})
# Add the circuit to the device library.
# It has already been generated, so we can use `set_const` as a shorthand for
# `device_lib['mixed_wg_cav'] = lambda: circ2`
device_lib.set_const(circ2)
lib['mixed_wg_cav'] = circ2.pattern
#
@ -83,29 +85,26 @@ def main() -> None:
#
# We'll be designing against an existing device's interface...
circ3 = circ2.as_interface('loop_segment')
# ... that lets us continue from where we left off.
circ3.plug(device_lib['tri_bend0'], {'input': 'right'})
circ3.plug(device_lib['tri_bend0'], {'input': 'left'}, mirrored=(True, False)) # mirror since no tri y-symmetry
circ3.plug(device_lib['tri_bend0'], {'input': 'right'})
circ3.plug(device_lib['bend0'], {'output': 'left'})
circ3.plug(device_lib['bend0'], {'output': 'left'})
circ3.plug(device_lib['bend0'], {'output': 'left'})
circ3.plug(device_lib['tri_wg10'], {'input': 'right'})
circ3.plug(device_lib['tri_wg28'], {'input': 'right'})
circ3.plug(device_lib['tri_wg10'], {'input': 'right', 'output': 'left'})
circ3 = Builder.interface(source=circ2)
device_lib.set_const(circ3)
# ... that lets us continue from where we left off.
circ3.plug('tri_bend0', {'input': 'right'})
circ3.plug('tri_bend0', {'input': 'left'}, mirrored=True) # mirror since no tri y-symmetry
circ3.plug('tri_bend0', {'input': 'right'})
circ3.plug('bend0', {'output': 'left'})
circ3.plug('bend0', {'output': 'left'})
circ3.plug('bend0', {'output': 'left'})
circ3.plug('tri_wg10', {'input': 'right'})
circ3.plug('tri_wg28', {'input': 'right'})
circ3.plug('tri_wg10', {'input': 'right', 'output': 'left'})
lib['loop_segment'] = circ3.pattern
#
# Write all devices into a GDS file
#
# This line could be slow, since it generates or loads many of the devices
# that were not already accessed above.
all_device_pats = [dev.pattern for dev in device_lib.values()]
writefile(all_device_pats, 'library.gds', **GDS_OPTS)
print('Writing library to file...')
writefile(lib, 'library.gds', **GDS_OPTS)
if __name__ == '__main__':
@ -116,22 +115,21 @@ if __name__ == '__main__':
#class prout:
# def place(
# self,
# other: Device,
# other: Pattern,
# label_layer: layer_t = 'WATLAYER',
# *,
# port_map: Optional[Dict[str, Optional[str]]] = None,
# port_map: Dict[str, str | None] | None = None,
# **kwargs,
# ) -> 'prout':
#
# Device.place(self, other, port_map=port_map, **kwargs)
# name: Optional[str]
# Pattern.place(self, other, port_map=port_map, **kwargs)
# name: str | None
# for name in other.ports:
# if port_map:
# assert(name is not None)
# name = port_map.get(name, name)
# if name is None:
# continue
# self.pattern.labels += [
# Label(string=name, offset=self.ports[name].offset, layer=layer)]
# self.pattern.label(string=name, offset=self.ports[name].offset, layer=label_layer)
# return self
#

277
examples/tutorial/pather.py Normal file

@ -0,0 +1,277 @@
"""
Manual wire routing tutorial: Pather and BasicTool
"""
from collections.abc import Callable
from numpy import pi
from masque import Pather, RenderPather, Library, Pattern, Port, layer_t, map_layers
from masque.builder.tools import BasicTool, PathTool
from masque.file.gdsii import writefile
from basic_shapes import GDS_OPTS
#
# Define some basic wire widths, in nanometers
# M2 is the top metal; M1 is below it and connected with vias on V1
#
M1_WIDTH = 1000
V1_WIDTH = 500
M2_WIDTH = 4000
#
# First, we can define some functions for generating our wire geometry
#
def make_pad() -> Pattern:
"""
Create a pattern with a single rectangle of M2, with a single port on the bottom
Every pad will be an instance of the same pattern, so we will only call this function once.
"""
pat = Pattern()
pat.rect(layer='M2', xctr=0, yctr=0, lx=3 * M2_WIDTH, ly=4 * M2_WIDTH)
pat.ports['wire_port'] = Port((0, -2 * M2_WIDTH), rotation=pi / 2, ptype='m2wire')
return pat
def make_via(
layer_top: layer_t,
layer_via: layer_t,
layer_bot: layer_t,
width_top: float,
width_via: float,
width_bot: float,
ptype_top: str,
ptype_bot: str,
) -> Pattern:
"""
Generate three concentric squares, on the provided layers
(`layer_top`, `layer_via`, `layer_bot`) and with the provided widths
(`width_top`, `width_via`, `width_bot`).
Two ports are added, with the provided ptypes (`ptype_top`, `ptype_bot`).
They are placed at the left edge of the top layer and right edge of the
bottom layer, respectively.
We only have one via type, so we will only call this function once.
"""
pat = Pattern()
pat.rect(layer=layer_via, xctr=0, yctr=0, lx=width_via, ly=width_via)
pat.rect(layer=layer_bot, xctr=0, yctr=0, lx=width_bot, ly=width_bot)
pat.rect(layer=layer_top, xctr=0, yctr=0, lx=width_top, ly=width_top)
pat.ports = {
'top': Port(offset=(-width_top / 2, 0), rotation=0, ptype=ptype_top),
'bottom': Port(offset=(width_bot / 2, 0), rotation=pi, ptype=ptype_bot),
}
return pat
def make_bend(layer: layer_t, width: float, ptype: str) -> Pattern:
"""
Generate a triangular wire, with ports at the left (input) and bottom (output) edges.
This is effectively a clockwise wire bend.
Every bend will be the same, so we only need to call this twice (once each for M1 and M2).
We could call it additional times for different wire widths or bend types (e.g. squares).
"""
pat = Pattern()
pat.polygon(layer=layer, vertices=[(0, -width / 2), (0, width / 2), (width, -width / 2)])
pat.ports = {
'input': Port(offset=(0, 0), rotation=0, ptype=ptype),
'output': Port(offset=(width / 2, -width / 2), rotation=pi / 2, ptype=ptype),
}
return pat
def make_straight_wire(layer: layer_t, width: float, ptype: str, length: float) -> Pattern:
"""
Generate a straight wire with ports at either end (x=0 and x=length).
Every straight wire will be single-use, so we'll need to create lots of (mostly unique)
`Pattern`s, and this function will get called very often.
"""
pat = Pattern()
pat.rect(layer=layer, xmin=0, xmax=length, yctr=0, ly=width)
pat.ports = {
'input': Port(offset=(0, 0), rotation=0, ptype=ptype),
'output': Port(offset=(length, 0), rotation=pi, ptype=ptype),
}
return pat
def map_layer(layer: layer_t) -> layer_t:
"""
Map from string layer names to GDS layer numbers
"""
layer_mapping = {
'M1': (10, 0),
'M2': (20, 0),
'V1': (30, 0),
}
return layer_mapping.get(layer, layer)
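# (A quick hedged sanity check of the mapping above, not part of the tutorial flow:
#  known layer names become GDS (layer, datatype) tuples, and anything else passes through,
#  e.g. map_layer('M1') == (10, 0) and map_layer((5, 0)) == (5, 0).)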
#
# Now we can start building up our library (collection of static cells) and pathing tools.
#
# If any of the operations below are confusing, you can cross-reference against the `RenderPather`
# tutorial, which handles some things more explicitly (e.g. via placement) and simplifies others
# (e.g. geometry definition).
#
def main() -> None:
# Build some patterns (static cells) using the above functions and store them in a library
library = Library()
library['pad'] = make_pad()
library['m1_bend'] = make_bend(layer='M1', ptype='m1wire', width=M1_WIDTH)
library['m2_bend'] = make_bend(layer='M2', ptype='m2wire', width=M2_WIDTH)
library['v1_via'] = make_via(
layer_top='M2',
layer_via='V1',
layer_bot='M1',
width_top=M2_WIDTH,
width_via=V1_WIDTH,
width_bot=M1_WIDTH,
ptype_bot='m1wire',
ptype_top='m2wire',
)
#
# Now, define two tools.
# M1_tool will route on M1, using wires with M1_WIDTH
# M2_tool will route on M2, using wires with M2_WIDTH
# Both tools are able to automatically transition from the other wire type (with a via)
#
# Note that while we use BasicTool for this tutorial, you can define your own `Tool`
# with arbitrary logic inside -- e.g. with single-use bends, complex transition rules,
# transmission line geometry, or other features.
#
M1_tool = BasicTool(
straight = (
# First, we need a function which takes in a length and spits out an M1 wire
lambda length: make_straight_wire(layer='M1', ptype='m1wire', width=M1_WIDTH, length=length),
'input', # When we get a pattern from make_straight_wire, use the port named 'input' as the input
'output', # and use the port named 'output' as the output
),
bend = (
library.abstract('m1_bend'), # When we need a bend, we'll reference the pattern we generated earlier
'input', # To orient it clockwise, use the port named 'input' as the input
'output', # and 'output' as the output
),
transitions = { # We can automate transitions for different (normally incompatible) port types
'm2wire': ( # For example, when we're attaching to a port with type 'm2wire'
library.abstract('v1_via'), # we can place a V1 via
'top', # using the port named 'top' as the input (i.e. the M2 side of the via)
'bottom', # and using the port named 'bottom' as the output
),
},
default_out_ptype = 'm1wire', # Unless otherwise requested, we'll default to trying to stay on M1
)
M2_tool = BasicTool(
straight = (
# Again, we use make_straight_wire, but this time we set parameters for M2
lambda length: make_straight_wire(layer='M2', ptype='m2wire', width=M2_WIDTH, length=length),
'input',
'output',
),
bend = (
library.abstract('m2_bend'), # and we use an M2 bend
'input',
'output',
),
transitions = {
'm1wire': (
library.abstract('v1_via'), # We still use the same via,
'bottom', # but the input port is now 'bottom'
'top', # and the output port is now 'top'
),
},
default_out_ptype = 'm2wire', # We default to trying to stay on M2
)
#
# Create a new pather which writes to `library` and uses `M2_tool` as its default tool.
# Then, place some pads and start routing wires!
#
pather = Pather(library, tools=M2_tool)
# Place two pads, and define their ports as 'VCC' and 'GND'
pather.place('pad', offset=(18_000, 30_000), port_map={'wire_port': 'VCC'})
pather.place('pad', offset=(18_000, 60_000), port_map={'wire_port': 'GND'})
# Add some labels to make the pads easier to distinguish
pather.pattern.label(layer='M2', string='VCC', offset=(18e3, 30e3))
pather.pattern.label(layer='M2', string='GND', offset=(18e3, 60e3))
# Path VCC forward (in this case south) and turn clockwise 90 degrees (ccw=False)
# The total distance forward (including the bend's forward component) must be 6um
pather.path('VCC', ccw=False, length=6_000)
# Now path VCC to x=0. This time, don't include any bend (ccw=None).
# Note that if we tried y=0 here, we would get an error since the VCC port is facing in the x-direction.
pather.path_to('VCC', ccw=None, x=0)
# Path GND forward by 5um, turning clockwise 90 degrees.
# This time we use shorthand (bool(0) == False) and omit the parameter labels
# Note that although ccw=0 is equivalent to ccw=False, ccw=None is not!
pather.path('GND', 0, 5_000)
# This time, path GND until it matches the current x-coordinate of VCC. Don't place a bend.
pather.path_to('GND', None, x=pather['VCC'].offset[0])
# Now, start using M1_tool for GND.
# Since we have defined an M2-to-M1 transition in M1_tool (a BasicTool), we don't need to place a via ourselves.
# If we wanted to place our via manually, we could add `pather.plug('v1_via', {'GND': 'top'})` here
# and achieve the same result without having to define any transitions in M1_tool.
# Note that even though we have changed the tool used for GND, the via doesn't get placed until
# the next time we draw a path on GND (the pather.mpath() statement below).
pather.retool(M1_tool, keys=['GND'])
# Bundle together GND and VCC, and path the bundle forward and counterclockwise.
# Pick the distance so that the leading/outermost wire (in this case GND) ends up at x=-10_000.
# Other wires in the bundle (in this case VCC) should be spaced at 5_000 pitch (so VCC ends up at x=-5_000)
#
# Since we recently retooled GND, its path starts with a via down to M1 (included in the distance
# calculation), and its straight segment and bend will be drawn using M1 while VCC's are drawn with M2.
pather.mpath(['GND', 'VCC'], ccw=True, xmax=-10_000, spacing=5_000)
# Now use M1_tool as the default tool for all ports/signals.
# Since VCC does not have an explicitly assigned tool, it will now transition down to M1.
pather.retool(M1_tool)
# Path the GND + VCC bundle forward and counterclockwise by 90 degrees.
# The total extension (travel distance along the forward direction) for the longest segment (in
# this case the segment being added to GND) should be exactly 50um.
# After turning, the wire pitch should be reduced to only 1.2um.
pather.mpath(['GND', 'VCC'], ccw=True, emax=50_000, spacing=1_200)
# Make a U-turn with the bundle and expand back out to 4.5um wire pitch.
# Here, emin specifies the travel distance for the shortest segment. For the first mpath() call
# below, the shortest segment belongs to VCC; for the second call, it belongs to GND. The relative
# lengths of the segments depend on their starting positions and their ordering within the bundle.
pather.mpath(['GND', 'VCC'], ccw=False, emin=1_000, spacing=1_200)
pather.mpath(['GND', 'VCC'], ccw=False, emin=2_000, spacing=4_500)
# Now, set the default tool back to M2_tool. Note that GND remains on M1 since it has been
# explicitly assigned a tool. We could `del pather.tools['GND']` to force it to use the default.
pather.retool(M2_tool)
# Now path both ports to x=-28_000.
# When ccw is not None, xmin constrains the trailing/innermost port to stop at the target x coordinate.
# However, with ccw=None, all ports stop at the same coordinate, so specifying xmin= or xmax= is
# equivalent.
pather.mpath(['GND', 'VCC'], None, xmin=-28_000)
# Further extend VCC out to x=-50_000, and specify that we would like to get an output on M1.
# This results in a via at the end of the wire (instead of having one at the start, like we got
# when using pather.retool()).
pather.path_to('VCC', None, -50_000, out_ptype='m1wire')
# Save the pather's pattern into our library
library['Pather_and_BasicTool'] = pather.pattern
# Convert from text-based layers to numeric layers for GDS, and output the file
library.map_layers(map_layer)
writefile(library, 'pather.gds', **GDS_OPTS)
if __name__ == '__main__':
main()


@ -2,7 +2,7 @@
Routines for creating normalized 2D lattices and common photonic crystal
cavity designs.
"""
from typing import Sequence, Tuple
from collections.abc import Sequence
import numpy
from numpy.typing import ArrayLike, NDArray
@ -29,8 +29,11 @@ def triangular_lattice(
Returns:
`[[x0, y0], [x1, y1], ...]` denoting lattice sites.
"""
sx, sy = numpy.meshgrid(numpy.arange(dims[0], dtype=float),
numpy.arange(dims[1], dtype=float), indexing='ij')
sx, sy = numpy.meshgrid(
numpy.arange(dims[0], dtype=float),
numpy.arange(dims[1], dtype=float),
indexing='ij',
)
sx[sy % 2 == 1] += 0.5
sy *= numpy.sqrt(3) / 2
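As a quick aside (my own sketch, not part of pcgen.py): the row-shift logic above is compact, so here is what it produces for a 3x2 grid. The final stacking into an `[[x, y], ...]` array is assumed, since the return statement falls outside this hunk.
import numpy
sx, sy = numpy.meshgrid(
    numpy.arange(3, dtype=float),
    numpy.arange(2, dtype=float),
    indexing='ij',
)
sx[sy % 2 == 1] += 0.5          # shift every other row by half a lattice constant
sy *= numpy.sqrt(3) / 2         # compress row spacing for 60-degree (triangular) packing
print(numpy.stack((sx.ravel(), sy.ravel()), axis=1))
# -> (0, 0), (0.5, 0.866), (1, 0), (1.5, 0.866), (2, 0), (2.5, 0.866)   (values rounded)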
@ -230,8 +233,8 @@ def ln_shift_defect(
# Shift holes
# Expand shifts as necessary
tmp_a = numpy.array(shifts_a)
tmp_r = numpy.array(shifts_r)
tmp_a = numpy.asarray(shifts_a)
tmp_r = numpy.asarray(shifts_r)
n_shifted = max(tmp_a.size, tmp_r.size)
shifts_a = numpy.ones(n_shifted)


@ -0,0 +1,96 @@
"""
Manual wire routing tutorial: RenderPather and PathTool
"""
from collections.abc import Callable
from masque import RenderPather, Library, Pattern, Port, layer_t, map_layers
from masque.builder.tools import PathTool
from masque.file.gdsii import writefile
from basic_shapes import GDS_OPTS
from pather import M1_WIDTH, V1_WIDTH, M2_WIDTH, map_layer, make_pad, make_via
def main() -> None:
#
# To illustrate the advantages of using `RenderPather`, we use `PathTool` instead
# of `BasicTool`. `PathTool` lacks some sophistication (e.g. no automatic transitions)
# but when used with `RenderPather`, it can consolidate multiple routing steps into
# a single `Path` shape.
#
# We'll try to nearly replicate the layout from the `Pather` tutorial; see `pather.py`
# for more detailed descriptions of the individual pathing steps.
#
# First, we make a library and generate some of the same patterns as in the pather tutorial
library = Library()
library['pad'] = make_pad()
library['v1_via'] = make_via(
layer_top='M2',
layer_via='V1',
layer_bot='M1',
width_top=M2_WIDTH,
width_via=V1_WIDTH,
width_bot=M1_WIDTH,
ptype_bot='m1wire',
ptype_top='m2wire',
)
# `PathTool` is more limited than `BasicTool`. It only generates one type of shape
# (`Path`), so it only needs to know what layer to draw on, what width to draw with,
# and what port type to present.
M1_ptool = PathTool(layer='M1', width=M1_WIDTH, ptype='m1wire')
M2_ptool = PathTool(layer='M2', width=M2_WIDTH, ptype='m2wire')
rpather = RenderPather(tools=M2_ptool, library=library)
# As in the pather tutorial, we make some pads and labels...
rpather.place('pad', offset=(18_000, 30_000), port_map={'wire_port': 'VCC'})
rpather.place('pad', offset=(18_000, 60_000), port_map={'wire_port': 'GND'})
rpather.pattern.label(layer='M2', string='VCC', offset=(18e3, 30e3))
rpather.pattern.label(layer='M2', string='GND', offset=(18e3, 60e3))
# ...and start routing the signals.
rpather.path('VCC', ccw=False, length=6_000)
rpather.path_to('VCC', ccw=None, x=0)
rpather.path('GND', 0, 5_000)
rpather.path_to('GND', None, x=rpather['VCC'].offset[0])
# `PathTool` doesn't know how to transition between metal layers, so we have to
# `plug` the via into the GND wire ourselves.
rpather.plug('v1_via', {'GND': 'top'})
rpather.retool(M1_ptool, keys=['GND'])
rpather.mpath(['GND', 'VCC'], ccw=True, xmax=-10_000, spacing=5_000)
# Same thing on the VCC wire when it goes down to M1.
rpather.plug('v1_via', {'VCC': 'top'})
rpather.retool(M1_ptool)
rpather.mpath(['GND', 'VCC'], ccw=True, emax=50_000, spacing=1_200)
rpather.mpath(['GND', 'VCC'], ccw=False, emin=1_000, spacing=1_200)
rpather.mpath(['GND', 'VCC'], ccw=False, emin=2_000, spacing=4_500)
# And again when VCC goes back up to M2.
rpather.plug('v1_via', {'VCC': 'bottom'})
rpather.retool(M2_ptool)
rpather.mpath(['GND', 'VCC'], None, xmin=-28_000)
# Finally, since PathTool has no conception of transitions, we can't
# just ask it to transition to an 'm1wire' port at the end of the final VCC segment.
# Instead, we have to calculate the via size ourselves, and adjust the final position
# to account for it.
via_size = abs(
library['v1_via'].ports['top'].offset[0]
- library['v1_via'].ports['bottom'].offset[0]
)
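# (With the dimensions used here this works out to (M2_WIDTH + M1_WIDTH) / 2 = 2500 nm,
#  since make_via() places 'top' at x = -width_top / 2 and 'bottom' at x = +width_bot / 2.)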
rpather.path_to('VCC', None, -50_000 + via_size)
rpather.plug('v1_via', {'VCC': 'top'})
rpather.render()
library['RenderPather_and_PathTool'] = rpather.pattern
# Convert from text-based layers to numeric layers for GDS, and output the file
library.map_layers(map_layer)
writefile(library, 'render_pather.gds', **GDS_OPTS)
if __name__ == '__main__':
main()


@ -1,16 +1,16 @@
"""
masque 2D CAD library
masque is an attempt to make a relatively small library for designing lithography
masque is an attempt to make a relatively compact library for designing lithography
masks. The general idea is to implement something resembling the GDSII and OASIS file-formats,
but with some additional vectorized element types (eg. ellipses, not just polygons), better
support for E-beam doses, and the ability to interface with multiple file formats.
but with some additional vectorized element types (eg. ellipses, not just polygons), and the
ability to interface with multiple file formats.
`Pattern` is a basic object containing a 2D lithography mask, composed of a list of `Shape`
objects, a list of `Label` objects, and a list of references to other `Patterns` (using
`SubPattern`).
`Ref`).
`SubPattern` provides basic support for nesting `Pattern` objects within each other, by adding
`Ref` provides basic support for nesting `Pattern` objects within each other, by adding
offset, rotation, scaling, repetition, and other such properties to a Pattern reference.
Note that the methods for these classes try to avoid copying wherever possible, so unless
@ -20,24 +20,73 @@
NOTES ON INTERNALS
==========================
- Many of `masque`'s classes make use of `__slots__` to make them faster / smaller.
Since `__slots__` doesn't play well with multiple inheritance, the `masque.utils.AutoSlots`
metaclass is used to auto-generate slots based on superclass type annotations.
- File I/O submodules are imported by `masque.file` to avoid creating hard dependencies on
external file-format reader/writers
- Pattern locking/unlocking is quite slow for large hierarchies.
Since `__slots__` doesn't play well with multiple inheritance, often they are left
empty for superclasses and it is the subclass's responsibility to set them correctly.
- File I/O submodules are not imported by `masque.file` to avoid creating hard dependencies
on external file-format reader/writers
- Try to accept the broadest-possible inputs: e.g., don't demand an `ILibraryView` if you
can accept a `Mapping[str, Pattern]` and wrap it in a `LibraryView` internally.
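(Hedged illustration of the __slots__ convention above; the attribute names are made up:
    class PositionableImpl:
        __slots__ = ()                      # superclass declares no storage of its own
    class Circle(PositionableImpl):
        __slots__ = ('_offset', '_radius')  # the concrete subclass lists its own attributes
 Keeping superclass slots empty avoids instance-layout conflicts under multiple inheritance.)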
"""
from .error import PatternError, PatternLockedError
from .shapes import Shape
from .label import Label
from .subpattern import SubPattern
from .pattern import Pattern
from .utils import layer_t, annotations_t
from .library import Library, DeviceLibrary
from .utils import (
layer_t as layer_t,
annotations_t as annotations_t,
SupportsBool as SupportsBool,
)
from .error import (
MasqueError as MasqueError,
PatternError as PatternError,
LibraryError as LibraryError,
BuildError as BuildError,
)
from .shapes import (
Shape as Shape,
Polygon as Polygon,
Path as Path,
Circle as Circle,
Arc as Arc,
Ellipse as Ellipse,
)
from .label import Label as Label
from .ref import Ref as Ref
from .pattern import (
Pattern as Pattern,
map_layers as map_layers,
map_targets as map_targets,
chain_elements as chain_elements,
)
from .library import (
ILibraryView as ILibraryView,
ILibrary as ILibrary,
LibraryView as LibraryView,
Library as Library,
LazyLibrary as LazyLibrary,
AbstractView as AbstractView,
TreeView as TreeView,
Tree as Tree,
)
from .ports import (
Port as Port,
PortList as PortList,
)
from .abstract import Abstract as Abstract
from .builder import (
Builder as Builder,
Tool as Tool,
Pather as Pather,
RenderPather as RenderPather,
RenderStep as RenderStep,
BasicTool as BasicTool,
PathTool as PathTool,
)
from .utils import (
ports2data as ports2data,
oneshot as oneshot,
)
__author__ = 'Jan Petykiewicz'
__version__ = '2.7'
__version__ = '3.2'
version = __version__ # legacy

217
masque/abstract.py Normal file

@ -0,0 +1,217 @@
from typing import Self
import copy
import logging
import numpy
from numpy.typing import ArrayLike
from .ref import Ref
from .ports import PortList, Port
from .utils import rotation_matrix_2d
#if TYPE_CHECKING:
# from .builder import Builder, Tool
# from .library import ILibrary
logger = logging.getLogger(__name__)
class Abstract(PortList):
"""
An `Abstract` is a container for a name and associated ports.
When snapping a sub-component into an existing pattern, only its name and port info are needed,
not the geometry itself (which stays in the `Pattern` stored in the library).
"""
__slots__ = ('name', '_ports')
name: str
""" Name of the pattern this device references """
_ports: dict[str, Port]
""" Uniquely-named ports which can be used to snap instances together """
@property
def ports(self) -> dict[str, Port]:
return self._ports
@ports.setter
def ports(self, value: dict[str, Port]) -> None:
self._ports = value
def __init__(
self,
name: str,
ports: dict[str, Port],
) -> None:
self.name = name
self.ports = copy.deepcopy(ports)
# TODO do we want to store a Ref instead of just a name? then we can translate/rotate/mirror...
def __repr__(self) -> str:
s = f'<Abstract {self.name} ['
for name, port in self.ports.items():
s += f'\n\t{name}: {port}'
s += ']>'
return s
def translate_ports(self, offset: ArrayLike) -> Self:
"""
Translates all ports by the given offset.
Args:
offset: (x, y) to translate by
Returns:
self
"""
for port in self.ports.values():
port.translate(offset)
return self
def scale_by(self, c: float) -> Self:
"""
Scale this Abstract by the given value
(all port offsets are scaled)
Args:
c: factor to scale by
Returns:
self
"""
for port in self.ports.values():
port.offset *= c
return self
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self:
"""
Rotate the Abstract around a given location.
Args:
pivot: (x, y) location to rotate around
rotation: Angle to rotate by (counter-clockwise, radians)
Returns:
self
"""
pivot = numpy.asarray(pivot, dtype=float)
self.translate_ports(-pivot)
self.rotate_ports(rotation)
self.rotate_port_offsets(rotation)
self.translate_ports(+pivot)
return self
def rotate_port_offsets(self, rotation: float) -> Self:
"""
Rotate the offsets of all ports around (0, 0)
Args:
rotation: Angle to rotate by (counter-clockwise, radians)
Returns:
self
"""
for port in self.ports.values():
port.offset = rotation_matrix_2d(rotation) @ port.offset
return self
def rotate_ports(self, rotation: float) -> Self:
"""
Rotate each port around its offset (i.e. in place)
Args:
rotation: Angle to rotate by (counter-clockwise, radians)
Returns:
self
"""
for port in self.ports.values():
port.rotate(rotation)
return self
def mirror_port_offsets(self, across_axis: int = 0) -> Self:
"""
Mirror the offsets of all ports across an axis
Args:
across_axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
Returns:
self
"""
for port in self.ports.values():
port.offset[across_axis - 1] *= -1
return self
def mirror_ports(self, across_axis: int = 0) -> Self:
"""
Mirror each port's rotation across an axis, relative to its
offset
Args:
across_axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
Returns:
self
"""
for port in self.ports.values():
port.mirror(across_axis)
return self
def mirror(self, across_axis: int = 0) -> Self:
"""
Mirror the Abstract across an axis
Args:
across_axis: Axis to mirror across
(0: mirror across x axis, 1: mirror across y axis)
Returns:
self
"""
self.mirror_ports(across_axis)
self.mirror_port_offsets(across_axis)
return self
def apply_ref_transform(self, ref: Ref) -> Self:
"""
Apply the transform from a `Ref` to the ports of this `Abstract`.
This changes the port locations to where they would be in the Ref's parent pattern.
Args:
ref: The ref whose transform should be applied.
Returns:
self
"""
if ref.mirrored:
self.mirror()
self.rotate_ports(ref.rotation)
self.rotate_port_offsets(ref.rotation)
self.translate_ports(ref.offset)
return self
def undo_ref_transform(self, ref: Ref) -> Self:
"""
Apply the inverse transform from a `Ref` to the ports of this `Abstract`.
This changes the port locations to where they would be in the Ref's target (from the parent).
Args:
ref: The ref whose (inverse) transform should be applied.
Returns:
self
# TODO test undo_ref_transform
"""
self.translate_ports(-ref.offset)
self.rotate_port_offsets(-ref.rotation)
self.rotate_ports(-ref.rotation)
if ref.mirrored:
self.mirror(0)
return self
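A minimal round-trip sketch of the two methods above (my own addition, not from the file; it assumes `Ref` defaults to no rotation and no mirroring when only an offset is given):
a = Abstract('cavity', ports={'in': Port((0, 0), rotation=0, ptype='pcwg')})
r = Ref(offset=(10, 0))      # translation only (assumed defaults: rotation 0, not mirrored)
a.apply_ref_transform(r)     # 'in' moves to (10, 0), its location in the Ref's parent pattern
a.undo_ref_transform(r)      # and back to (0, 0), its location in the Ref's target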


@ -1,2 +1,10 @@
from .devices import Port, Device
from .utils import ell
from .builder import Builder as Builder
from .pather import Pather as Pather
from .renderpather import RenderPather as RenderPather
from .utils import ell as ell
from .tools import (
Tool as Tool,
RenderStep as RenderStep,
BasicTool as BasicTool,
PathTool as PathTool,
)

436
masque/builder/builder.py Normal file

@ -0,0 +1,436 @@
"""
Simplified Pattern assembly (`Builder`)
"""
from typing import Self
from collections.abc import Sequence, Mapping
import copy
import logging
from functools import wraps
from numpy.typing import ArrayLike
from ..pattern import Pattern
from ..library import ILibrary, TreeView
from ..error import BuildError
from ..ports import PortList, Port
from ..abstract import Abstract
logger = logging.getLogger(__name__)
class Builder(PortList):
"""
A `Builder` is a helper object used for snapping together multiple
lower-level patterns at their `Port`s.
The `Builder` mostly just holds context, in the form of a `Library`,
in addition to its underlying pattern. This simplifies some calls
to `plug` and `place`, by making the library implicit.
`Builder` can also be `set_dead()`, at which point further calls to `plug()`
and `place()` are ignored (intended for debugging).
Examples: Creating a Builder
===========================
- `Builder(library, ports={'A': port_a, 'C': port_c}, name='mypat')` makes
an empty pattern, adds the given ports, and places it into `library`
under the name `'mypat'`.
- `Builder(library)` makes an empty pattern with no ports. The pattern
is not added into `library` and must later be added with e.g.
`library['mypat'] = builder.pattern`
- `Builder(library, pattern=pattern, name='mypat')` uses an existing
pattern (including its ports) and sets `library['mypat'] = pattern`.
- `Builder.interface(other_pat, port_map=['A', 'B'], library=library)`
makes a new (empty) pattern, copies over ports 'A' and 'B' from
`other_pat`, and creates additional ports 'in_A' and 'in_B' facing
in the opposite directions. This can be used to build a device which
can plug into `other_pat` (using the 'in_*' ports) but which does not
itself include `other_pat` as a subcomponent.
- `Builder.interface(other_builder, ...)` does the same thing as
`Builder.interface(other_builder.pattern, ...)` but also uses
`other_builder.library` as its library by default.
Examples: Adding to a pattern
=============================
- `my_device.plug(subdevice, {'A': 'C', 'B': 'B'}, map_out={'D': 'myport'})`
instantiates `subdevice` into `my_device`, plugging ports 'A' and 'B'
of `my_device` into ports 'C' and 'B' of `subdevice`. The connected ports
are removed and any unconnected ports from `subdevice` are added to
`my_device`. Port 'D' of `subdevice` (unconnected) is renamed to 'myport'.
- `my_device.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport'
of `my_device`. If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out`
argument is provided, and the `inherit_name` argument is not explicitly
set to `False`, the unconnected port of `wire` is automatically renamed to
'myport'. This allows easy extension of existing ports without changing
their names or having to provide `map_out` each time `plug` is called.
- `my_device.place(pad, offset=(10, 10), rotation=pi / 2, port_map={'A': 'gnd'})`
instantiates `pad` at the specified (x, y) offset and with the specified
rotation, adding its ports to those of `my_device`. Port 'A' of `pad` is
renamed to 'gnd' so that further routing can use this signal or net name
rather than the port name on the original `pad` device.
"""
__slots__ = ('pattern', 'library', '_dead')
pattern: Pattern
""" Layout of this device """
library: ILibrary
"""
Library from which patterns should be referenced
"""
_dead: bool
""" If True, plug()/place() are skipped (for debugging)"""
@property
def ports(self) -> dict[str, Port]:
return self.pattern.ports
@ports.setter
def ports(self, value: dict[str, Port]) -> None:
self.pattern.ports = value
def __init__(
self,
library: ILibrary,
*,
pattern: Pattern | None = None,
ports: str | Mapping[str, Port] | None = None,
name: str | None = None,
) -> None:
"""
Args:
library: The library from which referenced patterns will be taken
pattern: The pattern which will be modified by subsequent operations.
If `None` (default), a new pattern is created.
ports: Allows specifying the initial set of ports, if `pattern` does
not already have any ports (or is not provided). May be a string,
in which case it is interpreted as a name in `library`.
Default `None` (no ports).
name: If specified, `library[name]` is set to `self.pattern`.
"""
self._dead = False
self.library = library
if pattern is not None:
self.pattern = pattern
else:
self.pattern = Pattern()
if ports is not None:
if self.pattern.ports:
raise BuildError('Ports supplied for pattern with pre-existing ports!')
if isinstance(ports, str):
ports = library.abstract(ports).ports
self.pattern.ports.update(copy.deepcopy(dict(ports)))
if name is not None:
library[name] = self.pattern
@classmethod
def interface(
cls: type['Builder'],
source: PortList | Mapping[str, Port] | str,
*,
library: ILibrary | None = None,
in_prefix: str = 'in_',
out_prefix: str = '',
port_map: dict[str, str] | Sequence[str] | None = None,
name: str | None = None,
) -> 'Builder':
"""
Wrapper for `Pattern.interface()`, which returns a Builder instead.
Args:
source: A collection of ports (e.g. Pattern, Builder, or dict)
from which to create the interface. May be a pattern name if
`library` is provided.
library: Library from which existing patterns should be referenced,
and to which the new one should be added (if named). If not provided,
`source.library` must exist and will be used.
in_prefix: Prepended to port names for newly-created ports with
reversed directions compared to the current device.
out_prefix: Prepended to port names for ports which are directly
copied from the current device.
port_map: Specification for ports to copy into the new device:
- If `None`, all ports are copied.
- If a sequence, only the listed ports are copied
- If a mapping, the listed ports (keys) are copied and
renamed (to the values).
Returns:
The new builder, with an empty pattern and 2x as many ports as
listed in port_map.
Raises:
`PortError` if `port_map` contains port names not present in the
current device.
`PortError` if applying the prefixes results in duplicate port
names.
"""
if library is None:
if hasattr(source, 'library') and isinstance(source.library, ILibrary):
library = source.library
else:
raise BuildError('No library was given, and `source` does not provide one either.')
if isinstance(source, str):
source = library.abstract(source).ports
pat = Pattern.interface(source, in_prefix=in_prefix, out_prefix=out_prefix, port_map=port_map)
new = Builder(library=library, pattern=pat, name=name)
return new
@wraps(Pattern.label)
def label(self, *args, **kwargs) -> Self:
self.pattern.label(*args, **kwargs)
return self
@wraps(Pattern.ref)
def ref(self, *args, **kwargs) -> Self:
self.pattern.ref(*args, **kwargs)
return self
@wraps(Pattern.polygon)
def polygon(self, *args, **kwargs) -> Self:
self.pattern.polygon(*args, **kwargs)
return self
@wraps(Pattern.rect)
def rect(self, *args, **kwargs) -> Self:
self.pattern.rect(*args, **kwargs)
return self
# Note: We're a superclass of `Pather`, where path() means something different...
#@wraps(Pattern.path)
#def path(self, *args, **kwargs) -> Self:
# self.pattern.path(*args, **kwargs)
# return self
def plug(
self,
other: Abstract | str | Pattern | TreeView,
map_in: dict[str, str],
map_out: dict[str, str | None] | None = None,
*,
mirrored: bool = False,
inherit_name: bool = True,
set_rotation: bool | None = None,
append: bool = False,
) -> Self:
"""
Wrapper around `Pattern.plug` which allows a string for `other`.
The `Builder`'s library is used to dereference the string (or `Abstract`, if
one is passed with `append=True`). If a `TreeView` is passed, it is first
added into `self.library`.
Args:
other: An `Abstract`, string, `Pattern`, or `TreeView` describing the
device to be instantiated. If it is a `TreeView`, it is first
added into `self.library`, after which the topcell is plugged;
an equivalent statement is `self.plug(self.library << other, ...)`.
map_in: dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in `other`.
mirrored: Enables mirroring `other` across the x axis prior to
connecting any ports.
inherit_name: If `True`, and `map_in` specifies only a single port,
and `map_out` is `None`, and `other` has only two ports total,
then automatically renames the output port of `other` to the
name of the port from `self` that appears in `map_in`. This
makes it easy to extend a device with simple 2-port devices
(e.g. wires) without providing `map_out` each time `plug` is
called. See "Examples" above for more info. Default `True`.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
Returns:
self
Raises:
`PortError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other.ports`.
`PortError` if there are any duplicate names after `map_in` and `map_out`
are applied.
`PortError` if the specified port mapping is not achievable (the ports
do not line up)
"""
if self._dead:
logger.error('Skipping plug() since device is dead')
return self
if not isinstance(other, str | Abstract | Pattern):
# We got a Tree; add it into self.library and grab an Abstract for it
other = self.library << other
if isinstance(other, str):
other = self.library.abstract(other)
if append and isinstance(other, Abstract):
other = self.library[other.name]
self.pattern.plug(
other=other,
map_in=map_in,
map_out=map_out,
mirrored=mirrored,
inherit_name=inherit_name,
set_rotation=set_rotation,
append=append,
)
return self
def place(
self,
other: Abstract | str | Pattern | TreeView,
*,
offset: ArrayLike = (0, 0),
rotation: float = 0,
pivot: ArrayLike = (0, 0),
mirrored: bool = False,
port_map: dict[str, str | None] | None = None,
skip_port_check: bool = False,
append: bool = False,
) -> Self:
"""
Wrapper around `Pattern.place` which allows a string or `TreeView` for `other`.
The `Builder`'s library is used to dereference the string (or `Abstract`, if
one is passed with `append=True`). If a `TreeView` is passed, it is first
added into `self.library`.
Args:
other: An `Abstract`, string, `Pattern`, or `TreeView` describing the
device to be instantiated. If it is a `TreeView`, it is first
added into `self.library`, after which the topcell is placed;
an equivalent statement is `self.place(self.library << other, ...)`.
offset: Offset at which to place the instance. Default (0, 0).
rotation: Rotation applied to the instance before placement. Default 0.
pivot: Rotation is applied around this pivot point (default (0, 0)).
Rotation is applied prior to translation (`offset`).
mirrored: Whether the instance should be mirrored across the x axis.
Mirroring is applied before translation and rotation.
port_map: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in the instantiated device. New names can be
`None`, which will delete those ports.
skip_port_check: Can be used to skip the internal call to `check_ports`,
in case it has already been performed elsewhere.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
Returns:
self
Raises:
`PortError` if any ports specified in `port_map` do not
exist in `self.ports` or `other.ports`.
`PortError` if there are any duplicate names after `port_map`
is applied.
"""
if self._dead:
logger.error('Skipping place() since device is dead')
return self
if not isinstance(other, str | Abstract | Pattern):
# We got a Tree; add it into self.library and grab an Abstract for it
other = self.library << other
if isinstance(other, str):
other = self.library.abstract(other)
if append and isinstance(other, Abstract):
other = self.library[other.name]
self.pattern.place(
other=other,
offset=offset,
rotation=rotation,
pivot=pivot,
mirrored=mirrored,
port_map=port_map,
skip_port_check=skip_port_check,
append=append,
)
return self
def translate(self, offset: ArrayLike) -> Self:
"""
Translate the pattern and all ports.
Args:
offset: (x, y) distance to translate by
Returns:
self
"""
self.pattern.translate_elements(offset)
return self
def rotate_around(self, pivot: ArrayLike, angle: float) -> Self:
"""
Rotate the pattern and all ports.
Args:
angle: angle (radians, counterclockwise) to rotate by
pivot: location to rotate around
Returns:
self
"""
self.pattern.rotate_around(pivot, angle)
for port in self.ports.values():
port.rotate_around(pivot, angle)
return self
def mirror(self, axis: int = 0) -> Self:
"""
Mirror the pattern and all ports across the specified axis.
Args:
axis: Axis to mirror across (x=0, y=1)
Returns:
self
"""
self.pattern.mirror(axis)
return self
def set_dead(self) -> Self:
"""
Disallows further changes through `plug()` or `place()`.
This is meant for debugging:
```
dev.plug(a, ...)
dev.set_dead() # added for debug purposes
dev.plug(b, ...) # usually raises an error, but now skipped
dev.plug(c, ...) # also skipped
dev.pattern.visualize() # shows the device as of the set_dead() call
```
Returns:
self
"""
self._dead = True
return self
def __repr__(self) -> str:
s = f'<Builder {self.pattern} L({len(self.library)})>'
return s
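A minimal usage sketch assembled from the docstring examples above (my own, not from the file: `some_wire_pattern` is a hypothetical two-port `Pattern` with ports 'A' and 'B', and `Library`/`Port` are assumed to be imported from masque):
lib = Library()
lib['wire'] = some_wire_pattern                 # hypothetical two-port Pattern ('A' and 'B')
top = Builder(lib, name='top', ports={'in': Port((0, 0), rotation=0)})
top.plug('wire', {'in': 'A'})                   # 'B' is auto-renamed to 'in' (inherit_name)
top.place('wire', offset=(100, 0), port_map={'A': 'gnd', 'B': None})   # drop port 'B'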


@ -1,764 +0,0 @@
from typing import Dict, Iterable, List, Tuple, Union, TypeVar, Any, Iterator, Optional, Sequence
from typing import overload, KeysView, ValuesView
import copy
import warnings
import traceback
import logging
from collections import Counter
import numpy
from numpy import pi
from numpy.typing import ArrayLike, NDArray
from ..pattern import Pattern
from ..subpattern import SubPattern
from ..traits import PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable
from ..utils import AutoSlots, rotation_matrix_2d
from ..error import DeviceError
logger = logging.getLogger(__name__)
P = TypeVar('P', bound='Port')
D = TypeVar('D', bound='Device')
O = TypeVar('O', bound='Device')
class Port(PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable, metaclass=AutoSlots):
"""
A point at which a `Device` can be snapped to another `Device`.
Each port has an `offset` ((x, y) position) and may also have a
`rotation` (orientation) and a `ptype` (port type).
The `rotation` is an angle, in radians, measured counterclockwise
from the +x axis, pointing inwards into the device which owns the port.
The rotation may be set to `None`, indicating that any orientation is
allowed (e.g. for a DC electrical port). It is stored modulo 2pi.
The `ptype` is an arbitrary string, default of `unk` (unknown).
"""
__slots__ = ('ptype', '_rotation')
_rotation: Optional[float]
""" radians counterclockwise from +x, pointing into device body.
Can be `None` to signify undirected port """
ptype: str
""" Port types must match to be plugged together if both are non-zero """
def __init__(
self,
offset: ArrayLike,
rotation: Optional[float],
ptype: str = 'unk',
) -> None:
self.offset = offset
self.rotation = rotation
self.ptype = ptype
@property
def rotation(self) -> Optional[float]:
""" Rotation, radians counterclockwise, pointing into device body. Can be None. """
return self._rotation
@rotation.setter
def rotation(self, val: float) -> None:
if val is None:
self._rotation = None
else:
if not numpy.size(val) == 1:
raise DeviceError('Rotation must be a scalar')
self._rotation = val % (2 * pi)
def get_bounds(self):
return numpy.vstack((self.offset, self.offset))
def set_ptype(self: P, ptype: str) -> P:
""" Chainable setter for `ptype` """
self.ptype = ptype
return self
def mirror(self: P, axis: int) -> P:
self.offset[1 - axis] *= -1
if self.rotation is not None:
self.rotation *= -1
self.rotation += axis * pi
return self
def rotate(self: P, rotation: float) -> P:
if self.rotation is not None:
self.rotation += rotation
return self
def set_rotation(self: P, rotation: Optional[float]) -> P:
self.rotation = rotation
return self
def __repr__(self) -> str:
if self.rotation is None:
rot = 'any'
else:
rot = str(numpy.rad2deg(self.rotation))
return f'<{self.offset}, {rot}, [{self.ptype}]>'
class Device(Copyable, Mirrorable):
"""
A `Device` is a combination of a `Pattern` with a set of named `Port`s
which can be used to "snap" devices together to make complex layouts.
`Device`s can be as simple as one or two ports (e.g. an electrical pad
or wire), but can also be used to build and represent a large routed
layout (e.g. a logical block with multiple I/O connections or even a
full chip).
For convenience, ports can be read out using square brackets:
- `device['A'] == Port((0, 0), 0)`
- `device[['A', 'B']] == {'A': Port((0, 0), 0), 'B': Port((0, 0), pi)}`
Examples: Creating a Device
===========================
- `Device(pattern, ports={'A': port_a, 'C': port_c})` uses an existing
pattern and defines some ports.
- `Device(name='my_dev_name', ports=None)` makes a new empty pattern with
default ports ('A' and 'B', in opposite directions, at (0, 0)).
- `my_device.build('my_layout')` makes a new pattern and instantiates
`my_device` in it with offset (0, 0) as a base for further building.
- `my_device.as_interface('my_component', port_map=['A', 'B'])` makes a new
(empty) pattern, copies over ports 'A' and 'B' from `my_device`, and
creates additional ports 'in_A' and 'in_B' facing in the opposite
directions. This can be used to build a device which can plug into
`my_device` (using the 'in_*' ports) but which does not itself include
`my_device` as a subcomponent.
Examples: Adding to a Device
============================
- `my_device.plug(subdevice, {'A': 'C', 'B': 'B'}, map_out={'D': 'myport'})`
instantiates `subdevice` into `my_device`, plugging ports 'A' and 'B'
of `my_device` into ports 'C' and 'B' of `subdevice`. The connected ports
are removed and any unconnected ports from `subdevice` are added to
`my_device`. Port 'D' of `subdevice` (unconnected) is renamed to 'myport'.
- `my_device.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport'
of `my_device`. If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out`
argument is provided, and the `inherit_name` argument is not explicitly
set to `False`, the unconnected port of `wire` is automatically renamed to
'myport'. This allows easy extension of existing ports without changing
their names or having to provide `map_out` each time `plug` is called.
- `my_device.place(pad, offset=(10, 10), rotation=pi / 2, port_map={'A': 'gnd'})`
instantiates `pad` at the specified (x, y) offset and with the specified
rotation, adding its ports to those of `my_device`. Port 'A' of `pad` is
renamed to 'gnd' so that further routing can use this signal or net name
rather than the port name on the original `pad` device.
"""
__slots__ = ('pattern', 'ports', '_dead')
pattern: Pattern
""" Layout of this device """
ports: Dict[str, Port]
""" Uniquely-named ports which can be used to snap to other Device instances"""
_dead: bool
""" If True, plug()/place() are skipped (for debugging)"""
def __init__(
self,
pattern: Optional[Pattern] = None,
ports: Optional[Dict[str, Port]] = None,
*,
name: Optional[str] = None,
) -> None:
"""
If `ports` is `None`, two default ports ('A' and 'B') are created.
Both are placed at (0, 0) and have default `ptype`, but 'A' has rotation 0
(attached devices will be placed to the left) and 'B' has rotation
pi (attached devices will be placed to the right).
"""
if pattern is not None:
if name is not None:
raise DeviceError('Only one of `pattern` and `name` may be specified')
self.pattern = pattern
else:
if name is None:
raise DeviceError('Must specify either `pattern` or `name`')
self.pattern = Pattern(name=name)
if ports is None:
self.ports = {
'A': Port([0, 0], rotation=0),
'B': Port([0, 0], rotation=pi),
}
else:
self.ports = copy.deepcopy(ports)
self._dead = False
@overload
def __getitem__(self, key: str) -> Port:
pass
@overload
def __getitem__(self, key: Union[List[str], Tuple[str], KeysView[str], ValuesView[str]]) -> Dict[str, Port]:
pass
def __getitem__(self, key: Union[str, Iterable[str]]) -> Union[Port, Dict[str, Port]]:
"""
For convenience, ports can be read out using square brackets:
- `device['A'] == Port((0, 0), 0)`
- `device[['A', 'B']] == {'A': Port((0, 0), 0),
'B': Port((0, 0), pi)}`
"""
if isinstance(key, str):
return self.ports[key]
else:
return {k: self.ports[k] for k in key}
def rename_ports(
self: D,
mapping: Dict[str, Optional[str]],
overwrite: bool = False,
) -> D:
"""
Renames ports as specified by `mapping`.
Ports can be explicitly deleted by mapping them to `None`.
Args:
mapping: Dict of `{'old_name': 'new_name'}` pairs. Names can be mapped
to `None` to perform an explicit deletion. `'new_name'` can also
overwrite an existing non-renamed port to implicitly delete it if
`overwrite` is set to `True`.
overwrite: Allows implicit deletion of ports if set to `True`; see `mapping`.
Returns:
self
"""
if not overwrite:
duplicates = (set(self.ports.keys()) - set(mapping.keys())) & set(mapping.values())
if duplicates:
raise DeviceError(f'Unrenamed ports would be overwritten: {duplicates}')
renamed = {mapping[k]: self.ports.pop(k) for k in mapping.keys()}
if None in renamed:
del renamed[None]
self.ports.update(renamed) # type: ignore
return self
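# Illustrative sketch (not part of the original file): rename 'A' to 'in' and delete 'B' outright.
#   dev.rename_ports({'A': 'in', 'B': None})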
def check_ports(
self: D,
other_names: Iterable[str],
map_in: Optional[Dict[str, str]] = None,
map_out: Optional[Dict[str, Optional[str]]] = None,
) -> D:
"""
Given the provided port mappings, check that:
- All of the ports specified in the mappings exist
- There are no duplicate port names after all the mappings are performed
Args:
other_names: List of port names being considered for inclusion into
`self.ports` (before mapping)
map_in: Dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: Dict of `{'old_name': 'new_name'}` mappings, specifying
new names for unconnected `other_names` ports.
Returns:
self
Raises:
`DeviceError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other_names`.
`DeviceError` if there are any duplicate names after `map_in` and `map_out`
are applied.
"""
if map_in is None:
map_in = {}
if map_out is None:
map_out = {}
other = set(other_names)
missing_inkeys = set(map_in.keys()) - set(self.ports.keys())
if missing_inkeys:
raise DeviceError(f'`map_in` keys not present in device: {missing_inkeys}')
missing_invals = set(map_in.values()) - other
if missing_invals:
raise DeviceError(f'`map_in` values not present in other device: {missing_invals}')
missing_outkeys = set(map_out.keys()) - other
if missing_outkeys:
raise DeviceError(f'`map_out` keys not present in other device: {missing_outkeys}')
orig_remaining = set(self.ports.keys()) - set(map_in.keys())
other_remaining = other - set(map_out.keys()) - set(map_in.values())
mapped_vals = set(map_out.values())
mapped_vals.discard(None)
conflicts_final = orig_remaining & (other_remaining | mapped_vals)
if conflicts_final:
raise DeviceError(f'Device ports conflict with existing ports: {conflicts_final}')
conflicts_partial = other_remaining & mapped_vals
if conflicts_partial:
raise DeviceError(f'`map_out` targets conflict with non-mapped outputs: {conflicts_partial}')
map_out_counts = Counter(map_out.values())
map_out_counts[None] = 0
conflicts_out = {k for k, v in map_out_counts.items() if v > 1}
if conflicts_out:
raise DeviceError(f'Duplicate targets in `map_out`: {conflicts_out}')
return self
def build(self, name: str) -> 'Device':
"""
Begin building a new device around an instance of the current device
(rather than modifying the current device).
Args:
name: A name for the new device
Returns:
The new `Device` object.
"""
pat = Pattern(name)
pat.addsp(self.pattern)
new = Device(pat, ports=self.ports)
return new
def as_interface(
self,
name: str,
in_prefix: str = 'in_',
out_prefix: str = '',
port_map: Optional[Union[Dict[str, str], Sequence[str]]] = None
) -> 'Device':
"""
Begin building a new device based on all or some of the ports in the
current device. Do not include the current device; instead use it
to define ports (the "interface") for the new device.
The ports specified by `port_map` (default: all ports) are copied to the
new device, and additional (input) ports are created facing in the
opposite directions. The specified `in_prefix` and `out_prefix` are
prepended to the port names to differentiate them.
By default, the flipped ports are given an 'in_' prefix and unflipped
ports keep their original names, enabling intuitive construction of
a device that will "plug into" the current device; the 'in_*' ports
are used for plugging the devices together while the original port
names are used for building the new device.
Another use-case could be to build the new device using the 'in_'
ports, creating a new device which could be used in place of the
current device.
Args:
name: Name for the new device
in_prefix: Prepended to port names for newly-created ports with
reversed directions compared to the current device.
out_prefix: Prepended to port names for ports which are directly
copied from the current device.
port_map: Specification for ports to copy into the new device:
- If `None`, all ports are copied.
- If a sequence, only the listed ports are copied
- If a mapping, the listed ports (keys) are copied and
renamed (to the values).
Returns:
The new device, with an empty pattern and 2x as many ports as
listed in port_map.
Raises:
`DeviceError` if `port_map` contains port names not present in the
current device.
`DeviceError` if applying the prefixes results in duplicate port
names.
"""
if port_map:
if isinstance(port_map, dict):
missing_inkeys = set(port_map.keys()) - set(self.ports.keys())
orig_ports = {port_map[k]: v for k, v in self.ports.items() if k in port_map}
else:
port_set = set(port_map)
missing_inkeys = port_set - set(self.ports.keys())
orig_ports = {k: v for k, v in self.ports.items() if k in port_set}
if missing_inkeys:
raise DeviceError(f'`port_map` keys not present in device: {missing_inkeys}')
else:
orig_ports = self.ports
ports_in = {f'{in_prefix}{name}': port.deepcopy().rotate(pi)
for name, port in orig_ports.items()}
ports_out = {f'{out_prefix}{name}': port.deepcopy()
for name, port in orig_ports.items()}
duplicates = set(ports_out.keys()) & set(ports_in.keys())
if duplicates:
raise DeviceError(f'Duplicate keys after prefixing, try a different prefix: {duplicates}')
new = Device(name=name, ports={**ports_in, **ports_out})
return new
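# Illustrative sketch (not part of the original file), mirroring the docstring example above:
# the result has ports 'A'/'B' plus reversed 'in_A'/'in_B', so wiring built off 'A'/'B' can
# later be plugged onto `my_device` via the 'in_*' ports without containing `my_device` itself.
#   interface_dev = my_device.as_interface('my_component', port_map=['A', 'B'])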
def plug(
self: D,
other: O,
map_in: Dict[str, str],
map_out: Optional[Dict[str, Optional[str]]] = None,
*,
mirrored: Tuple[bool, bool] = (False, False),
inherit_name: bool = True,
set_rotation: Optional[bool] = None,
) -> D:
"""
Instantiate the device `other` into the current device, connecting
the ports specified by `map_in` and renaming the unconnected
ports specified by `map_out`.
Examples:
=========
- `my_device.plug(subdevice, {'A': 'C', 'B': 'B'}, map_out={'D': 'myport'})`
instantiates `subdevice` into `my_device`, plugging ports 'A' and 'B'
of `my_device` into ports 'C' and 'B' of `subdevice`. The connected ports
are removed and any unconnected ports from `subdevice` are added to
`my_device`. Port 'D' of `subdevice` (unconnected) is renamed to 'myport'.
- `my_device.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport'
of `my_device`. If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out`
argument is provided, and the `inherit_name` argument is not explicitly
set to `False`, the unconnected port of `wire` is automatically renamed to
'myport'. This allows easy extension of existing ports without changing
their names or having to provide `map_out` each time `plug` is called.
Args:
other: A device to instantiate into the current device.
map_in: Dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: Dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in `other`.
mirrored: Enables mirroring `other` across the x or y axes prior
to connecting any ports.
inherit_name: If `True`, and `map_in` specifies only a single port,
and `map_out` is `None`, and `other` has only two ports total,
then automatically renames the output port of `other` to the
name of the port from `self` that appears in `map_in`. This
makes it easy to extend a device with simple 2-port devices
(e.g. wires) without providing `map_out` each time `plug` is
called. See "Examples" above for more info. Default `True`.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
Returns:
self
Raises:
`DeviceError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other_names`.
`DeviceError` if there are any duplicate names after `map_in` and `map_out`
are applied.
`DeviceError` if the specified port mapping is not achievable (the ports
do not line up)
"""
if self._dead:
logger.error('Skipping plug() since device is dead')
return self
if (inherit_name
and not map_out
and len(map_in) == 1
and len(other.ports) == 2):
out_port_name = next(iter(set(other.ports.keys()) - set(map_in.values())))
map_out = {out_port_name: next(iter(map_in.keys()))}
if map_out is None:
map_out = {}
map_out = copy.deepcopy(map_out)
self.check_ports(other.ports.keys(), map_in, map_out)
translation, rotation, pivot = self.find_transform(other, map_in, mirrored=mirrored,
set_rotation=set_rotation)
# get rid of plugged ports
for ki, vi in map_in.items():
del self.ports[ki]
map_out[vi] = None
self.place(other, offset=translation, rotation=rotation, pivot=pivot,
mirrored=mirrored, port_map=map_out, skip_port_check=True)
return self
def place(
self: D,
other: O,
*,
offset: ArrayLike = (0, 0),
rotation: float = 0,
pivot: ArrayLike = (0, 0),
mirrored: Tuple[bool, bool] = (False, False),
port_map: Optional[Dict[str, Optional[str]]] = None,
skip_port_check: bool = False,
) -> D:
"""
Instantiate the device `other` into the current device, adding its
ports to those of the current device (but not connecting any ports).
Mirroring is applied before rotation; translation (`offset`) is applied last.
Examples:
=========
- `my_device.place(pad, offset=(10, 10), rotation=pi / 2, port_map={'A': 'gnd'})`
instantiates `pad` at the specified (x, y) offset and with the specified
rotation, adding its ports to those of `my_device`. Port 'A' of `pad` is
renamed to 'gnd' so that further routing can use this signal or net name
rather than the port name on the original `pad` device.
Args:
other: A device to instantiate into the current device.
offset: Offset at which to place `other`. Default (0, 0).
rotation: Rotation applied to `other` before placement. Default 0.
pivot: Rotation is applied around this pivot point (default (0, 0)).
Rotation is applied prior to translation (`offset`).
mirrored: Whether `other` should be mirrored across the x and y axes.
Mirroring is applied before translation and rotation.
port_map: Dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in `other`. New names can be `None`, which will
delete those ports.
skip_port_check: Can be used to skip the internal call to `check_ports`,
in case it has already been performed elsewhere.
Returns:
self
Raises:
`DeviceError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other_names`.
`DeviceError` if there are any duplicate names after `map_in` and `map_out`
are applied.
"""
if self._dead:
logger.error('Skipping place() since device is dead')
return self
if port_map is None:
port_map = {}
if not skip_port_check:
self.check_ports(other.ports.keys(), map_in=None, map_out=port_map)
ports = {}
for name, port in other.ports.items():
new_name = port_map.get(name, name)
if new_name is None:
continue
ports[new_name] = port
for name, port in ports.items():
p = port.deepcopy()
p.mirror2d(mirrored)
p.rotate_around(pivot, rotation)
p.translate(offset)
self.ports[name] = p
sp = SubPattern(other.pattern, mirrored=mirrored)
sp.rotate_around(pivot, rotation)
sp.translate(offset)
self.pattern.subpatterns.append(sp)
return self
def find_transform(
self: D,
other: O,
map_in: Dict[str, str],
*,
mirrored: Tuple[bool, bool] = (False, False),
set_rotation: Optional[bool] = None,
) -> Tuple[NDArray[numpy.float64], float, NDArray[numpy.float64]]:
"""
Given a device `other` and a mapping `map_in` specifying port connections,
find the transform which will correctly align the specified ports.
Args:
other: a device
map_in: Dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
mirrored: Mirrors `other` across the x or y axes prior to
connecting any ports.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
Returns:
- The (x, y) translation (performed last)
- The rotation (radians, counterclockwise)
- The (x, y) pivot point for the rotation
The rotation should be performed before the translation.
"""
s_ports = self[map_in.keys()]
o_ports = other[map_in.values()]
s_offsets = numpy.array([p.offset for p in s_ports.values()])
o_offsets = numpy.array([p.offset for p in o_ports.values()])
s_types = [p.ptype for p in s_ports.values()]
o_types = [p.ptype for p in o_ports.values()]
s_rotations = numpy.array([p.rotation if p.rotation is not None else 0 for p in s_ports.values()])
o_rotations = numpy.array([p.rotation if p.rotation is not None else 0 for p in o_ports.values()])
s_has_rot = numpy.array([p.rotation is not None for p in s_ports.values()], dtype=bool)
o_has_rot = numpy.array([p.rotation is not None for p in o_ports.values()], dtype=bool)
has_rot = s_has_rot & o_has_rot
if mirrored[0]:
o_offsets[:, 1] *= -1
o_rotations *= -1
if mirrored[1]:
o_offsets[:, 0] *= -1
o_rotations *= -1
o_rotations += pi
type_conflicts = numpy.array([st != ot and st != 'unk' and ot != 'unk'
for st, ot in zip(s_types, o_types)])
if type_conflicts.any():
ports = numpy.where(type_conflicts)
msg = 'Ports have conflicting types:\n'
for nn, (k, v) in enumerate(map_in.items()):
if type_conflicts[nn]:
msg += f'{k} | {s_types[nn]}:{o_types[nn]} | {v}\n'
msg = ''.join(traceback.format_stack()) + '\n' + msg
warnings.warn(msg, stacklevel=2)
rotations = numpy.mod(s_rotations - o_rotations - pi, 2 * pi)
if not has_rot.any():
if set_rotation is None:
raise DeviceError('Must provide set_rotation if rotation is indeterminate')
rotations[:] = set_rotation
else:
rotations[~has_rot] = rotations[has_rot][0]
if not numpy.allclose(rotations[:1], rotations):
rot_deg = numpy.rad2deg(rotations)
msg = f'Port orientations do not match:\n'
for nn, (k, v) in enumerate(map_in.items()):
msg += f'{k} | {rot_deg[nn]:g} | {v}\n'
raise DeviceError(msg)
pivot = o_offsets[0].copy()
rotate_offsets_around(o_offsets, pivot, rotations[0])
translations = s_offsets - o_offsets
if not numpy.allclose(translations[:1], translations):
msg = f'Port translations do not match:\n'
for nn, (k, v) in enumerate(map_in.items()):
msg += f'{k} | {translations[nn]} | {v}\n'
raise DeviceError(msg)
return translations[0], rotations[0], o_offsets[0]
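# Illustrative sketch (not part of the original file): the returned transform is consumed by
# `place()` as mirror, then rotation about `pivot`, then translation, exactly as `plug()` does:
#   translation, rotation, pivot = self.find_transform(other, map_in)
#   self.place(other, offset=translation, rotation=rotation, pivot=pivot)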
def translate(self: D, offset: ArrayLike) -> D:
"""
Translate the pattern and all ports.
Args:
offset: (x, y) distance to translate by
Returns:
self
"""
self.pattern.translate_elements(offset)
for port in self.ports.values():
port.translate(offset)
return self
def rotate_around(self: D, pivot: ArrayLike, angle: float) -> D:
"""
Rotate the pattern and all ports around the given pivot point.
Args:
pivot: (x, y) point to rotate around
angle: rotation angle (radians, counterclockwise)
Returns:
self
"""
self.pattern.rotate_around(pivot, angle)
for port in self.ports.values():
port.rotate_around(pivot, angle)
return self
def mirror(self: D, axis: int) -> D:
"""
Mirror the pattern and all ports across the specified axis.
Args:
axis: Axis to mirror across (x=0, y=1)
Returns:
self
"""
self.pattern.mirror(axis)
for p in self.ports.values():
p.mirror(axis)
return self
def set_dead(self: D) -> D:
"""
Disallows further changes through `plug()` or `place()`.
This is meant for debugging:
```
dev.plug(a, ...)
dev.set_dead() # added for debug purposes
dev.plug(b, ...) # usually raises an error, but now skipped
dev.plug(c, ...) # also skipped
dev.pattern.visualize() # shows the device as of the set_dead() call
```
Returns:
self
"""
self._dead = True
return self
def rename(self: D, name: str) -> D:
"""
Renames the pattern and returns the device
Args:
name: The new name
Returns:
self
"""
self.pattern.name = name
return self
def __repr__(self) -> str:
s = f'<Device {self.pattern} ['
for name, port in self.ports.items():
s += f'\n\t{name}: {port}'
s += ']>'
return s
def rotate_offsets_around(
offsets: NDArray[numpy.float64],
pivot: NDArray[numpy.float64],
angle: float,
) -> NDArray[numpy.float64]:
offsets -= pivot
offsets[:] = (rotation_matrix_2d(angle) @ offsets.T).T
offsets += pivot
return offsets
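For orientation, here is a minimal sketch of how the removed `Device`/`Port` API above was typically used; `pad` and `wire` are hypothetical two-port devices (with the default 'A'/'B' ports), not taken from this diff.
```
from numpy import pi

lay = pad.build('my_layout')        # new Device wrapping an instance of `pad`; ports are copied
lay.plug(wire, {'B': 'A'})          # snap `wire` port 'A' onto `lay` port 'B'; the far port inherits the name 'B'
lay.place(wire, offset=(0, 50), rotation=pi / 2, port_map={'A': 'tap', 'B': None})
lay.rename_ports({'B': 'out'})      # rename the extended port
```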

masque/builder/pather.py Normal file

@@ -0,0 +1,694 @@
"""
Manual wire/waveguide routing (`Pather`)
"""
from typing import Self
from collections.abc import Sequence, MutableMapping, Mapping
import copy
import logging
from pprint import pformat
import numpy
from numpy import pi
from numpy.typing import ArrayLike
from ..pattern import Pattern
from ..library import ILibrary, SINGLE_USE_PREFIX
from ..error import PortError, BuildError
from ..ports import PortList, Port
from ..abstract import Abstract
from ..utils import SupportsBool, rotation_matrix_2d
from .tools import Tool
from .utils import ell
from .builder import Builder
logger = logging.getLogger(__name__)
class Pather(Builder):
"""
An extension of `Builder` which provides functionality for routing and attaching
single-use patterns (e.g. wires or waveguides) and bundles / buses of such patterns.
`Pather` is mostly concerned with calculating how long each wire should be. It calls
out to `Tool.path` functions provided by subclasses of `Tool` to build the actual patterns.
`Tool`s are assigned on a per-port basis and stored in `.tools`; a key of `None` represents
a "default" `Tool` used for all ports which do not have a port-specific `Tool` assigned.
Examples: Creating a Pather
===========================
- `Pather(library, tools=my_tool)` makes an empty pattern with no ports. The pattern
is not added into `library` and must later be added with e.g.
`library['mypat'] = pather.pattern`.
The default wire/waveguide generating tool for all ports is set to `my_tool`.
- `Pather(library, ports={'in': Port(...), 'out': ...}, name='mypat', tools=my_tool)`
makes an empty pattern, adds the given ports, and places it into `library`
under the name `'mypat'`. The default wire/waveguide generating tool
for all ports is set to `my_tool`
- `Pather(..., tools={'in': top_metal_40um, 'out': bottom_metal_1um, None: my_tool})`
assigns specific tools to individual ports, and `my_tool` as a default for ports
which are not specified.
- `Pather.interface(other_pat, port_map=['A', 'B'], library=library, tools=my_tool)`
makes a new (empty) pattern, copies over ports 'A' and 'B' from
`other_pat`, and creates additional ports 'in_A' and 'in_B' facing
in the opposite directions. This can be used to build a device which
can plug into `other_pat` (using the 'in_*' ports) but which does not
itself include `other_pat` as a subcomponent.
- `Pather.interface(other_pather, ...)` does the same thing as
`Builder.interface(other_builder.pattern, ...)` but also uses
`other_builder.library` as its library by default.
Examples: Adding to a pattern
=============================
- `pather.path('my_port', ccw=True, distance)` creates a "wire" for which the output
port is `distance` units away along the axis of `'my_port'` and rotated 90 degrees
counterclockwise (since `ccw=True`) relative to `'my_port'`. The wire is `plug`ged
into the existing `'my_port'`, causing the port to move to the wire's output.
There is no formal guarantee about how far off-axis the output will be located;
there may be a significant width to the bend that is used to accomplish the 90 degree
turn. However, an error is raised if `distance` is too small to fit the bend.
- `pather.path('my_port', ccw=None, distance)` creates a straight wire with a length
of `distance` and `plug`s it into `'my_port'`.
- `pather.path_to('my_port', ccw=False, position)` creates a wire which starts at
`'my_port'` and has its output at the specified `position`, pointing 90 degrees
clockwise relative to the input. Again, the off-axis position or distance to the
output is not specified, so `position` takes the form of a single coordinate. To
ease debugging, position may be specified as `x=position` or `y=position` and an
error will be raised if the wrong coordinate is given.
- `pather.mpath(['A', 'B', 'C'], ..., spacing=spacing)` is a superset of `path`
and `path_to` which can act on multiple ports simultaneously. Each port's wire is
generated using its own `Tool` (or the default tool if left unspecified).
The output ports are spaced out by `spacing` along the input ports' axis, unless
`ccw=None` is specified (i.e. no bends) in which case they all end at the same
destination coordinate.
- `pather.plug(wire, {'myport': 'A'})` places port 'A' of `wire` at 'myport'
of `pather.pattern`. If `wire` has only two ports (e.g. 'A' and 'B'), no `map_out`
argument is provided, and the `inherit_name` argument is not explicitly
set to `False`, the unconnected port of `wire` is automatically renamed to
'myport'. This allows easy extension of existing ports without changing
their names or having to provide `map_out` each time `plug` is called.
- `pather.place(pad, offset=(10, 10), rotation=pi / 2, port_map={'A': 'gnd'})`
instantiates `pad` at the specified (x, y) offset and with the specified
rotation, adding its ports to those of `pather.pattern`. Port 'A' of `pad` is
renamed to 'gnd' so that further routing can use this signal or net name
rather than the port name on the original `pad` device.
- `pather.retool(tool)` or `pather.retool(tool, ['in', 'out', None])` can change
which tool is used for the given ports (or as the default tool). Useful
when placing vias or using multiple waveguide types along a route.
"""
__slots__ = ('tools',)
library: ILibrary
"""
Library from which existing patterns should be referenced, and to which
new ones should be added
"""
tools: dict[str | None, Tool]
"""
Tool objects are used to dynamically generate new single-use `Pattern`s
(e.g. wires or waveguides) to be plugged into this device. A key of `None`
indicates the default `Tool`.
"""
def __init__(
self,
library: ILibrary,
*,
pattern: Pattern | None = None,
ports: str | Mapping[str, Port] | None = None,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
name: str | None = None,
) -> None:
"""
Args:
library: The library from which referenced patterns will be taken,
and where new patterns (e.g. generated by the `tools`) will be placed.
pattern: The pattern which will be modified by subsequent operations.
If `None` (default), a new pattern is created.
ports: Allows specifying the initial set of ports, if `pattern` does
not already have any ports (or is not provided). May be a string,
in which case it is interpreted as a name in `library`.
Default `None` (no ports).
tools: A mapping of {port: tool} which specifies what `Tool` should be used
to generate waveguide or wire segments when `path`/`path_to`/`mpath`
are called. Relies on `Tool.path` implementations.
name: If specified, `library[name]` is set to `self.pattern`.
"""
self._dead = False
self.library = library
if pattern is not None:
self.pattern = pattern
else:
self.pattern = Pattern()
if ports is not None:
if self.pattern.ports:
raise BuildError('Ports supplied for pattern with pre-existing ports!')
if isinstance(ports, str):
ports = library.abstract(ports).ports
self.pattern.ports.update(copy.deepcopy(dict(ports)))
if name is not None:
library[name] = self.pattern
if tools is None:
self.tools = {}
elif isinstance(tools, Tool):
self.tools = {None: tools}
else:
self.tools = dict(tools)
@classmethod
def from_builder(
cls: type['Pather'],
builder: Builder,
*,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
) -> 'Pather':
"""
Construct a `Pather` by adding tools to a `Builder`.
Args:
builder: Builder to turn into a Pather
tools: Tools for the `Pather`
Returns:
A new Pather object, using `builder.library` and `builder.pattern`.
"""
new = Pather(library=builder.library, tools=tools, pattern=builder.pattern)
return new
@classmethod
def interface(
cls: type['Pather'],
source: PortList | Mapping[str, Port] | str,
*,
library: ILibrary | None = None,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
in_prefix: str = 'in_',
out_prefix: str = '',
port_map: dict[str, str] | Sequence[str] | None = None,
name: str | None = None,
) -> 'Pather':
"""
Wrapper for `Pattern.interface()`, which returns a Pather instead.
Args:
source: A collection of ports (e.g. Pattern, Builder, or dict)
from which to create the interface. May be a pattern name if
`library` is provided.
library: Library from which existing patterns should be referenced,
and to which the new one should be added (if named). If not provided,
`source.library` must exist and will be used.
tools: `Tool`s which will be used by the pather for generating new wires
or waveguides (via `path`/`path_to`/`mpath`).
in_prefix: Prepended to port names for newly-created ports with
reversed directions compared to the current device.
out_prefix: Prepended to port names for ports which are directly
copied from the current device.
port_map: Specification for ports to copy into the new device:
- If `None`, all ports are copied.
- If a sequence, only the listed ports are copied
- If a mapping, the listed ports (keys) are copied and
renamed (to the values).
Returns:
The new pather, with an empty pattern and 2x as many ports as
listed in port_map.
Raises:
`PortError` if `port_map` contains port names not present in the
current device.
`PortError` if applying the prefixes results in duplicate port
names.
"""
if library is None:
if hasattr(source, 'library') and isinstance(source.library, ILibrary):
library = source.library
else:
raise BuildError('No library provided (and not present in `source.library`)')
if tools is None and hasattr(source, 'tools') and isinstance(source.tools, dict):
tools = source.tools
if isinstance(source, str):
source = library.abstract(source).ports
pat = Pattern.interface(source, in_prefix=in_prefix, out_prefix=out_prefix, port_map=port_map)
new = Pather(library=library, pattern=pat, name=name, tools=tools)
return new
def __repr__(self) -> str:
s = f'<Pather {self.pattern} L({len(self.library)}) {pformat(self.tools)}>'
return s
def retool(
self,
tool: Tool,
keys: str | Sequence[str | None] | None = None,
) -> Self:
"""
Update the `Tool` which will be used when generating `Pattern`s for the ports
given by `keys`.
Args:
tool: The new `Tool` to use for the given ports.
keys: Which ports the tool should apply to. `None` indicates the default tool,
used when there is no matching entry in `self.tools` for the port in question.
Returns:
self
"""
if keys is None or isinstance(keys, str):
self.tools[keys] = tool
else:
for key in keys:
self.tools[key] = tool
return self
def path(
self,
portspec: str,
ccw: SupportsBool | None,
length: float,
*,
tool_port_names: tuple[str, str] = ('A', 'B'),
plug_into: str | None = None,
**kwargs,
) -> Self:
"""
Create a "wire"/"waveguide" and `plug` it into the port `portspec`, with the aim
of traveling exactly `length` distance.
The wire will travel `length` distance along the port's axis, and an unspecified
(tool-dependent) distance in the perpendicular direction. The output port will
be rotated (or not) based on the `ccw` parameter.
Args:
portspec: The name of the port into which the wire will be plugged.
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
length: The total distance from input to output, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
tool_port_names: The names of the ports on the generated pattern. It is unlikely
that you will need to change these. The first port is the input (to be
connected to `portspec`).
plug_into: If not None, attempts to plug the wire's output port into the provided
port on `self`.
Returns:
self
Raises:
BuildError if `length` is too small to fit the bend (if a bend is present).
LibraryError if no valid name could be picked for the pattern.
"""
if self._dead:
logger.error('Skipping path() since device is dead')
return self
tool = self.tools.get(portspec, self.tools[None])
in_ptype = self.pattern[portspec].ptype
tree = tool.path(ccw, length, in_ptype=in_ptype, port_names=tool_port_names, **kwargs)
abstract = self.library << tree
if plug_into is not None:
output = {plug_into: tool_port_names[1]}
else:
output = {}
return self.plug(abstract, {portspec: tool_port_names[0], **output})
def path_to(
self,
portspec: str,
ccw: SupportsBool | None,
position: float | None = None,
*,
x: float | None = None,
y: float | None = None,
tool_port_names: tuple[str, str] = ('A', 'B'),
plug_into: str | None = None,
**kwargs,
) -> Self:
"""
Create a "wire"/"waveguide" and `plug` it into the port `portspec`, with the aim
of ending exactly at a target position.
The wire will travel so that the output port will be placed at exactly the target
position along the input port's axis. There can be an unspecified (tool-dependent)
offset in the perpendicular direction. The output port will be rotated (or not)
based on the `ccw` parameter.
Args:
portspec: The name of the port into which the wire will be plugged.
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
position: The final port position, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
Only one of `position`, `x`, and `y` may be specified.
x: The final port position along the x axis.
`portspec` must refer to a horizontal port if `x` is passed, otherwise a
BuildError will be raised.
y: The final port position along the y axis.
`portspec` must refer to a vertical port if `y` is passed, otherwise a
BuildError will be raised.
tool_port_names: The names of the ports on the generated pattern. It is unlikely
that you will need to change these. The first port is the input (to be
connected to `portspec`).
plug_into: If not None, attempts to plug the wire's output port into the provided
port on `self`.
Returns:
self
Raises:
BuildError if `position`, `x`, or `y` is too close to fit the bend (if a bend
is present).
BuildError if `x` or `y` is specified but does not match the axis of `portspec`.
BuildError if more than one of `x`, `y`, and `position` is specified.
"""
if self._dead:
logger.error('Skipping path_to() since device is dead')
return self
pos_count = sum(vv is not None for vv in (position, x, y))
if pos_count > 1:
raise BuildError('Only one of `position`, `x`, and `y` may be specified at once')
if pos_count < 1:
raise BuildError('One of `position`, `x`, and `y` must be specified')
port = self.pattern[portspec]
if port.rotation is None:
raise PortError(f'Port {portspec} has no rotation and cannot be used for path_to()')
if not numpy.isclose(port.rotation % (pi / 2), 0):
raise BuildError('path_to was asked to route from non-manhattan port')
is_horizontal = numpy.isclose(port.rotation % pi, 0)
if is_horizontal:
if y is not None:
raise BuildError('Asked to path to y-coordinate, but port is horizontal')
if position is None:
position = x
else:
if x is not None:
raise BuildError('Asked to path to x-coordinate, but port is vertical')
if position is None:
position = y
x0, y0 = port.offset
if is_horizontal:
if numpy.sign(numpy.cos(port.rotation)) == numpy.sign(position - x0):
raise BuildError(f'path_to routing to behind source port: x0={x0:g} to {position:g}')
length = numpy.abs(position - x0)
else:
if numpy.sign(numpy.sin(port.rotation)) == numpy.sign(position - y0):
raise BuildError(f'path_to routing to behind source port: y0={y0:g} to {position:g}')
length = numpy.abs(position - y0)
return self.path(
portspec,
ccw,
length,
tool_port_names=tool_port_names,
plug_into=plug_into,
**kwargs,
)
def path_into(
self,
portspec_src: str,
portspec_dst: str,
*,
tool_port_names: tuple[str, str] = ('A', 'B'),
out_ptype: str | None = None,
plug_destination: bool = True,
**kwargs,
) -> Self:
"""
Create a "wire"/"waveguide" and traveling between the ports `portspec_src` and
`portspec_dst`, and `plug` it into both (or just the source port).
Only unambiguous scenarios are allowed:
- Straight connector between facing ports
- Single 90 degree bend
- Jog between facing ports
(jog is done as late as possible, i.e. only 2 L-shaped segments are used)
By default, the destination's `ptype` will be used as the `out_ptype` for the
wire, and the `portspec_dst` will be plugged (i.e. removed).
Args:
portspec_src: The name of the starting port into which the wire will be plugged.
portspec_dst: The name of the destination port.
tool_port_names: The names of the ports on the generated pattern. It is unlikely
that you will need to change these. The first port is the input (to be
connected to `portspec`).
out_ptype: Passed to the pathing tool in order to specify the desired port type
to be generated at the destination end. If `None` (default), the destination
port's `ptype` will be used.
Returns:
self
Raises:
PortError if either port does not have a specified rotation.
BuildError if an invalid port configuration is encountered:
- Non-manhattan ports
- U-bend
- Destination too close to (or behind) source
"""
if self._dead:
logger.error('Skipping path_into() since device is dead')
return self
port_src = self.pattern[portspec_src]
port_dst = self.pattern[portspec_dst]
if out_ptype is None:
out_ptype = port_dst.ptype
if port_src.rotation is None:
raise PortError(f'Port {portspec_src} has no rotation and cannot be used for path_into()')
if port_dst.rotation is None:
raise PortError(f'Port {portspec_dst} has no rotation and cannot be used for path_into()')
if not numpy.isclose(port_src.rotation % (pi / 2), 0):
raise BuildError('path_into was asked to route from non-manhattan port')
if not numpy.isclose(port_dst.rotation % (pi / 2), 0):
raise BuildError('path_into was asked to route to non-manhattan port')
src_is_horizontal = numpy.isclose(port_src.rotation % pi, 0)
dst_is_horizontal = numpy.isclose(port_dst.rotation % pi, 0)
xs, ys = port_src.offset
xd, yd = port_dst.offset
angle = (port_dst.rotation - port_src.rotation) % (2 * pi)
src_ne = port_src.rotation % (2 * pi) > (3 * pi / 4) # path from src will go north or east
def get_jog(ccw: SupportsBool, length: float) -> float:
tool = self.tools.get(portspec_src, self.tools[None])
in_ptype = 'unk' # Could use port_src.ptype, but we're assuming this is after one bend already...
tree2 = tool.path(ccw, length, in_ptype=in_ptype, port_names=('A', 'B'), out_ptype=out_ptype, **kwargs)
top2 = tree2.top_pattern()
jog = rotation_matrix_2d(top2['A'].rotation) @ (top2['B'].offset - top2['A'].offset)
return jog[1]
dst_extra_args = {'out_ptype': out_ptype}
if plug_destination:
dst_extra_args['plug_into'] = portspec_dst
src_args = {**kwargs, 'tool_port_names': tool_port_names}
dst_args = {**src_args, **dst_extra_args}
if src_is_horizontal and not dst_is_horizontal:
# single bend should suffice
self.path_to(portspec_src, angle > pi, x=xd, **src_args)
self.path_to(portspec_src, None, y=yd, **dst_args)
elif dst_is_horizontal and not src_is_horizontal:
# single bend should suffice
self.path_to(portspec_src, angle > pi, y=yd, **src_args)
self.path_to(portspec_src, None, x=xd, **dst_args)
elif numpy.isclose(angle, pi):
if src_is_horizontal and ys == yd:
# straight connector
self.path_to(portspec_src, None, x=xd, **dst_args)
elif not src_is_horizontal and xs == xd:
# straight connector
self.path_to(portspec_src, None, y=yd, **dst_args)
elif src_is_horizontal:
# figure out how much x our y-segment (2nd) takes up, then path based on that
y_len = numpy.abs(yd - ys)
ccw2 = src_ne != (yd > ys)
jog = get_jog(ccw2, y_len) * numpy.sign(xd - xs)
self.path_to(portspec_src, not ccw2, x=xd - jog, **src_args)
self.path_to(portspec_src, ccw2, y=yd, **dst_args)
else:
# figure out how much y our x-segment (2nd) takes up, then path based on that
x_len = numpy.abs(xd - xs)
ccw2 = src_ne != (xd < xs)
jog = get_jog(ccw2, x_len) * numpy.sign(yd - ys)
self.path_to(portspec_src, not ccw2, y=yd - jog, **src_args)
self.path_to(portspec_src, ccw2, x=xd, **dst_args)
elif numpy.isclose(angle, 0):
raise BuildError('Don\'t know how to route a U-bend at this time!')
else:
raise BuildError(f'Don\'t know how to route ports with relative angle {angle}')
return self
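# Illustrative sketch (not part of the original file; port names are hypothetical): route from
# 'west_out' into the existing port 'east_in', consuming both ports.
#   pather.path_into('west_out', 'east_in')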
def mpath(
self,
portspec: str | Sequence[str],
ccw: SupportsBool | None,
*,
spacing: float | ArrayLike | None = None,
set_rotation: float | None = None,
tool_port_names: tuple[str, str] = ('A', 'B'),
force_container: bool = False,
base_name: str = SINGLE_USE_PREFIX + 'mpath',
**kwargs,
) -> Self:
"""
`mpath` is a superset of `path` and `path_to` which can act on bundles or buses
of "wires or "waveguides".
The wires will travel so that the output ports will be placed at well-defined
locations along the axis of their input ports, but may have arbitrary (tool-
dependent) offsets in the perpendicular direction.
If `ccw` is not `None`, the wire bundle will turn 90 degrees in either the
clockwise (`ccw=False`) or counter-clockwise (`ccw=True`) direction. Within the
bundle, the center-to-center wire spacings after the turn are set by `spacing`,
which is required when `ccw` is not `None`. The final position of the bundle as a
whole can be set in a number of ways:
=A>---------------------------V turn direction: `ccw=False`
=B>-------------V |
=C>-----------------------V |
=D=>----------------V |
|
x---x---x---x `spacing` (can be scalar or array)
<--------------> `emin=`
<------> `bound_type='min_past_furthest', bound=`
<--------------------------------> `emax=`
x `pmin=`
x `pmax=`
- `emin=`, equivalent to `bound_type='min_extension', bound=`
The total extension value for the furthest-out port (B in the diagram).
- `emax=`, equivalent to `bound_type='max_extension', bound=`:
The total extension value for the closest-in port (C in the diagram).
- `pmin=`, equivalent to `xmin=`, `ymin=`, or `bound_type='min_position', bound=`:
The coordinate of the innermost bend (D's bend).
The x/y versions throw an error if they do not match the port axis (for debug)
- `pmax=`, `xmax=`, `ymax=`, or `bound_type='max_position', bound=`:
The coordinate of the outermost bend (A's bend).
The x/y versions throw an error if they do not match the port axis (for debug)
- `bound_type='min_past_furthest', bound=`:
The distance between furthest out-port (B) and the innermost bend (D's bend).
If `ccw=None`, final output positions (along the input axis) of all wires will be
identical (i.e. wires will all be cut off evenly), and `spacing=None` is required.
In that case, `emin=` and `emax=` are equivalent to each other, and
`pmin=`, `pmax=`, `xmin=`, etc. are also equivalent to each other.
Args:
portspec: The names of the ports which are to be routed.
ccw: If `None`, the outputs should be along the same axis as the inputs.
Otherwise, cast to bool and turn 90 degrees counterclockwise if `True`
and clockwise otherwise.
spacing: Center-to-center distance between output ports along the input port's axis.
Must be provided if (and only if) `ccw` is not `None`.
set_rotation: If the provided ports have `rotation=None`, this can be used
to set a rotation for them.
tool_port_names: The names of the ports on the generated pattern. It is unlikely
that you will need to change these. The first port is the input (to be
connected to `portspec`).
force_container: If `False` (default), and only a single port is provided, the
generated wire for that port will be referenced directly, rather than being
wrapped in an additional `Pattern`.
base_name: Name to use for the generated `Pattern`. This will be passed through
`self.library.get_name()` to get a unique name for each new `Pattern`.
Returns:
self
Raises:
BuildError if the implied length for any wire is too close to fit the bend
(if a bend is requested).
BuildError if `xmin`/`xmax` or `ymin`/`ymax` is specified but does not
match the axis of `portspec`.
BuildError if an incorrect bound type or spacing is specified.
"""
if self._dead:
logger.error('Skipping mpath() since device is dead')
return self
bound_types = set()
if 'bound_type' in kwargs:
bound_types.add(kwargs['bound_type'])
bound = kwargs['bound']
del kwargs['bound_type']
del kwargs['bound']
for bt in ('emin', 'emax', 'pmin', 'pmax', 'xmin', 'xmax', 'ymin', 'ymax', 'min_past_furthest'):
if bt in kwargs:
bound_types.add(bt)
bound = kwargs[bt]
del kwargs[bt]
if not bound_types:
raise BuildError('No bound type specified for mpath')
if len(bound_types) > 1:
raise BuildError(f'Too many bound types specified for mpath: {bound_types}')
bound_type = tuple(bound_types)[0]
if isinstance(portspec, str):
portspec = [portspec]
ports = self.pattern[tuple(portspec)]
extensions = ell(ports, ccw, spacing=spacing, bound=bound, bound_type=bound_type, set_rotation=set_rotation)
if len(ports) == 1 and not force_container:
# Not a bus, so having a container just adds noise to the layout
port_name = tuple(portspec)[0]
return self.path(port_name, ccw, extensions[port_name], tool_port_names=tool_port_names, **kwargs)
bld = Pather.interface(source=ports, library=self.library, tools=self.tools)
for port_name, length in extensions.items():
bld.path(port_name, ccw, length, tool_port_names=tool_port_names, **kwargs)
name = self.library.get_name(base_name)
self.library[name] = bld.pattern
return self.plug(Abstract(name, bld.pattern.ports), {sp: 'in_' + sp for sp in ports}) # TODO safe to use 'in_'?
# TODO def bus_join()?
def flatten(self) -> Self:
"""
Flatten the contained pattern, using the contained library to resolve references.
Returns:
self
"""
self.pattern.flatten(self.library)
return self
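A condensed sketch of the `Pather` workflow defined above; `lib`, `MyWireTool`, `MyViaTool`, and the port geometry are hypothetical stand-ins (any `Tool` subclass with a working `path()` would do).
```
pather = Pather(lib, ports={'in': Port((0, 0), rotation=0)}, tools=MyWireTool(), name='routed')
pather.path('in', ccw=None, length=50)     # straight segment: 'in' advances 50 units along its axis
pather.path('in', ccw=True, length=80)     # segment ending in a 90-degree counterclockwise bend
pather.retool(MyViaTool(), keys=['in'])    # switch tools before the next segment (e.g. change layers)
pather.path('in', ccw=None, length=20)
```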


@@ -1,112 +0,0 @@
"""
Functions for writing port data into a Pattern (`dev2pat`) and retrieving it (`pat2dev`).
These use the format 'name:ptype angle_deg' written into labels, which are placed at
the port locations. This particular approach is just a sensible default; feel free to
write equivalent functions for your own format or alternate storage methods.
"""
from typing import Sequence
import logging
import numpy
from ..pattern import Pattern
from ..label import Label
from ..utils import rotation_matrix_2d, layer_t
from .devices import Device, Port
logger = logging.getLogger(__name__)
def dev2pat(device: Device, layer: layer_t) -> Pattern:
"""
Place a text label at each port location, specifying the port data in the format
'name:ptype angle_deg'
This can be used to debug port locations or to automatically generate ports
when reading in a GDS file.
NOTE that `device` is modified by this function, and `device.pattern` is returned.
Args:
device: The device which is to have its ports labeled. MODIFIED in-place.
layer: The layer on which the labels will be placed.
Returns:
`device.pattern`
"""
for name, port in device.ports.items():
if port.rotation is None:
angle_deg = numpy.inf
else:
angle_deg = numpy.rad2deg(port.rotation)
device.pattern.labels += [
Label(string=f'{name}:{port.ptype} {angle_deg:g}', layer=layer, offset=port.offset)
]
return device.pattern
def pat2dev(
pattern: Pattern,
layers: Sequence[layer_t],
max_depth: int = 999_999,
skip_subcells: bool = True,
) -> Device:
"""
Examine `pattern` for labels specifying port info, and use that info
to build a `Device` object.
Labels are assumed to be placed at the port locations, and have the format
'name:ptype angle_deg'
Args:
pattern: Pattern object to scan for labels.
layers: Search for labels on all the given layers.
max_depth: Maximum hierarchy depth to search. Default 999_999.
Reduce this to 0 to avoid ever searching subcells.
skip_subcells: If port labels are found at a given hierarchy level,
do not continue searching at deeper levels. This allows subcells
to contain their own port info (and thus become their own Devices).
Default True.
Returns:
The constructed Device object. Port labels are not removed from the pattern.
"""
ports = {} # Note: could do a list here, if they're not unique
annotated_cells = set()
def find_ports_each(pat, hierarchy, transform, memo) -> Pattern:
if len(hierarchy) > max_depth - 1:
return pat
if skip_subcells and any(parent in annotated_cells for parent in hierarchy):
return pat
labels = [ll for ll in pat.labels if ll.layer in layers]
if len(labels) == 0:
return pat
if skip_subcells:
annotated_cells.add(pat)
mirr_factor = numpy.array((1, -1)) ** transform[3]
rot_matrix = rotation_matrix_2d(transform[2])
for label in labels:
name, property_string = label.string.split(':')
properties = property_string.split(' ')
ptype = properties[0]
angle_deg = float(properties[1]) if len(ptype) else 0
xy_global = transform[:2] + rot_matrix @ (label.offset * mirr_factor)
angle = numpy.deg2rad(angle_deg) * mirr_factor[0] * mirr_factor[1] + transform[2]
if name in ports:
logger.info(f'Duplicate port {name} in pattern {pattern.name}')
ports[name] = Port(offset=xy_global, rotation=angle, ptype=ptype)
return pat
pattern.dfs(visit_before=find_ports_each, transform=True)
return Device(pattern, ports)
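A minimal round-trip sketch for the two helpers above; the layer tuple `(10, 0)` and the `my_dev` device are hypothetical.
```
dev2pat(my_dev, layer=(10, 0))                    # adds 'name:ptype angle_deg' labels at each port
# ... write my_dev.pattern to GDS, read it back later as `pattern` ...
recovered = pat2dev(pattern, layers=[(10, 0)])
assert set(recovered.ports) == set(my_dev.ports)  # same port names recovered from the labels
```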


@@ -0,0 +1,703 @@
"""
Pather with batched (multi-step) rendering
"""
from typing import Self
from collections.abc import Sequence, Mapping, MutableMapping
import copy
import logging
from collections import defaultdict
from pprint import pformat
import numpy
from numpy import pi
from numpy.typing import ArrayLike
from ..pattern import Pattern
from ..library import ILibrary
from ..error import PortError, BuildError
from ..ports import PortList, Port
from ..abstract import Abstract
from ..utils import SupportsBool
from .tools import Tool, RenderStep
from .utils import ell
logger = logging.getLogger(__name__)
class RenderPather(PortList):
"""
`RenderPather` is an alternative to `Pather` which uses the `path`/`path_to`/`mpath`
functions to plan out wire paths without incrementally generating the layout. Instead,
it waits until `render` is called, at which point it draws all the planned segments
simultaneously. This allows it to e.g. draw each wire using a single `Path` or
`Polygon` shape instead of multiple rectangles.
`RenderPather` calls out to `Tool.planL` and `Tool.render` to provide tool-specific
dimensions and build the final geometry for each wire. `Tool.planL` provides the
output port data (relative to the input) for each segment. The tool, input and output
ports are placed into a `RenderStep`, and a sequence of `RenderStep`s is stored for
each port. When `render` is called, it bundles `RenderStep`s into batches which use
the same `Tool`, and passes each batch to the relevant tool's `Tool.render` to build
the geometry.
See `Pather` for routing examples. After routing is complete, `render` must be called
to generate the final geometry.
"""
__slots__ = ('pattern', 'library', 'paths', 'tools', '_dead', )
pattern: Pattern
""" Layout of this device """
library: ILibrary
""" Library from which patterns should be referenced """
_dead: bool
""" If True, plug()/place() are skipped (for debugging) """
paths: defaultdict[str, list[RenderStep]]
""" Per-port list of operations, to be used by `render` """
tools: dict[str | None, Tool]
"""
Tool objects are used to dynamically generate new single-use Devices
(e.g. wires or waveguides) to be plugged into this device.
"""
@property
def ports(self) -> dict[str, Port]:
return self.pattern.ports
@ports.setter
def ports(self, value: dict[str, Port]) -> None:
self.pattern.ports = value
def __init__(
self,
library: ILibrary,
*,
pattern: Pattern | None = None,
ports: str | Mapping[str, Port] | None = None,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
name: str | None = None,
) -> None:
"""
Args:
library: The library from which referenced patterns will be taken,
and where new patterns (e.g. generated by the `tools`) will be placed.
pattern: The pattern which will be modified by subsequent operations.
If `None` (default), a new pattern is created.
ports: Allows specifying the initial set of ports, if `pattern` does
not already have any ports (or is not provided). May be a string,
in which case it is interpreted as a name in `library`.
Default `None` (no ports).
tools: A mapping of {port: tool} which specifies what `Tool` should be used
to generate waveguide or wire segments when `path`/`path_to`/`mpath`
are called. Relies on `Tool.planL` and `Tool.render` implementations.
name: If specified, `library[name]` is set to `self.pattern`.
"""
self._dead = False
self.paths = defaultdict(list)
self.library = library
if pattern is not None:
self.pattern = pattern
else:
self.pattern = Pattern()
if ports is not None:
if self.pattern.ports:
raise BuildError('Ports supplied for pattern with pre-existing ports!')
if isinstance(ports, str):
if library is None:
raise BuildError('Ports given as a string, but `library` was `None`!')
ports = library.abstract(ports).ports
self.pattern.ports.update(copy.deepcopy(dict(ports)))
if name is not None:
if library is None:
raise BuildError('Name was supplied, but no library was given!')
library[name] = self.pattern
if tools is None:
self.tools = {}
elif isinstance(tools, Tool):
self.tools = {None: tools}
else:
self.tools = dict(tools)
@classmethod
def interface(
cls: type['RenderPather'],
source: PortList | Mapping[str, Port] | str,
*,
library: ILibrary | None = None,
tools: Tool | MutableMapping[str | None, Tool] | None = None,
in_prefix: str = 'in_',
out_prefix: str = '',
port_map: dict[str, str] | Sequence[str] | None = None,
name: str | None = None,
) -> 'RenderPather':
"""
Wrapper for `Pattern.interface()`, which returns a RenderPather instead.
Args:
source: A collection of ports (e.g. Pattern, Builder, or dict)
from which to create the interface. May be a pattern name if
`library` is provided.
library: Library from which existing patterns should be referenced,
and to which the new one should be added (if named). If not provided,
`source.library` must exist and will be used.
tools: `Tool`s which will be used by the pather for generating new wires
or waveguides (via `path`/`path_to`/`mpath`).
in_prefix: Prepended to port names for newly-created ports with
reversed directions compared to the current device.
out_prefix: Prepended to port names for ports which are directly
copied from the current device.
port_map: Specification for ports to copy into the new device:
- If `None`, all ports are copied.
- If a sequence, only the listed ports are copied
- If a mapping, the listed ports (keys) are copied and
renamed (to the values).
Returns:
The new `RenderPather`, with an empty pattern and 2x as many ports as
listed in port_map.
Raises:
`PortError` if `port_map` contains port names not present in the
current device.
`PortError` if applying the prefixes results in duplicate port
names.
"""
if library is None:
if hasattr(source, 'library') and isinstance(source.library, ILibrary):
library = source.library
else:
raise BuildError('No library provided (and not present in `source.library`)')
if tools is None and hasattr(source, 'tools') and isinstance(source.tools, dict):
tools = source.tools
if isinstance(source, str):
source = library.abstract(source).ports
pat = Pattern.interface(source, in_prefix=in_prefix, out_prefix=out_prefix, port_map=port_map)
new = RenderPather(library=library, pattern=pat, name=name, tools=tools)
return new
def plug(
self,
other: Abstract | str,
map_in: dict[str, str],
map_out: dict[str, str | None] | None = None,
*,
mirrored: bool = False,
inherit_name: bool = True,
set_rotation: bool | None = None,
append: bool = False,
) -> Self:
"""
Wrapper for `Pattern.plug` which adds a `RenderStep` with opcode 'P'
for any affected ports. This separates any future `RenderStep`s on the
same port into a new batch, since the plugged device interferes with drawing.
Args:
other: An `Abstract`, string, or `Pattern` describing the device to be instantiated.
map_in: dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in `other`.
mirrored: Enables mirroring `other` across the x axis prior to
connecting any ports.
inherit_name: If `True`, and `map_in` specifies only a single port,
and `map_out` is `None`, and `other` has only two ports total,
then automatically renames the output port of `other` to the
name of the port from `self` that appears in `map_in`. This
makes it easy to extend a device with simple 2-port devices
(e.g. wires) without providing `map_out` each time `plug` is
called. See "Examples" above for more info. Default `True`.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
Returns:
self
Raises:
`PortError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other_names`.
`PortError` if there are any duplicate names after `map_in` and `map_out`
are applied.
`PortError` if the specified port mapping is not achievable (the ports
do not line up).
"""
if self._dead:
logger.error('Skipping plug() since device is dead')
return self
other_tgt: Pattern | Abstract
if isinstance(other, str):
    other_tgt = self.library.abstract(other)
else:
    other_tgt = other
if append and isinstance(other_tgt, Abstract):
    other_tgt = self.library[other_tgt.name]
# get rid of plugged ports
for kk in map_in:
if kk in self.paths:
self.paths[kk].append(RenderStep('P', None, self.ports[kk].copy(), self.ports[kk].copy(), None))
plugged = map_in.values()
for name, port in other_tgt.ports.items():
if name in plugged:
continue
new_name = map_out.get(name, name) if map_out is not None else name
if new_name is not None and new_name in self.paths:
self.paths[new_name].append(RenderStep('P', None, port.copy(), port.copy(), None))
self.pattern.plug(
other=other_tgt,
map_in=map_in,
map_out=map_out,
mirrored=mirrored,
inherit_name=inherit_name,
set_rotation=set_rotation,
append=append,
)
return self
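# Usage sketch (illustrative; `rp` is a RenderPather whose pattern has a port 'sig', and
# 'bend_r5' is a two-port pattern in its library with ports 'A' and 'B'):
#
#     rp.plug('bend_r5', {'sig': 'A'})                  # 'sig' is carried through to the far port
#     rp.plug('bend_r5', {'sig': 'A'}, mirrored=True)   # same bend, mirrored across the x axis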
def place(
self,
other: Abstract | str,
*,
offset: ArrayLike = (0, 0),
rotation: float = 0,
pivot: ArrayLike = (0, 0),
mirrored: bool = False,
port_map: dict[str, str | None] | None = None,
skip_port_check: bool = False,
append: bool = False,
) -> Self:
"""
Wrapper for `Pattern.place` which adds a `RenderStep` with opcode 'P'
for any affected ports. This separates any future `RenderStep`s on the
same port into a new batch, since the placed device interferes with drawing.
Note that mirroring is applied before rotation; translation (`offset`) is applied last.
Args:
other: An `Abstract` or `Pattern` describing the device to be instantiated.
offset: Offset at which to place the instance. Default (0, 0).
rotation: Rotation applied to the instance before placement. Default 0.
pivot: Rotation is applied around this pivot point (default (0, 0)).
Rotation is applied prior to translation (`offset`).
mirrored: Whether the instance should be mirrored across the x axis.
Mirroring is applied before translation and rotation.
port_map: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for ports in the instantiated pattern. New names can be
`None`, which will delete those ports.
skip_port_check: Can be used to skip the internal call to `check_ports`,
in case it has already been performed elsewhere.
append: If `True`, `other` is appended instead of being referenced.
Note that this does not flatten `other`, so its refs will still
be refs (now inside `self`).
Returns:
self
Raises:
`PortError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other.ports`.
`PortError` if there are any duplicate names after `map_in` and `map_out`
are applied.
"""
if self._dead:
logger.error('Skipping place() since device is dead')
return self
other_tgt: Pattern | Abstract
if isinstance(other, str):
    other_tgt = self.library.abstract(other)
else:
    other_tgt = other
if append and isinstance(other_tgt, Abstract):
    other_tgt = self.library[other_tgt.name]
for name, port in other_tgt.ports.items():
new_name = port_map.get(name, name) if port_map is not None else name
if new_name is not None and new_name in self.paths:
self.paths[new_name].append(RenderStep('P', None, port.copy(), port.copy(), None))
self.pattern.place(
other=other_tgt,
offset=offset,
rotation=rotation,
pivot=pivot,
mirrored=mirrored,
port_map=port_map,
skip_port_check=skip_port_check,
append=append,
)
return self
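# Usage sketch (illustrative; the pattern 'pad', its ports 'p1'/'p2', and the placement
# values are assumptions):
#
#     rp.place('pad', offset=(500, 0), rotation=pi / 2,
#              port_map={'p1': 'pad_top', 'p2': None})   # rename 'p1', drop 'p2'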
def retool(
self,
tool: Tool,
keys: str | Sequence[str | None] | None = None,
) -> Self:
"""
Update the `Tool` which will be used when generating `Pattern`s for the ports
given by `keys`.
Args:
tool: The new `Tool` to use for the given ports.
keys: Which ports the tool should apply to. `None` indicates the default tool,
used when there is no matching entry in `self.tools` for the port in question.
Returns:
self
"""
if keys is None or isinstance(keys, str):
self.tools[keys] = tool
else:
for key in keys:
self.tools[key] = tool
return self
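# Usage sketch (illustrative; `wide_tool` and `narrow_tool` are assumed Tool instances, and
# the port names are assumptions):
#
#     rp.retool(wide_tool)                          # replace the default (None-keyed) tool
#     rp.retool(narrow_tool, keys=['sig', 'clk'])   # override the tool for two specific ports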
def path(
self,
portspec: str,
ccw: SupportsBool | None,
length: float,
**kwargs,
) -> Self:
"""
Plan a "wire"/"waveguide" extending from the port `portspec`, with the aim
of traveling exactly `length` distance.
The wire will travel `length` distance along the port's axis, and an unspecified
(tool-dependent) distance in the perpendicular direction. The output port will
be rotated (or not) based on the `ccw` parameter.
`RenderPather.render` must be called after all paths have been fully planned.
Args:
portspec: The name of the port into which the wire will be plugged.
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
length: The total distance from input to output, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
Returns:
self
Raises:
BuildError if `length` is too small to fit the bend (if a bend is present).
LibraryError if no valid name could be picked for the pattern.
"""
if self._dead:
logger.error('Skipping path() since device is dead')
return self
port = self.pattern[portspec]
in_ptype = port.ptype
port_rot = port.rotation
assert port_rot is not None # TODO allow manually setting rotation for RenderPather.path()?
tool = self.tools.get(portspec, self.tools[None])
# ask the tool for bend size (fill missing dx or dy), check feasibility, and get out_ptype
out_port, data = tool.planL(ccw, length, in_ptype=in_ptype, **kwargs)
# Update port
out_port.rotate_around((0, 0), pi + port_rot)
out_port.translate(port.offset)
step = RenderStep('L', tool, port.copy(), out_port.copy(), data)
self.paths[portspec].append(step)
self.pattern.ports[portspec] = out_port.copy()
return self
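# Usage sketch (illustrative; the port name and lengths are assumptions):
#
#     rp.path('sig', ccw=None, length=40)    # straight: output port 40 units along the axis
#     rp.path('sig', ccw=False, length=25)   # 25 units along the axis, ending in a clockwise bend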
def path_to(
self,
portspec: str,
ccw: SupportsBool | None,
position: float | None = None,
*,
x: float | None = None,
y: float | None = None,
**kwargs,
) -> Self:
"""
Plan a "wire"/"waveguide" extending from the port `portspec`, with the aim
of ending exactly at a target position.
The wire will travel so that the output port will be placed at exactly the target
position along the input port's axis. There can be an unspecified (tool-dependent)
offset in the perpendicular direction. The output port will be rotated (or not)
based on the `ccw` parameter.
`RenderPather.render` must be called after all paths have been fully planned.
Args:
portspec: The name of the port into which the wire will be plugged.
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
position: The final port position, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
Only one of `position`, `x`, and `y` may be specified.
x: The final port position along the x axis.
`portspec` must refer to a horizontal port if `x` is passed, otherwise a
BuildError will be raised.
y: The final port position along the y axis.
`portspec` must refer to a vertical port if `y` is passed, otherwise a
BuildError will be raised.
Returns:
self
Raises:
BuildError if `position`, `x`, or `y` is too close to fit the bend (if a bend
is present).
BuildError if `x` or `y` is specified but does not match the axis of `portspec`.
BuildError if more than one of `x`, `y`, and `position` is specified.
"""
if self._dead:
logger.error('Skipping path_to() since device is dead')
return self
pos_count = sum(vv is not None for vv in (position, x, y))
if pos_count > 1:
raise BuildError('Only one of `position`, `x`, and `y` may be specified at once')
if pos_count < 1:
raise BuildError('One of `position`, `x`, and `y` must be specified')
port = self.pattern[portspec]
if port.rotation is None:
raise PortError(f'Port {portspec} has no rotation and cannot be used for path_to()')
if not numpy.isclose(port.rotation % (pi / 2), 0):
raise BuildError('path_to was asked to route from non-manhattan port')
is_horizontal = numpy.isclose(port.rotation % pi, 0)
if is_horizontal:
if y is not None:
raise BuildError('Asked to path to y-coordinate, but port is horizontal')
if position is None:
position = x
else:
if x is not None:
raise BuildError('Asked to path to x-coordinate, but port is vertical')
if position is None:
position = y
x0, y0 = port.offset
if is_horizontal:
if numpy.sign(numpy.cos(port.rotation)) == numpy.sign(position - x0):
raise BuildError(f'path_to routing to behind source port: x0={x0:g} to {position:g}')
length = numpy.abs(position - x0)
else:
if numpy.sign(numpy.sin(port.rotation)) == numpy.sign(position - y0):
raise BuildError(f'path_to routing to behind source port: y0={y0:g} to {position:g}')
length = numpy.abs(position - y0)
return self.path(portspec, ccw, length, **kwargs)
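# Usage sketch (illustrative; assumes the pattern has a horizontal port 'sig' whose wires
# travel in the +x direction):
#
#     rp.path_to('sig', ccw=None, x=200)          # straight run, ending with the port at x = 200
#     rp.path_to('sig', ccw=True, position=260)   # continue to x = 260, ending in a CCW bend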
def mpath(
self,
portspec: str | Sequence[str],
ccw: SupportsBool | None,
*,
spacing: float | ArrayLike | None = None,
set_rotation: float | None = None,
**kwargs,
) -> Self:
"""
`mpath` is a superset of `path` and `path_to` which can act on bundles or buses
of "wires or "waveguides".
See `Pather.mpath` for details.
Args:
portspec: The names of the ports which are to be routed.
ccw: If `None`, the outputs should be along the same axis as the inputs.
Otherwise, cast to bool and turn 90 degrees counterclockwise if `True`
and clockwise otherwise.
spacing: Center-to-center distance between output ports along the input port's axis.
Must be provided if (and only if) `ccw` is not `None`.
set_rotation: If the provided ports have `rotation=None`, this can be used
to set a rotation for them.
Returns:
self
Raises:
BuildError if the implied length for any wire is too close to fit the bend
(if a bend is requested).
BuildError if `xmin`/`xmax` or `ymin`/`ymax` is specified but does not
match the axis of `portspec`.
BuildError if an incorrect bound type or spacing is specified.
"""
if self._dead:
logger.error('Skipping mpath() since device is dead')
return self
bound_types = set()
if 'bound_type' in kwargs:
bound_types.add(kwargs['bound_type'])
bound = kwargs['bound']
for bt in ('emin', 'emax', 'pmin', 'pmax', 'xmin', 'xmax', 'ymin', 'ymax', 'min_past_furthest'):
if bt in kwargs:
bound_types.add(bt)
bound = kwargs[bt]
if not bound_types:
raise BuildError('No bound type specified for mpath')
if len(bound_types) > 1:
raise BuildError(f'Too many bound types specified for mpath: {bound_types}')
bound_type = tuple(bound_types)[0]
if isinstance(portspec, str):
portspec = [portspec]
ports = self.pattern[tuple(portspec)]
extensions = ell(ports, ccw, spacing=spacing, bound=bound, bound_type=bound_type, set_rotation=set_rotation)
if len(ports) == 1:
# Not a bus, so having a container just adds noise to the layout
port_name = tuple(portspec)[0]
self.path(port_name, ccw, extensions[port_name])
else:
for port_name, length in extensions.items():
self.path(port_name, ccw, length)
return self
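# Usage sketch (illustrative; the bus port names, spacing, and bound are assumptions):
#
#     rp.mpath(['d0', 'd1', 'd2'], ccw=True, spacing=8, xmax=400)
#     # turn the three-wire bus counterclockwise, with 8 units between adjacent channels
#     # and the outermost bend placed at x = 400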
def render(
self,
append: bool = True,
) -> Self:
"""
Generate the geometry which has been planned out with `path`/`path_to`/etc.
Args:
append: If `True`, the rendered geometry will be directly appended to
`self.pattern`. Note that it will not be flattened, so only one
layer of hierarchy is eliminated.
Returns:
self
"""
lib = self.library
tool_port_names = ('A', 'B')
pat = Pattern()
def render_batch(portspec: str, batch: list[RenderStep], append: bool) -> None:
assert batch[0].tool is not None
name = lib << batch[0].tool.render(batch, port_names=tool_port_names)
pat.ports[portspec] = batch[0].start_port.copy()
if append:
pat.plug(lib[name], {portspec: tool_port_names[0]}, append=append)
del lib[name] # NOTE if the rendered pattern has refs, those are now in `pat` but not flattened
else:
pat.plug(lib.abstract(name), {portspec: tool_port_names[0]}, append=append)
for portspec, steps in self.paths.items():
batch: list[RenderStep] = []
for step in steps:
appendable_op = step.opcode in ('L', 'S', 'U')
same_tool = batch and step.tool == batch[0].tool
# If we can't continue a batch, render it
if batch and (not appendable_op or not same_tool):
render_batch(portspec, batch, append)
batch = []
# batch is emptied already if we couldn't continue it
if appendable_op:
batch.append(step)
# Opcodes which break the batch go below this line
if not appendable_op and portspec in pat.ports:
del pat.ports[portspec]
# Render the final batch, if it hasn't been rendered yet
if batch:
render_batch(portspec, batch, append)
self.paths.clear()
pat.ports.clear()
self.pattern.append(pat)
return self
def translate(self, offset: ArrayLike) -> Self:
"""
Translate the pattern and all ports.
Args:
offset: (x, y) distance to translate by
Returns:
self
"""
self.pattern.translate_elements(offset)
return self
def rotate_around(self, pivot: ArrayLike, angle: float) -> Self:
"""
Rotate the pattern and all ports.
Args:
angle: angle (radians, counterclockwise) to rotate by
pivot: location to rotate around
Returns:
self
"""
self.pattern.rotate_around(pivot, angle)
return self
def mirror(self, axis: int) -> Self:
"""
Mirror the pattern and all ports across the specified axis.
Args:
axis: Axis to mirror across (x=0, y=1)
Returns:
self
"""
self.pattern.mirror(axis)
return self
def set_dead(self) -> Self:
"""
Disallows further changes through `plug()` or `place()`.
This is meant for debugging:
```
dev.plug(a, ...)
dev.set_dead() # added for debug purposes
dev.plug(b, ...) # usually raises an error, but now skipped
dev.plug(c, ...) # also skipped
dev.pattern.visualize() # shows the device as of the set_dead() call
```
Returns:
self
"""
self._dead = True
return self
def __repr__(self) -> str:
s = f'<RenderPather {self.pattern} L({len(self.library)}) {pformat(self.tools)}>'
return s

masque/builder/tools.py (new file, 553 lines)

@ -0,0 +1,553 @@
"""
Tools are objects which dynamically generate simple single-use devices (e.g. wires or waveguides)
# TODO document all tools
"""
from typing import Literal, Any
from collections.abc import Sequence, Callable
from abc import ABCMeta # , abstractmethod # TODO any way to make Tool ok with implementing only one method?
from dataclasses import dataclass
import numpy
from numpy.typing import NDArray
from numpy import pi
from ..utils import SupportsBool, rotation_matrix_2d, layer_t
from ..ports import Port
from ..pattern import Pattern
from ..abstract import Abstract
from ..library import ILibrary, Library, SINGLE_USE_PREFIX
from ..error import BuildError
@dataclass(frozen=True, slots=True)
class RenderStep:
"""
Representation of a single saved operation, used by `RenderPather` and passed
to `Tool.render()` when `RenderPather.render()` is called.
"""
opcode: Literal['L', 'S', 'U', 'P']
""" What operation is being performed.
L: planL (straight, optionally with a single bend)
S: planS (s-bend)
U: planU (u-bend)
P: plug
"""
tool: 'Tool | None'
""" The current tool. May be `None` if `opcode='P'` """
start_port: Port
end_port: Port
data: Any
""" Arbitrary tool-specific data"""
def __post_init__(self) -> None:
if self.opcode != 'P' and self.tool is None:
raise BuildError('Got tool=None but the opcode is not "P"')
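# Construction sketch (illustrative; the port geometry is arbitrary). Only plug steps ('P')
# may omit the tool; any other opcode with `tool=None` raises BuildError:
#
#     step = RenderStep(
#         opcode='P',
#         tool=None,
#         start_port=Port((0, 0), rotation=0, ptype='unk'),
#         end_port=Port((0, 0), rotation=0, ptype='unk'),
#         data=None,
#         )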
class Tool:
"""
Interface for path (e.g. wire or waveguide) generation.
Note that subclasses may implement only a subset of the methods and leave others
unimplemented (e.g. in cases where they don't make sense or the required components
are impractical or unavailable).
"""
def path(
self,
ccw: SupportsBool | None,
length: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
port_names: tuple[str, str] = ('A', 'B'),
**kwargs,
) -> Library:
"""
Create a wire or waveguide that travels exactly `length` distance along the axis
of its input port.
Used by `Pather`.
The output port must be exactly `length` away along the input port's axis, but
may be placed an additional (unspecified) distance away along the perpendicular
direction. The output port should be rotated (or not) based on the value of
`ccw`.
The input and output ports should be compatible with `in_ptype` and
`out_ptype`, respectively. They should also be named `port_names[0]` and
`port_names[1]`, respectively.
Args:
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
length: The total distance from input to output, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
in_ptype: The `ptype` of the port into which this wire's input will be `plug`ged.
out_ptype: The `ptype` of the port into which this wire's output will be `plug`ged.
port_names: The output pattern will have its input port named `port_names[0]` and
its output named `port_names[1]`.
kwargs: Custom tool-specific parameters.
Returns:
A pattern tree containing the requested L-shaped (or straight) wire or waveguide
Raises:
BuildError if an impossible or unsupported geometry is requested.
"""
raise NotImplementedError(f'path() not implemented for {type(self)}')
def planL(
self,
ccw: SupportsBool | None,
length: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
**kwargs,
) -> tuple[Port, Any]:
"""
Plan a wire or waveguide that travels exactly `length` distance along the axis
of its input port.
Used by `RenderPather`.
The output port must be exactly `length` away along the input port's axis, but
may be placed an additional (unspecified) distance away along the perpendicular
direction. The output port should be rotated (or not) based on the value of
`ccw`.
The input and output ports should be compatible with `in_ptype` and
`out_ptype`, respectively.
Args:
ccw: If `None`, the output should be along the same axis as the input.
Otherwise, cast to bool and turn counterclockwise if True
and clockwise otherwise.
length: The total distance from input to output, along the input's axis only.
(There may be a tool-dependent offset along the other axis.)
in_ptype: The `ptype` of the port into which this wire's input will be `plug`ged.
out_ptype: The `ptype` of the port into which this wire's output will be `plug`ged.
kwargs: Custom tool-specific parameters.
Returns:
The calculated output `Port` for the wire.
Any tool-specific data, to be stored in `RenderStep.data`, for use during rendering.
Raises:
BuildError if an impossible or unsupported geometry is requested.
"""
raise NotImplementedError(f'planL() not implemented for {type(self)}')
def planS(
self,
length: float,
jog: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
**kwargs,
) -> tuple[Port, Any]:
"""
Plan a wire or waveguide that travels exactly `length` distance along the axis
of its input port and `jog` distance along the perpendicular axis (i.e. an S-bend).
Used by `RenderPather`.
The output port must have an orientation rotated by pi from the input port.
The input and output ports should be compatible with `in_ptype` and
`out_ptype`, respectively.
Args:
length: The total distance from input to output, along the input's axis only.
jog: The total offset from the input to output, along the perpendicular axis.
A positive number implies a rightwards shift (i.e. clockwise bend followed
by a counterclockwise bend)
in_ptype: The `ptype` of the port into which this wire's input will be `plug`ged.
out_ptype: The `ptype` of the port into which this wire's output will be `plug`ged.
kwargs: Custom tool-specific parameters.
Returns:
The calculated output `Port` for the wire.
Any tool-specific data, to be stored in `RenderStep.data`, for use during rendering.
Raises:
BuildError if an impossible or unsupported geometry is requested.
"""
raise NotImplementedError(f'planS() not implemented for {type(self)}')
def planU(
self,
jog: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
**kwargs,
) -> tuple[Port, Any]:
"""
# NOTE: TODO: U-bend is WIP; this interface may change in the future.
Plan a wire or waveguide that travels exactly `jog` distance along the axis
perpendicular to its input port (i.e. a U-bend).
Used by `RenderPather`.
The output port must have an orientation identical to the input port.
The input and output ports should be compatible with `in_ptype` and
`out_ptype`, respectively.
Args:
jog: The total offset from the input to output, along the perpendicular axis.
A positive number implies a rightwards shift (i.e. clockwise bend followed
by a counterclockwise bend)
in_ptype: The `ptype` of the port into which this wire's input will be `plug`ged.
out_ptype: The `ptype` of the port into which this wire's output will be `plug`ged.
kwargs: Custom tool-specific parameters.
Returns:
The calculated output `Port` for the wire.
Any tool-specific data, to be stored in `RenderStep.data`, for use during rendering.
Raises:
BuildError if an impossible or unsupported geometry is requested.
"""
raise NotImplementedError(f'planU() not implemented for {type(self)}')
def render(
self,
batch: Sequence[RenderStep],
*,
port_names: Sequence[str] = ('A', 'B'), # noqa: ARG002 (unused)
**kwargs, # noqa: ARG002 (unused)
) -> ILibrary:
"""
Render the provided `batch` of `RenderStep`s into geometry, returning a tree
(a Library with a single topcell).
Args:
batch: A sequence of `RenderStep` objects containing the ports and data
provided by this tool's `planL`/`planS`/`planU` functions.
port_names: The topcell's input and output ports should be named
`port_names[0]` and `port_names[1]` respectively.
kwargs: Custom tool-specific parameters.
"""
assert not batch or batch[0].tool == self
raise NotImplementedError(f'render() not implemented for {type(self)}')
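# Sketch of a minimal Tool subclass (illustrative, not part of this module): a straight-only
# wire drawn as a fixed-width Path on a caller-supplied layer. It follows the same conventions
# as PathTool further below (absolute-coordinate render(), output port rotated by pi) and
# raises if a bend is requested.
@dataclass
class _ExampleStraightTool(Tool):
    layer: layer_t
    width: float

    def planL(
            self,
            ccw: SupportsBool | None,
            length: float,
            *,
            in_ptype: str | None = None,
            out_ptype: str | None = None,
            **kwargs,
            ) -> tuple[Port, Any]:
        if ccw is not None:
            raise BuildError('_ExampleStraightTool can only draw straight wires (ccw must be None)')
        return Port((length, 0), rotation=pi, ptype='unk'), length

    def render(
            self,
            batch: Sequence[RenderStep],
            *,
            port_names: Sequence[str] = ('A', 'B'),
            **kwargs,
            ) -> ILibrary:
        start = batch[0].start_port
        rot = start.rotation
        assert rot is not None
        total = sum(step.data for step in batch)      # each step.data is the length from planL()
        end_xy = start.offset + rotation_matrix_2d(rot + pi) @ (total, 0)
        tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'straight')
        pat.path(layer=self.layer, width=self.width, vertices=[start.offset, end_xy])
        pat.ports = {
            port_names[0]: batch[0].start_port.copy().rotate(pi),
            port_names[1]: batch[-1].end_port.copy().rotate(pi),
            }
        return tree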
abstract_tuple_t = tuple[Abstract, str, str]
@dataclass
class BasicTool(Tool, metaclass=ABCMeta):
"""
A simple tool which relies on a single pre-rendered `bend` pattern, a function
for generating straight paths, and a table of pre-rendered `transitions` for converting
from non-native ptypes.
"""
straight: tuple[Callable[[float], Pattern], str, str]
""" `create_straight(length: float), in_port_name, out_port_name` """
bend: abstract_tuple_t # Assumed to be clockwise
""" `clockwise_bend_abstract, in_port_name, out_port_name` """
transitions: dict[str, abstract_tuple_t]
""" `{ptype: (transition_abstract`, ptype_port_name, other_port_name), ...}` """
default_out_ptype: str
""" Default value for out_ptype """
@dataclass(frozen=True, slots=True)
class LData:
""" Data for planL """
straight_length: float
ccw: SupportsBool | None
in_transition: abstract_tuple_t | None
out_transition: abstract_tuple_t | None
def path(
self,
ccw: SupportsBool | None,
length: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
port_names: tuple[str, str] = ('A', 'B'),
**kwargs,
) -> Library:
_out_port, data = self.planL(
ccw,
length,
in_ptype=in_ptype,
out_ptype=out_ptype,
)
gen_straight, sport_in, sport_out = self.straight
tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path')
pat.add_port_pair(names=port_names)
if data.in_transition:
ipat, iport_theirs, _iport_ours = data.in_transition
pat.plug(ipat, {port_names[1]: iport_theirs})
if not numpy.isclose(data.straight_length, 0):
straight = tree <= {SINGLE_USE_PREFIX + 'straight': gen_straight(data.straight_length, **kwargs)}
pat.plug(straight, {port_names[1]: sport_in})
if data.ccw is not None:
bend, bport_in, bport_out = self.bend
pat.plug(bend, {port_names[1]: bport_in}, mirrored=bool(ccw))
if data.out_transition:
opat, oport_theirs, oport_ours = data.out_transition
pat.plug(opat, {port_names[1]: oport_ours})
return tree
def planL(
self,
ccw: SupportsBool | None,
length: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
**kwargs, # noqa: ARG002 (unused)
) -> tuple[Port, LData]:
# TODO check all the math for L-shaped bends
if ccw is not None:
bend, bport_in, bport_out = self.bend
angle_in = bend.ports[bport_in].rotation
angle_out = bend.ports[bport_out].rotation
assert angle_in is not None
assert angle_out is not None
bend_dxy = rotation_matrix_2d(-angle_in) @ (
bend.ports[bport_out].offset
- bend.ports[bport_in].offset
)
bend_angle = angle_out - angle_in
if bool(ccw):
bend_dxy[1] *= -1
bend_angle *= -1
else:
bend_dxy = numpy.zeros(2)
bend_angle = 0
in_transition = self.transitions.get('unk' if in_ptype is None else in_ptype, None)
if in_transition is not None:
ipat, iport_theirs, iport_ours = in_transition
irot = ipat.ports[iport_theirs].rotation
assert irot is not None
itrans_dxy = rotation_matrix_2d(-irot) @ (
ipat.ports[iport_ours].offset
- ipat.ports[iport_theirs].offset
)
else:
itrans_dxy = numpy.zeros(2)
out_transition = self.transitions.get('unk' if out_ptype is None else out_ptype, None)
if out_transition is not None:
opat, oport_theirs, oport_ours = out_transition
orot = opat.ports[oport_ours].rotation
assert orot is not None
otrans_dxy = rotation_matrix_2d(-orot + bend_angle) @ (
opat.ports[oport_theirs].offset
- opat.ports[oport_ours].offset
)
else:
otrans_dxy = numpy.zeros(2)
if out_transition is not None:
out_ptype_actual = opat.ports[oport_theirs].ptype
elif ccw is not None:
out_ptype_actual = bend.ports[bport_out].ptype
else:
out_ptype_actual = self.default_out_ptype
straight_length = length - bend_dxy[0] - itrans_dxy[0] - otrans_dxy[0]
bend_run = bend_dxy[1] + itrans_dxy[1] + otrans_dxy[1]
if straight_length < 0:
raise BuildError(
f'Asked to draw path with total length {length:,g}, shorter than required bends and transitions:\n'
f'bend: {bend_dxy[0]:,g} in_trans: {itrans_dxy[0]:,g} out_trans: {otrans_dxy[0]:,g}'
)
data = self.LData(straight_length, ccw, in_transition, out_transition)
out_port = Port((length, bend_run), rotation=bend_angle, ptype=out_ptype_actual)
return out_port, data
def render(
self,
batch: Sequence[RenderStep],
*,
port_names: Sequence[str] = ('A', 'B'),
append: bool = True,
**kwargs,
) -> ILibrary:
tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path')
pat.add_port_pair(names=(port_names[0], port_names[1]))
gen_straight, sport_in, _sport_out = self.straight
for step in batch:
straight_length, ccw, in_transition, out_transition = step.data
assert step.tool == self
if step.opcode == 'L':
if in_transition:
ipat, iport_theirs, _iport_ours = in_transition
pat.plug(ipat, {port_names[1]: iport_theirs})
if not numpy.isclose(straight_length, 0):
straight_pat = gen_straight(straight_length, **kwargs)
if append:
pat.plug(straight_pat, {port_names[1]: sport_in}, append=True)
else:
straight = tree <= {SINGLE_USE_PREFIX + 'straight': straight_pat}
pat.plug(straight, {port_names[1]: sport_in}, append=True)
if ccw is not None:
bend, bport_in, bport_out = self.bend
pat.plug(bend, {port_names[1]: bport_in}, mirrored=bool(ccw))
if out_transition:
opat, oport_theirs, oport_ours = out_transition
pat.plug(opat, {port_names[1]: oport_ours})
return tree
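# Construction sketch for BasicTool (illustrative): the straight-waveguide generator, the
# pre-drawn bend 'bend_r5', its port names, and the 'optical' ptype are assumptions about
# the user's own library, not requirements of this module.
def _example_basic_tool(lib: ILibrary) -> BasicTool:
    def gen_wg(length: float) -> Pattern:
        # Straight waveguide on an assumed layer, with ports named 'in' and 'out'
        pat = Pattern()
        pat.path(layer=(1, 0), width=0.5, vertices=[(0, 0), (length, 0)])
        pat.ports = {
            'in': Port((0, 0), rotation=0, ptype='optical'),
            'out': Port((length, 0), rotation=pi, ptype='optical'),
            }
        return pat

    return BasicTool(
        straight=(gen_wg, 'in', 'out'),
        bend=(lib.abstract('bend_r5'), 'in', 'out'),   # pre-drawn clockwise 90-degree bend
        transitions={},                                # no ptype conversions available
        default_out_ptype='optical',
        )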
@dataclass
class PathTool(Tool, metaclass=ABCMeta):
"""
A tool which draws `Path` geometry elements.
If `planL` / `render` are used, the `Path` elements can cover >2 vertices;
with `path` only individual rectangles will be drawn.
"""
layer: layer_t
""" Layer to draw on """
width: float
""" `Path` width """
ptype: str = 'unk'
""" ptype for any ports in patterns generated by this tool """
#@dataclass(frozen=True, slots=True)
#class LData:
# dxy: NDArray[numpy.float64]
#def __init__(self, layer: layer_t, width: float, ptype: str = 'unk') -> None:
# Tool.__init__(self)
# self.layer = layer
# self.width = width
# self.ptype: str
def path(
self,
ccw: SupportsBool | None,
length: float,
*,
in_ptype: str | None = None,
out_ptype: str | None = None,
port_names: tuple[str, str] = ('A', 'B'),
**kwargs, # noqa: ARG002 (unused)
) -> Library:
out_port, dxy = self.planL(
ccw,
length,
in_ptype=in_ptype,
out_ptype=out_ptype,
)
tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path')
pat.path(layer=self.layer, width=self.width, vertices=[(0, 0), (length, 0)])
if ccw is None:
out_rot = pi
elif bool(ccw):
out_rot = -pi / 2
else:
out_rot = pi / 2
pat.ports = {
port_names[0]: Port((0, 0), rotation=0, ptype=self.ptype),
port_names[1]: Port(dxy, rotation=out_rot, ptype=self.ptype),
}
return tree
def planL(
self,
ccw: SupportsBool | None,
length: float,
*,
in_ptype: str | None = None, # noqa: ARG002 (unused)
out_ptype: str | None = None,
**kwargs, # noqa: ARG002 (unused)
) -> tuple[Port, NDArray[numpy.float64]]:
# TODO check all the math for L-shaped bends
if out_ptype and out_ptype != self.ptype:
raise BuildError(f'Requested {out_ptype=} does not match path ptype {self.ptype}')
if ccw is not None:
bend_dxy = numpy.array([1, -1]) * self.width / 2
bend_angle = pi / 2
if bool(ccw):
bend_dxy[1] *= -1
bend_angle *= -1
else:
bend_dxy = numpy.zeros(2)
bend_angle = pi
straight_length = length - bend_dxy[0]
bend_run = bend_dxy[1]
if straight_length < 0:
raise BuildError(
f'Asked to draw path with total length {length:,g}, shorter than required bend: {bend_dxy[0]:,g}'
)
data = numpy.array((length, bend_run))
out_port = Port(data, rotation=bend_angle, ptype=self.ptype)
return out_port, data
def render(
self,
batch: Sequence[RenderStep],
*,
port_names: Sequence[str] = ('A', 'B'),
**kwargs, # noqa: ARG002 (unused)
) -> ILibrary:
path_vertices = [batch[0].start_port.offset]
for step in batch:
assert step.tool == self
port_rot = step.start_port.rotation
assert port_rot is not None
if step.opcode == 'L':
length, bend_run = step.data
dxy = rotation_matrix_2d(port_rot + pi) @ (length, 0)
#path_vertices.append(step.start_port.offset)
path_vertices.append(step.start_port.offset + dxy)
else:
raise BuildError(f'Unrecognized opcode "{step.opcode}"')
if (path_vertices[-1] != batch[-1].end_port.offset).any():
# If the path ends in a bend, we need to add the final vertex
path_vertices.append(batch[-1].end_port.offset)
tree, pat = Library.mktree(SINGLE_USE_PREFIX + 'path')
pat.path(layer=self.layer, width=self.width, vertices=path_vertices)
pat.ports = {
port_names[0]: batch[0].start_port.copy().rotate(pi),
port_names[1]: batch[-1].end_port.copy().rotate(pi),
}
return tree
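# End-to-end sketch (illustrative): plan two segments with a PathTool and render them.
# The layer, width, ptype, port name, and the import path for RenderPather are assumptions;
# nothing here is required by the classes above.
def _example_pathtool_route() -> Pattern:
    from .renderpather import RenderPather    # assumed module path for RenderPather
    lib = Library()
    start = Pattern()
    start.ports['W'] = Port((0, 0), rotation=0, ptype='m1')
    tool = PathTool(layer=(2, 0), width=1.0, ptype='m1')
    rp = RenderPather(pattern=start, library=lib, tools=tool, name='route_demo')
    rp.path('W', ccw=None, length=50)     # straight segment
    rp.path('W', ccw=True, length=30)     # segment ending in a counterclockwise bend
    rp.render()                           # draw the planned Path geometry into rp.pattern
    return rp.pattern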


@ -1,26 +1,27 @@
from typing import Dict, Tuple, List, Optional, Union, Any, cast, Sequence, TYPE_CHECKING
from typing import SupportsFloat, cast, TYPE_CHECKING
from collections.abc import Mapping, Sequence
from pprint import pformat
import numpy
from numpy import pi
from numpy.typing import ArrayLike
from numpy.typing import ArrayLike, NDArray
from ..utils import rotation_matrix_2d
from ..utils import rotation_matrix_2d, SupportsBool
from ..error import BuildError
if TYPE_CHECKING:
from .devices import Port
from ..ports import Port
def ell(
ports: Dict[str, 'Port'],
ccw: Optional[bool],
ports: Mapping[str, 'Port'],
ccw: SupportsBool | None,
bound_type: str,
bound: Union[float, ArrayLike],
bound: float | ArrayLike,
*,
spacing: Optional[Union[float, ArrayLike]] = None,
set_rotation: Optional[float] = None,
) -> Dict[str, float]:
spacing: float | ArrayLike | None = None,
set_rotation: float | None = None,
) -> dict[str, float]:
"""
Calculate extension for each port in order to build a 90-degree bend with the provided
channel spacing:
@ -53,9 +54,9 @@ def ell(
The distance between furthest out-port (B) and the innermost bend (D's bend).
- 'max_extension' or 'emax':
The total extension value for the closest-in port (C in the diagram).
- 'min_position' or 'pmin':
- 'min_position', 'pmin', 'xmin', 'ymin':
The coordinate of the innermost bend (D's bend).
- 'max_position' or 'pmax':
- 'max_position', 'pmax', 'xmax', 'ymax':
The coordinate of the outermost bend (A's bend).
`bound` can also be a vector. If specifying an extension (e.g. 'min_extension',
@ -109,6 +110,12 @@ def ell(
raise BuildError('set_rotation must be specified if no ports have rotations!')
rotations = numpy.full_like(has_rotation, set_rotation, dtype=float)
is_horizontal = numpy.isclose(rotations[0] % pi, 0)
if bound_type in ('ymin', 'ymax') and is_horizontal:
raise BuildError(f'Asked for {bound_type} position but ports are pointing along the x-axis!')
if bound_type in ('xmin', 'xmax') and not is_horizontal:
raise BuildError(f'Asked for {bound_type} position but ports are pointing along the y-axis!')
direction = rotations[0] + pi # direction we want to travel in (+pi relative to port)
rot_matrix = rotation_matrix_2d(-direction)
@ -116,6 +123,8 @@ def ell(
orig_offsets = numpy.array([p.offset for p in ports.values()])
rot_offsets = (rot_matrix @ orig_offsets.T).T
# ordering_base = rot_offsets.T * [[1], [-1 if ccw else 1]] # could work, but this is actually a more complex routing problem
# y_order = numpy.lexsort(ordering_base) # (need to make sure we don't collide with the next input port @ same y)
y_order = ((-1 if ccw else 1) * rot_offsets[:, 1]).argsort(kind='stable')
y_ind = numpy.empty_like(y_order, dtype=int)
y_ind[y_order] = numpy.arange(y_ind.shape[0])
@ -135,6 +144,7 @@ def ell(
# D-----------| `d_to_align[3]`
#
d_to_align = x_start.max() - x_start # distance to travel to align all
offsets: NDArray[numpy.float64]
if bound_type == 'min_past_furthest':
# A------------------V `d_to_exit[0]`
# B-----V `d_to_exit[1]`
@ -154,6 +164,7 @@ def ell(
travel = d_to_align - (ch_offsets.max() - ch_offsets)
offsets = travel - travel.min().clip(max=0)
rot_bound: SupportsFloat
if bound_type in ('emin', 'min_extension',
'emax', 'max_extension',
'min_past_furthest',):
@ -182,15 +193,16 @@ def ell(
rot_bound = -bound if neg else bound
min_possible = x_start + offsets
if bound_type in ('pmax', 'max_position'):
if bound_type in ('pmax', 'max_position', 'xmax', 'ymax'):
extension = rot_bound - min_possible.max()
elif bound_type in ('pmin', 'min_position'):
elif bound_type in ('pmin', 'min_position', 'xmin', 'ymin'):
extension = rot_bound - min_possible.min()
offsets += extension
if extension < 0:
raise BuildError(f'Position is too close by at least {-numpy.floor(extension)}. Total extensions would be'
+ '\n\t'.join(f'{key}: {off}' for key, off in zip(ports.keys(), offsets)))
ext_floor = -numpy.floor(extension)
raise BuildError(f'Position is too close by at least {ext_floor}. Total extensions would be\n\t'
+ '\n\t'.join(f'{key}: {off}' for key, off in zip(ports.keys(), offsets, strict=True)))
result = dict(zip(ports.keys(), offsets))
result = dict(zip(ports.keys(), offsets, strict=True))
return result
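# Call sketch for ell() (illustrative; the port layout, spacing, and bound are assumptions).
# Both ports point along -x, so their wires travel toward +x; with bound_type='xmax' the
# outermost bend is placed at x = 120:
#
#     bus = {'a': Port((0, 10), rotation=pi), 'b': Port((0, 0), rotation=pi)}
#     ell(bus, ccw=True, bound_type='xmax', bound=120, spacing=8)
#     # -> {'a': <extension>, 'b': <extension>}, straight lengths to draw before each bend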


@ -11,13 +11,6 @@ class PatternError(MasqueError):
"""
pass
class PatternLockedError(PatternError):
"""
Exception raised when trying to modify a locked pattern
"""
def __init__(self):
PatternError.__init__(self, 'Tried to modify a locked Pattern, subpattern, or shape')
class LibraryError(MasqueError):
"""
@ -26,22 +19,21 @@ class LibraryError(MasqueError):
pass
class DeviceLibraryError(MasqueError):
"""
Exception raised by DeviceLibrary classes
"""
pass
class DeviceError(MasqueError):
"""
Exception raised by Device and Port objects
"""
pass
class BuildError(MasqueError):
"""
Exception raised by builder-related functions
"""
pass
class PortError(MasqueError):
"""
Exception raised by port-related functions
"""
pass
class OneShotError(MasqueError):
"""
Exception raised when a function decorated with `@oneshot` is called more than once
"""
def __init__(self, func_name: str) -> None:
Exception.__init__(self, f'Function "{func_name}" with @oneshot was called more than once')


@ -1,45 +1,51 @@
"""
DXF file format readers and writers
Notes:
* Gzip modification time is set to 0 (start of current epoch, usually 1970-01-01)
* ezdxf sets creation time, write time, $VERSIONGUID, and $FINGERPRINTGUID
to unique values, so byte-for-byte reproducibility is not achievable for now
"""
from typing import List, Any, Dict, Tuple, Callable, Union, Sequence, Iterable
import re
from typing import Any, cast, TextIO, IO
from collections.abc import Mapping, Callable
import io
import base64
import struct
import logging
import pathlib
import gzip
import numpy # type: ignore
import ezdxf # type: ignore
import numpy
import ezdxf
from ezdxf.enums import TextEntityAlignment
from ezdxf.entities import LWPolyline, Polyline, Text, Insert
from .. import Pattern, SubPattern, PatternError, Label, Shape
from ..shapes import Polygon, Path
from .utils import is_gzipped, tmpfile
from .. import Pattern, Ref, PatternError, Label
from ..library import ILibraryView, LibraryView, Library
from ..shapes import Shape, Polygon, Path
from ..repetition import Grid
from ..utils import rotation_matrix_2d, layer_t
from ..utils import rotation_matrix_2d, layer_t, normalize_mirror
logger = logging.getLogger(__name__)
logger.warning('DXF support is experimental and only slightly tested!')
logger.warning('DXF support is experimental!')
DEFAULT_LAYER = 'DEFAULT'
def write(
pattern: Pattern,
stream: io.TextIOBase,
library: Mapping[str, Pattern], # TODO could allow library=None for flat DXF
top_name: str,
stream: TextIO,
*,
modify_originals: bool = False,
dxf_version='AC1024',
disambiguate_func: Callable[[Iterable[Pattern]], None] = None,
dxf_version: str = 'AC1024',
) -> None:
"""
Write a `Pattern` to a DXF file, by first calling `.polygonize()` to change the shapes
into polygons, and then writing patterns as DXF `Block`s, polygons as `LWPolyline`s,
and subpatterns as `Insert`s.
and refs as `Insert`s.
The top level pattern's name is not written to the DXF file. Nested patterns keep their
names.
@ -49,60 +55,61 @@ def write(
tuple: (1, 2) -> '1.2'
str: '1.2' -> '1.2' (no change)
It is often a good idea to run `pattern.subpatternize()` prior to calling this function,
especially if calling `.polygonize()` will result in very many vertices.
DXF does not support shape repetition (only block repetition). Please call
library.wrap_repeated_shapes() before writing to file.
If you want pattern polygonized with non-default arguments, just call `pattern.polygonize()`
prior to calling this function.
Other functions you may want to call:
- `masque.file.oasis.check_valid_names(library.keys())` to check for invalid names
- `library.dangling_refs()` to check for references to missing patterns
- `pattern.polygonize()` for any patterns with shapes other
than `masque.shapes.Polygon` or `masque.shapes.Path`
Only `Grid` repetition objects with manhattan basis vectors are preserved as arrays. Since DXF
rotations apply to basis vectors while `masque`'s rotations do not, the basis vectors of an
array with rotated instances must be manhattan _after_ having a compensating rotation applied.
Args:
patterns: A Pattern or list of patterns to write to the stream.
library: A {name: Pattern} mapping of patterns. Only `top_name` and patterns referenced
by it are written.
top_name: Name of the top-level pattern to write.
stream: Stream object to write to.
modify_original: If `True`, the original pattern is modified as part of the writing
process. Otherwise, a copy is made and `deepunlock()`-ed.
Default `False`.
disambiguate_func: Function which takes a list of patterns and alters them
to make their names valid and unique. Default is `disambiguate_pattern_names`.
WARNING: No additional error checking is performed on the results.
"""
#TODO consider supporting DXF arcs?
if disambiguate_func is None:
disambiguate_func = lambda pats: disambiguate_pattern_names(pats)
assert(disambiguate_func is not None)
if not isinstance(library, ILibraryView):
if isinstance(library, dict):
library = LibraryView(library)
else:
library = LibraryView(dict(library))
if not modify_originals:
pattern = pattern.deepcopy().deepunlock()
# Get a dict of id(pattern) -> pattern
patterns_by_id = pattern.referenced_patterns_by_id()
disambiguate_func(patterns_by_id.values())
pattern = library[top_name]
subtree = library.subtree(top_name)
# Create library
lib = ezdxf.new(dxf_version, setup=True)
msp = lib.modelspace()
_shapes_to_elements(msp, pattern.shapes)
_labels_to_texts(msp, pattern.labels)
_subpatterns_to_refs(msp, pattern.subpatterns)
_mrefs_to_drefs(msp, pattern.refs)
# Now create a block for each referenced pattern, and add in any shapes
for pat in patterns_by_id.values():
assert(pat is not None)
block = lib.blocks.new(name=pat.name)
for name, pat in subtree.items():
assert pat is not None
if name == top_name:
continue
block = lib.blocks.new(name=name)
_shapes_to_elements(block, pat.shapes)
_labels_to_texts(block, pat.labels)
_subpatterns_to_refs(block, pat.subpatterns)
_mrefs_to_drefs(block, pat.refs)
lib.write(stream)
def writefile(
pattern: Pattern,
filename: Union[str, pathlib.Path],
library: Mapping[str, Pattern],
top_name: str,
filename: str | pathlib.Path,
*args,
**kwargs,
) -> None:
@ -112,30 +119,42 @@ def writefile(
Will automatically compress the file if it has a .gz suffix.
Args:
pattern: `Pattern` to save
library: A {name: Pattern} mapping of patterns. Only `top_name` and patterns referenced
by it are written.
top_name: Name of the top-level pattern to write.
filename: Filename to save to.
*args: passed to `dxf.write`
**kwargs: passed to `dxf.write`
"""
path = pathlib.Path(filename)
if path.suffix == '.gz':
open_func: Callable = gzip.open
else:
open_func = open
with open_func(path, mode='wt') as stream:
write(pattern, stream, *args, **kwargs)
gz_stream: IO[bytes]
with tmpfile(path) as base_stream:
streams: tuple[Any, ...] = (base_stream,)
if path.suffix == '.gz':
gz_stream = cast(IO[bytes], gzip.GzipFile(filename='', mtime=0, fileobj=base_stream, mode='wb'))
streams = (gz_stream,) + streams
else:
gz_stream = base_stream
stream = io.TextIOWrapper(gz_stream) # type: ignore
streams = (stream,) + streams
try:
write(library, top_name, stream, *args, **kwargs)
finally:
for ss in streams:
ss.close()
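# Usage sketch (illustrative; `lib` is a {name: Pattern} mapping containing a cell named
# 'top', and the output filename is arbitrary):
#
#     writefile(lib, 'top', 'layout.dxf.gz')    # gzipped automatically because of the .gz suffix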
def readfile(
filename: Union[str, pathlib.Path],
filename: str | pathlib.Path,
*args,
**kwargs,
) -> Tuple[Pattern, Dict[str, Any]]:
) -> tuple[Library, dict[str, Any]]:
"""
Wrapper for `dxf.read()` that takes a filename or path instead of a stream.
Will automatically decompress files with a .gz suffix.
Will automatically decompress gzipped files.
Args:
filename: Filename to save to.
@ -143,7 +162,7 @@ def readfile(
**kwargs: passed to `dxf.read`
"""
path = pathlib.Path(filename)
if path.suffix == '.gz':
if is_gzipped(path):
open_func: Callable = gzip.open
else:
open_func = open
@ -154,21 +173,17 @@ def readfile(
def read(
stream: io.TextIOBase,
clean_vertices: bool = True,
) -> Tuple[Pattern, Dict[str, Any]]:
stream: TextIO,
) -> tuple[Library, dict[str, Any]]:
"""
Read a dxf file and translate it into a dict of `Pattern` objects. DXF `Block`s are
translated into `Pattern` objects; `LWPolyline`s are translated into polygons, and `Insert`s
are translated into `SubPattern` objects.
are translated into `Ref` objects.
If an object has no layer it is set to this module's `DEFAULT_LAYER` ("DEFAULT").
Args:
stream: Stream to read from.
clean_vertices: If `True`, remove any redundant vertices when loading polygons.
The cleaning process removes any polygons with zero area or <3 vertices.
Default `True`.
Returns:
- Top level pattern
@ -176,163 +191,165 @@ def read(
lib = ezdxf.read(stream)
msp = lib.modelspace()
pat = _read_block(msp, clean_vertices)
patterns = [pat] + [_read_block(bb, clean_vertices) for bb in lib.blocks if bb.name != '*Model_Space']
top_name, top_pat = _read_block(msp)
mlib = Library({top_name: top_pat})
for bb in lib.blocks:
if bb.name == '*Model_Space':
continue
name, pat = _read_block(bb)
mlib[name] = pat
# Create a dict of {pattern.name: pattern, ...}, then fix up all subpattern.pattern entries
# according to the subpattern.identifier (which is deleted after use).
patterns_dict = dict(((p.name, p) for p in patterns))
for p in patterns_dict.values():
for sp in p.subpatterns:
sp.pattern = patterns_dict[sp.identifier[0]]
del sp.identifier
library_info = dict(
layers=[ll.dxfattribs() for ll in lib.layers],
)
library_info = {
'layers': [ll.dxfattribs() for ll in lib.layers]
}
return pat, library_info
return mlib, library_info
def _read_block(block, clean_vertices: bool) -> Pattern:
pat = Pattern(block.name)
def _read_block(block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace) -> tuple[str, Pattern]:
name = block.name
pat = Pattern()
for element in block:
eltype = element.dxftype()
if eltype in ('POLYLINE', 'LWPOLYLINE'):
if eltype == 'LWPOLYLINE':
points = numpy.array(tuple(element.lwpoints))
else:
points = numpy.array(tuple(element.points()))
if isinstance(element, LWPolyline | Polyline):
if isinstance(element, LWPolyline):
points = numpy.asarray(element.get_points())
elif isinstance(element, Polyline):
points = numpy.asarray(element.points())[:, :2]
attr = element.dxfattribs()
layer = attr.get('layer', DEFAULT_LAYER)
if points.shape[1] == 2:
raise PatternError('Invalid or unimplemented polygon?')
#shape = Polygon(layer=layer)
elif points.shape[1] > 2:
if points.shape[1] > 2:
if (points[0, 2] != points[:, 2]).any():
raise PatternError('PolyLine has non-constant width (not yet representable in masque!)')
elif points.shape[1] == 4 and (points[:, 3] != 0).any():
if points.shape[1] == 4 and (points[:, 3] != 0).any():
raise PatternError('LWPolyLine has bulge (not yet representable in masque!)')
width = points[0, 2]
if width == 0:
width = attr.get('const_width', 0)
shape: Union[Path, Polygon]
shape: Path | Polygon
if width == 0 and len(points) > 2 and numpy.array_equal(points[0], points[-1]):
shape = Polygon(layer=layer, vertices=points[:-1, :2])
shape = Polygon(vertices=points[:-1, :2])
else:
shape = Path(layer=layer, width=width, vertices=points[:, :2])
shape = Path(width=width, vertices=points[:, :2])
if clean_vertices:
try:
shape.clean_vertices()
except PatternError:
continue
pat.shapes[layer].append(shape)
pat.shapes.append(shape)
elif eltype in ('TEXT',):
args = {'offset': numpy.array(element.get_pos()[1])[:2],
'layer': element.dxfattribs().get('layer', DEFAULT_LAYER),
}
elif isinstance(element, Text):
args = dict(
offset=numpy.asarray(element.get_placement()[1])[:2],
layer=element.dxfattribs().get('layer', DEFAULT_LAYER),
)
string = element.dxfattribs().get('text', '')
# height = element.dxfattribs().get('height', 0)
# if height != 0:
# logger.warning('Interpreting DXF TEXT as a label despite nonzero height. '
# 'This could be changed in the future by setting a font path in the masque DXF code.')
pat.labels.append(Label(string=string, **args))
pat.label(string=string, **args)
# else:
# pat.shapes.append(Text(string=string, height=height, font_path=????))
elif eltype in ('INSERT',):
# pat.shapes[args['layer']].append(Text(string=string, height=height, font_path=????))
elif isinstance(element, Insert):
attr = element.dxfattribs()
xscale = attr.get('xscale', 1)
yscale = attr.get('yscale', 1)
if abs(xscale) != abs(yscale):
logger.warning('Masque does not support per-axis scaling; using x-scaling only!')
scale = abs(xscale)
mirrored = (yscale < 0, xscale < 0)
rotation = numpy.deg2rad(attr.get('rotation', 0))
mirrored, extra_angle = normalize_mirror((yscale < 0, xscale < 0))
rotation = numpy.deg2rad(attr.get('rotation', 0)) + extra_angle
offset = numpy.array(attr.get('insert', (0, 0, 0)))[:2]
offset = numpy.asarray(attr.get('insert', (0, 0, 0)))[:2]
args = {
'offset': offset,
'scale': scale,
'mirrored': mirrored,
'rotation': rotation,
'pattern': None,
'identifier': (attr.get('name', None),),
}
args = dict(
target=attr.get('name', None),
offset=offset,
scale=scale,
mirrored=mirrored,
rotation=rotation,
)
if 'column_count' in attr:
args['repetition'] = Grid(a_vector=(attr['column_spacing'], 0),
args['repetition'] = Grid(
a_vector=(attr['column_spacing'], 0),
b_vector=(0, attr['row_spacing']),
a_count=attr['column_count'],
b_count=attr['row_count'])
pat.subpatterns.append(SubPattern(**args))
b_count=attr['row_count'],
)
pat.ref(**args)
else:
logger.warning(f'Ignoring DXF element {element.dxftype()} (not implemented).')
return pat
return name, pat
def _subpatterns_to_refs(
block: Union[ezdxf.layouts.BlockLayout, ezdxf.layouts.Modelspace],
subpatterns: List[SubPattern],
def _mrefs_to_drefs(
block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace,
refs: dict[str | None, list[Ref]],
) -> None:
for subpat in subpatterns:
if subpat.pattern is None:
continue
encoded_name = subpat.pattern.name
def mk_blockref(encoded_name: str, ref: Ref) -> None:
rotation = numpy.rad2deg(ref.rotation) % 360
attribs = dict(
xscale=ref.scale,
yscale=ref.scale * (-1 if ref.mirrored else 1),
rotation=rotation,
)
rotation = (subpat.rotation * 180 / numpy.pi) % 360
attribs = {
'xscale': subpat.scale * (-1 if subpat.mirrored[1] else 1),
'yscale': subpat.scale * (-1 if subpat.mirrored[0] else 1),
'rotation': rotation,
}
rep = subpat.repetition
rep = ref.repetition
if rep is None:
block.add_blockref(encoded_name, subpat.offset, dxfattribs=attribs)
block.add_blockref(encoded_name, ref.offset, dxfattribs=attribs)
elif isinstance(rep, Grid):
a = rep.a_vector
b = rep.b_vector if rep.b_vector is not None else numpy.zeros(2)
rotated_a = rotation_matrix_2d(-subpat.rotation) @ a
rotated_b = rotation_matrix_2d(-subpat.rotation) @ b
rotated_a = rotation_matrix_2d(-ref.rotation) @ a
rotated_b = rotation_matrix_2d(-ref.rotation) @ b
if rotated_a[1] == 0 and rotated_b[0] == 0:
attribs['column_count'] = rep.a_count
attribs['row_count'] = rep.b_count
attribs['column_spacing'] = rotated_a[0]
attribs['row_spacing'] = rotated_b[1]
block.add_blockref(encoded_name, subpat.offset, dxfattribs=attribs)
block.add_blockref(encoded_name, ref.offset, dxfattribs=attribs)
elif rotated_a[0] == 0 and rotated_b[1] == 0:
attribs['column_count'] = rep.b_count
attribs['row_count'] = rep.a_count
attribs['column_spacing'] = rotated_b[0]
attribs['row_spacing'] = rotated_a[1]
block.add_blockref(encoded_name, subpat.offset, dxfattribs=attribs)
block.add_blockref(encoded_name, ref.offset, dxfattribs=attribs)
else:
#NOTE: We could still do non-manhattan (but still orthogonal) grids by getting
# creative with counter-rotated nested patterns, but probably not worth it.
# Instead, just break apart the grid into individual elements:
for dd in rep.displacements:
block.add_blockref(encoded_name, subpat.offset + dd, dxfattribs=attribs)
block.add_blockref(encoded_name, ref.offset + dd, dxfattribs=attribs)
else:
for dd in rep.displacements:
block.add_blockref(encoded_name, subpat.offset + dd, dxfattribs=attribs)
block.add_blockref(encoded_name, ref.offset + dd, dxfattribs=attribs)
for target, rseq in refs.items():
if target is None:
continue
for ref in rseq:
mk_blockref(target, ref)
def _shapes_to_elements(
block: Union[ezdxf.layouts.BlockLayout, ezdxf.layouts.Modelspace],
shapes: List[Shape],
polygonize_paths: bool = False,
block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace,
shapes: dict[layer_t, list[Shape]],
) -> None:
# Add `LWPolyline`s for each shape.
# Could also do paths with a width setting, but we'd need to consider endcaps.
for shape in shapes:
attribs = {'layer': _mlayer2dxf(shape.layer)}
# TODO: can DXF do paths?
for layer, sseq in shapes.items():
attribs = dict(layer=_mlayer2dxf(layer))
for shape in sseq:
if shape.repetition is not None:
raise PatternError(
'Shape repetitions are not supported by DXF.'
' Please call library.wrap_repeated_shapes() before writing to file.'
)
for polygon in shape.to_polygons():
xy_open = polygon.vertices + polygon.offset
xy_closed = numpy.vstack((xy_open, xy_open[0, :]))
@ -340,13 +357,17 @@ def _shapes_to_elements(
def _labels_to_texts(
block: Union[ezdxf.layouts.BlockLayout, ezdxf.layouts.Modelspace],
labels: List[Label],
block: ezdxf.layouts.BlockLayout | ezdxf.layouts.Modelspace,
labels: dict[layer_t, list[Label]],
) -> None:
for label in labels:
attribs = {'layer': _mlayer2dxf(label.layer)}
for layer, lseq in labels.items():
attribs = dict(layer=_mlayer2dxf(layer))
for label in lseq:
xy = label.offset
block.add_text(label.string, dxfattribs=attribs).set_pos(xy, align='BOTTOM_LEFT')
block.add_text(
label.string,
dxfattribs=attribs
).set_placement(xy, align=TextEntityAlignment.BOTTOM_LEFT)
def _mlayer2dxf(layer: layer_t) -> str:
@ -357,40 +378,3 @@ def _mlayer2dxf(layer: layer_t) -> str:
if isinstance(layer, tuple):
return f'{layer[0]}.{layer[1]}'
raise PatternError(f'Unknown layer type: {layer} ({type(layer)})')
def disambiguate_pattern_names(
patterns: Iterable[Pattern],
max_name_length: int = 32,
suffix_length: int = 6,
dup_warn_filter: Callable[[str], bool] = None, # If returns False, don't warn about this name
) -> None:
used_names = []
for pat in patterns:
sanitized_name = re.compile(r'[^A-Za-z0-9_\?\$]').sub('_', pat.name)
i = 0
suffixed_name = sanitized_name
while suffixed_name in used_names or suffixed_name == '':
suffix = base64.b64encode(struct.pack('>Q', i), b'$?').decode('ASCII')
suffixed_name = sanitized_name + '$' + suffix[:-1].lstrip('A')
i += 1
if sanitized_name == '':
logger.warning(f'Empty pattern name saved as "{suffixed_name}"')
elif suffixed_name != sanitized_name:
if dup_warn_filter is None or dup_warn_filter(pat.name):
logger.warning(f'Pattern name "{pat.name}" ({sanitized_name}) appears multiple times;\n'
+ f' renaming to "{suffixed_name}"')
if len(suffixed_name) == 0:
# Should never happen since zero-length names are replaced
raise PatternError(f'Zero-length name after sanitize,\n originally "{pat.name}"')
if len(suffixed_name) > max_name_length:
raise PatternError(f'Pattern name "{suffixed_name!r}" length > {max_name_length} after encode,\n'
+ f' originally "{pat.name}"')
pat.name = suffixed_name
used_names.append(suffixed_name)


@ -16,31 +16,31 @@ Notes:
* PLEX is not supported
* ELFLAGS are not supported
* GDS does not support library- or structure-level annotations
* Creation/modification/access times are set to 1900-01-01 for reproducibility.
* GDS creation/modification/access times are set to 1900-01-01 for reproducibility.
* Gzip modification time is set to 0 (start of current epoch, usually 1970-01-01)
"""
from typing import List, Any, Dict, Tuple, Callable, Union, Iterable, Optional
from typing import Sequence, BinaryIO
import re
from typing import IO, cast, Any
from collections.abc import Iterable, Mapping, Callable
import io
import mmap
import copy
import base64
import struct
import logging
import pathlib
import gzip
import string
from pprint import pformat
import numpy
from numpy.typing import NDArray, ArrayLike
from numpy.typing import ArrayLike, NDArray
import klamath
from klamath import records
from .utils import is_gzipped
from .. import Pattern, SubPattern, PatternError, Label, Shape
from .utils import is_gzipped, tmpfile
from .. import Pattern, Ref, PatternError, LibraryError, Label, Shape
from ..shapes import Polygon, Path
from ..repetition import Grid
from ..utils import layer_t, normalize_mirror, annotations_t
from ..library import Library
from ..utils import layer_t, annotations_t
from ..library import LazyLibrary, Library, ILibrary, ILibraryView
logger = logging.getLogger(__name__)
@ -53,20 +53,21 @@ path_cap_map = {
}
def rint_cast(val: ArrayLike) -> NDArray[numpy.int32]:
return numpy.rint(val).astype(numpy.int32)
def write(
patterns: Union[Pattern, Sequence[Pattern]],
stream: BinaryIO,
library: Mapping[str, Pattern],
stream: IO[bytes],
meters_per_unit: float,
logical_units_per_unit: float = 1,
library_name: str = 'masque-klamath',
*,
modify_originals: bool = False,
disambiguate_func: Callable[[Iterable[Pattern]], None] = None,
) -> None:
"""
Convert a `Pattern` or list of patterns to a GDSII stream, and then mapping data as follows:
Convert a library to a GDSII stream, mapping data as follows:
Pattern -> GDSII structure
SubPattern -> GDSII SREF or AREF
Ref -> GDSII SREF or AREF
Path -> GSDII path
Shape (other than path) -> GDSII boundary/ies
Label -> GDSII text
@ -78,14 +79,17 @@ def write(
datatype is chosen to be `shape.layer[1]` if available,
otherwise `0`
It is often a good idea to run `pattern.subpatternize()` prior to calling this function,
especially if calling `.polygonize()` will result in very many vertices.
GDS does not support shape repetition (only cell repetition). Please call
`library.wrap_repeated_shapes()` before writing to file.
If you want pattern polygonized with non-default arguments, just call `pattern.polygonize()`
prior to calling this function.
Other functions you may want to call:
- `masque.file.gdsii.check_valid_names(library.keys())` to check for invalid names
- `library.dangling_refs()` to check for references to missing patterns
- `pattern.polygonize()` for any patterns with shapes other
than `masque.shapes.Polygon` or `masque.shapes.Path`
Args:
patterns: A Pattern or list of patterns to convert.
library: A {name: Pattern} mapping of patterns to write.
meters_per_unit: Written into the GDSII file, meters per (database) length unit.
All distances are assumed to be an integer multiple of this unit, and are stored as such.
logical_units_per_unit: Written into the GDSII file. Allows the GDSII to specify a
@ -93,54 +97,35 @@ def write(
Default `1`.
library_name: Library name written into the GDSII file.
Default 'masque-klamath'.
modify_originals: If `True`, the original pattern is modified as part of the writing
process. Otherwise, a copy is made and `deepunlock()`-ed.
Default `False`.
disambiguate_func: Function which takes a list of patterns and alters them
to make their names valid and unique. Default is `disambiguate_pattern_names`, which
attempts to adhere to the GDSII standard as well as possible.
WARNING: No additional error checking is performed on the results.
"""
if isinstance(patterns, Pattern):
patterns = [patterns]
if disambiguate_func is None:
disambiguate_func = disambiguate_pattern_names # type: ignore
assert(disambiguate_func is not None) # placate mypy
if not modify_originals:
patterns = [p.deepunlock() for p in copy.deepcopy(patterns)]
patterns = [p.wrap_repeated_shapes() for p in patterns]
if not isinstance(library, ILibrary):
if isinstance(library, dict):
library = Library(library)
else:
library = Library(dict(library))
# Create library
header = klamath.library.FileHeader(name=library_name.encode('ASCII'),
header = klamath.library.FileHeader(
name=library_name.encode('ASCII'),
user_units_per_db_unit=logical_units_per_unit,
meters_per_db_unit=meters_per_unit)
meters_per_db_unit=meters_per_unit,
)
header.write(stream)
# Get a dict of id(pattern) -> pattern
patterns_by_id = {id(pattern): pattern for pattern in patterns}
for pattern in patterns:
for i, p in pattern.referenced_patterns_by_id().items():
patterns_by_id[i] = p
disambiguate_func(patterns_by_id.values())
# Now create a structure for each pattern, and add in any Boundary and SREF elements
for pat in patterns_by_id.values():
elements: List[klamath.elements.Element] = []
for name, pat in library.items():
elements: list[klamath.elements.Element] = []
elements += _shapes_to_elements(pat.shapes)
elements += _labels_to_texts(pat.labels)
elements += _subpatterns_to_refs(pat.subpatterns)
elements += _mrefs_to_grefs(pat.refs)
klamath.library.write_struct(stream, name=pat.name.encode('ASCII'), elements=elements)
klamath.library.write_struct(stream, name=name.encode('ASCII'), elements=elements)
records.ENDLIB.write(stream, None)
def writefile(
patterns: Union[Sequence[Pattern], Pattern],
filename: Union[str, pathlib.Path],
library: Mapping[str, Pattern],
filename: str | pathlib.Path,
*args,
**kwargs,
) -> None:
@ -150,26 +135,33 @@ def writefile(
Will automatically compress the file if it has a .gz suffix.
Args:
patterns: `Pattern` or list of patterns to save
library: {name: Pattern} pairs to save.
filename: Filename to save to.
*args: passed to `write()`
**kwargs: passed to `write()`
"""
path = pathlib.Path(filename)
if path.suffix == '.gz':
open_func: Callable = gzip.open
else:
open_func = open
with io.BufferedWriter(open_func(path, mode='wb')) as stream:
write(patterns, stream, *args, **kwargs)
with tmpfile(path) as base_stream:
streams: tuple[Any, ...] = (base_stream,)
if path.suffix == '.gz':
stream = cast(IO[bytes], gzip.GzipFile(filename='', mtime=0, fileobj=base_stream, mode='wb', compresslevel=6))
streams = (stream,) + streams
else:
stream = base_stream
try:
write(library, stream, *args, **kwargs)
finally:
for ss in streams:
ss.close()
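For reference, a minimal usage sketch of the library-based writer above (a sketch only: the cell name, geometry, and 1 nm database unit are illustrative, not taken from this diff):

# Hypothetical usage of gdsii.writefile() as defined above; names are illustrative.
from masque import Library, Pattern
from masque.file import gdsii

mylib = Library()
main = Pattern()
main.polygon(layer=(1, 0), offset=(0, 0),
             vertices=[(0, 0), (1000, 0), (1000, 500), (0, 500)])
mylib['main'] = main

# A '.gz' suffix selects the reproducible (mtime=0) gzip path shown above.
gdsii.writefile(mylib, 'out.gds.gz', meters_per_unit=1e-9)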
def readfile(
filename: Union[str, pathlib.Path],
filename: str | pathlib.Path,
*args,
**kwargs,
) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
) -> tuple[Library, dict[str, Any]]:
"""
Wrapper for `read()` that takes a filename or path instead of a stream.
@ -186,19 +178,20 @@ def readfile(
else:
open_func = open
with io.BufferedReader(open_func(path, mode='rb')) as stream:
with open_func(path, mode='rb') as stream:
results = read(stream, *args, **kwargs)
return results
def read(
stream: BinaryIO,
stream: IO[bytes],
raw_mode: bool = True,
) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
) -> tuple[Library, dict[str, Any]]:
"""
# TODO check GDSII file for cycles!
Read a gdsii file and translate it into a dict of Pattern objects. GDSII structures are
translated into Pattern objects; boundaries are translated into polygons, and srefs and arefs
are translated into SubPattern objects.
are translated into Ref objects.
Additional library info is returned in a dict, containing:
'name': name of the library
@ -211,31 +204,23 @@ def read(
raw_mode: If True, constructs shapes in raw mode, bypassing most data validation. Default `True`.
Returns:
- Dict of pattern_name:Patterns generated from GDSII structures
- Dict of GDSII library info
- dict of pattern_name:Patterns generated from GDSII structures
- dict of GDSII library info
"""
library_info = _read_header(stream)
patterns = []
mlib = Library()
found_struct = records.BGNSTR.skip_past(stream)
while found_struct:
name = records.STRNAME.skip_and_read(stream)
pat = read_elements(stream, name=name.decode('ASCII'), raw_mode=raw_mode)
patterns.append(pat)
pat = read_elements(stream, raw_mode=raw_mode)
mlib[name.decode('ASCII')] = pat
found_struct = records.BGNSTR.skip_past(stream)
# Create a dict of {pattern.name: pattern, ...}, then fix up all subpattern.pattern entries
# according to the subpattern.identifier (which is deleted after use).
patterns_dict = dict(((p.name, p) for p in patterns))
for p in patterns_dict.values():
for sp in p.subpatterns:
sp.pattern = patterns_dict[sp.identifier[0]]
del sp.identifier
return patterns_dict, library_info
return mlib, library_info
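A corresponding read-side sketch (file name is illustrative; the info keys follow the docstring above):

from masque.file import gdsii

mlib, info = gdsii.readfile('out.gds.gz')      # returns (Library, dict)
print(info['name'], info['meters_per_unit'])
for name, pat in mlib.items():
    n_shapes = sum(len(ss) for ss in pat.shapes.values())   # shapes are now keyed by layer
    print(name, n_shapes)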
def _read_header(stream: BinaryIO) -> Dict[str, Any]:
def _read_header(stream: IO[bytes]) -> dict[str, Any]:
"""
Read the file header and create the library_info dict.
"""
@ -249,8 +234,7 @@ def _read_header(stream: BinaryIO) -> Dict[str, Any]:
def read_elements(
stream: BinaryIO,
name: str,
stream: IO[bytes],
raw_mode: bool = True,
) -> Pattern:
"""
@ -265,28 +249,30 @@ def read_elements(
Returns:
A pattern containing the elements that were read.
"""
pat = Pattern(name)
pat = Pattern()
elements = klamath.library.read_elements(stream)
for element in elements:
if isinstance(element, klamath.elements.Boundary):
poly = _boundary_to_polygon(element, raw_mode)
pat.shapes.append(poly)
layer, poly = _boundary_to_polygon(element, raw_mode)
pat.shapes[layer].append(poly)
elif isinstance(element, klamath.elements.Path):
path = _gpath_to_mpath(element, raw_mode)
pat.shapes.append(path)
layer, path = _gpath_to_mpath(element, raw_mode)
pat.shapes[layer].append(path)
elif isinstance(element, klamath.elements.Text):
label = Label(offset=element.xy.astype(float),
pat.label(
layer=element.layer,
offset=element.xy.astype(float),
string=element.string.decode('ASCII'),
annotations=_properties_to_annotations(element.properties))
pat.labels.append(label)
annotations=_properties_to_annotations(element.properties),
)
elif isinstance(element, klamath.elements.Reference):
pat.subpatterns.append(_ref_to_subpat(element))
target, ref = _gref_to_mref(element)
pat.refs[target].append(ref)
return pat
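To illustrate the keyed containers that read_elements() fills above, a small construction sketch (layer and target names are made up; constructor arguments follow this diff):

from masque import Pattern, Ref
from masque.shapes import Polygon

pat = Pattern()
# Shapes and refs are grouped by layer / target rather than carrying a .layer attribute.
pat.shapes[(2, 0)].append(Polygon(vertices=[(0, 0), (10, 0), (10, 10)], offset=(0, 0)))
pat.refs['unit_cell'].append(Ref(offset=(100, 0), rotation=0, scale=1, mirrored=False))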
def _mlayer2gds(mlayer: layer_t) -> Tuple[int, int]:
def _mlayer2gds(mlayer: layer_t) -> tuple[int, int]:
""" Helper to turn a layer tuple-or-int into a layer and datatype"""
if isinstance(mlayer, int):
layer = mlayer
@ -302,10 +288,9 @@ def _mlayer2gds(mlayer: layer_t) -> Tuple[int, int]:
return layer, data_type
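Worked examples of the mapping above (illustrative values):

#   _mlayer2gds(5)       -> (5, 0)    bare int: datatype defaults to 0
#   _mlayer2gds((7, 2))  -> (7, 2)    (layer, datatype) tuple passed through
#   string layers (e.g. 'metal1') are expected to be rejected with PatternError,
#   since GDS layers must be numeric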
def _ref_to_subpat(ref: klamath.library.Reference) -> SubPattern:
def _gref_to_mref(ref: klamath.library.Reference) -> tuple[str, Ref]:
"""
Helper function to create a SubPattern from an SREF or AREF. Sets subpat.pattern to None
and sets the instance .identifier to (struct_name,).
Helper function to create a Ref from an SREF or AREF. Sets ref.target to struct_name.
"""
xy = ref.xy.astype(float)
offset = xy[0]
@ -317,25 +302,26 @@ def _ref_to_subpat(ref: klamath.library.Reference) -> SubPattern:
repetition = Grid(a_vector=a_vector, b_vector=b_vector,
a_count=a_count, b_count=b_count)
subpat = SubPattern(pattern=None,
target = ref.struct_name.decode('ASCII')
mref = Ref(
offset=offset,
rotation=numpy.deg2rad(ref.angle_deg),
scale=ref.mag,
mirrored=(ref.invert_y, False),
mirrored=ref.invert_y,
annotations=_properties_to_annotations(ref.properties),
repetition=repetition)
subpat.identifier = (ref.struct_name.decode('ASCII'),)
return subpat
repetition=repetition,
)
return target, mref
def _gpath_to_mpath(gpath: klamath.library.Path, raw_mode: bool) -> Path:
def _gpath_to_mpath(gpath: klamath.library.Path, raw_mode: bool) -> tuple[layer_t, Path]:
if gpath.path_type in path_cap_map:
cap = path_cap_map[gpath.path_type]
else:
raise PatternError(f'Unrecognized path type: {gpath.path_type}')
mpath = Path(vertices=gpath.xy.astype(float),
layer=gpath.layer,
mpath = Path(
vertices=gpath.xy.astype(float),
width=gpath.width,
cap=cap,
offset=numpy.zeros(2),
@ -344,81 +330,87 @@ def _gpath_to_mpath(gpath: klamath.library.Path, raw_mode: bool) -> Path:
)
if cap == Path.Cap.SquareCustom:
mpath.cap_extensions = gpath.extension
return mpath
return gpath.layer, mpath
def _boundary_to_polygon(boundary: klamath.library.Boundary, raw_mode: bool) -> Polygon:
return Polygon(vertices=boundary.xy[:-1].astype(float),
layer=boundary.layer,
def _boundary_to_polygon(boundary: klamath.library.Boundary, raw_mode: bool) -> tuple[layer_t, Polygon]:
return boundary.layer, Polygon(
vertices=boundary.xy[:-1].astype(float),
offset=numpy.zeros(2),
annotations=_properties_to_annotations(boundary.properties),
raw=raw_mode,
)
def _subpatterns_to_refs(subpatterns: List[SubPattern]) -> List[klamath.library.Reference]:
refs = []
for subpat in subpatterns:
if subpat.pattern is None:
def _mrefs_to_grefs(refs: dict[str | None, list[Ref]]) -> list[klamath.library.Reference]:
grefs = []
for target, rseq in refs.items():
if target is None:
continue
encoded_name = subpat.pattern.name.encode('ASCII')
# Note: GDS mirrors first and rotates second
mirror_across_x, extra_angle = normalize_mirror(subpat.mirrored)
rep = subpat.repetition
angle_deg = numpy.rad2deg(subpat.rotation + extra_angle) % 360
properties = _annotations_to_properties(subpat.annotations, 512)
encoded_name = target.encode('ASCII')
for ref in rseq:
# Note: GDS also mirrors first and rotates second
rep = ref.repetition
angle_deg = numpy.rad2deg(ref.rotation) % 360
properties = _annotations_to_properties(ref.annotations, 512)
if isinstance(rep, Grid):
b_vector = rep.b_vector if rep.b_vector is not None else numpy.zeros(2)
b_count = rep.b_count if rep.b_count is not None else 1
xy: NDArray[numpy.float64] = numpy.array(subpat.offset) + [
[0, 0],
xy = numpy.asarray(ref.offset) + numpy.array([
[0.0, 0.0],
rep.a_vector * rep.a_count,
b_vector * b_count,
]
aref = klamath.library.Reference(struct_name=encoded_name,
xy=numpy.round(xy).astype(int),
colrow=(numpy.round(rep.a_count), numpy.round(rep.b_count)),
])
aref = klamath.library.Reference(
struct_name=encoded_name,
xy=rint_cast(xy),
colrow=(numpy.rint(rep.a_count), numpy.rint(rep.b_count)),
angle_deg=angle_deg,
invert_y=mirror_across_x,
mag=subpat.scale,
properties=properties)
refs.append(aref)
invert_y=ref.mirrored,
mag=ref.scale,
properties=properties,
)
grefs.append(aref)
elif rep is None:
ref = klamath.library.Reference(struct_name=encoded_name,
xy=numpy.round([subpat.offset]).astype(int),
sref = klamath.library.Reference(
struct_name=encoded_name,
xy=rint_cast([ref.offset]),
colrow=None,
angle_deg=angle_deg,
invert_y=mirror_across_x,
mag=subpat.scale,
properties=properties)
refs.append(ref)
invert_y=ref.mirrored,
mag=ref.scale,
properties=properties,
)
grefs.append(sref)
else:
new_srefs = [klamath.library.Reference(struct_name=encoded_name,
xy=numpy.round([subpat.offset + dd]).astype(int),
new_srefs = [
klamath.library.Reference(
struct_name=encoded_name,
xy=rint_cast([ref.offset + dd]),
colrow=None,
angle_deg=angle_deg,
invert_y=mirror_across_x,
mag=subpat.scale,
properties=properties)
invert_y=ref.mirrored,
mag=ref.scale,
properties=properties,
)
for dd in rep.displacements]
refs += new_srefs
return refs
grefs += new_srefs
return grefs
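As a worked example of the AREF branch above (numbers are illustrative):

# Grid(a_vector=(10, 0), a_count=3, b_vector=(0, 20), b_count=2) placed at ref.offset == (5, 5)
#   xy     -> [(5, 5), (5 + 10*3, 5), (5, 5 + 20*2)] == [(5, 5), (35, 5), (5, 45)]
#   colrow -> (3, 2)
# i.e. GDS stores the origin plus the two far-corner lattice points, not the per-element pitch.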
def _properties_to_annotations(properties: Dict[int, bytes]) -> annotations_t:
def _properties_to_annotations(properties: dict[int, bytes]) -> annotations_t:
return {str(k): [v.decode()] for k, v in properties.items()}
def _annotations_to_properties(annotations: annotations_t, max_len: int = 126) -> Dict[int, bytes]:
def _annotations_to_properties(annotations: annotations_t, max_len: int = 126) -> dict[int, bytes]:
cum_len = 0
props = {}
for key, vals in annotations.items():
try:
i = int(key)
except ValueError:
raise PatternError(f'Annotation key {key} is not convertible to an integer')
except ValueError as err:
raise PatternError(f'Annotation key {key} is not convertible to an integer') from err
if not (0 < i < 126):
raise PatternError(f'Annotation key {key} converts to {i} (must be in the range [1,125])')
@ -434,138 +426,93 @@ def _annotations_to_properties(annotations: annotations_t, max_len: int = 126) -
def _shapes_to_elements(
shapes: List[Shape],
shapes: dict[layer_t, list[Shape]],
polygonize_paths: bool = False,
) -> List[klamath.elements.Element]:
elements: List[klamath.elements.Element] = []
) -> list[klamath.elements.Element]:
elements: list[klamath.elements.Element] = []
# Add a Boundary element for each shape, and Path elements if necessary
for shape in shapes:
layer, data_type = _mlayer2gds(shape.layer)
for mlayer, sseq in shapes.items():
layer, data_type = _mlayer2gds(mlayer)
for shape in sseq:
if shape.repetition is not None:
raise PatternError('Shape repetitions are not supported by GDS.'
' Please call library.wrap_repeated_shapes() before writing to file.')
properties = _annotations_to_properties(shape.annotations, 128)
if isinstance(shape, Path) and not polygonize_paths:
xy = numpy.round(shape.vertices + shape.offset).astype(int)
width = numpy.round(shape.width).astype(int)
xy = rint_cast(shape.vertices + shape.offset)
width = rint_cast(shape.width)
path_type = next(k for k, v in path_cap_map.items() if v == shape.cap) # reverse lookup
extension: Tuple[int, int]
extension: tuple[int, int]
if shape.cap == Path.Cap.SquareCustom and shape.cap_extensions is not None:
extension = tuple(shape.cap_extensions) # type: ignore
else:
extension = (0, 0)
path = klamath.elements.Path(layer=(layer, data_type),
path = klamath.elements.Path(
layer=(layer, data_type),
xy=xy,
path_type=path_type,
width=width,
width=int(width),
extension=extension,
properties=properties)
properties=properties,
)
elements.append(path)
elif isinstance(shape, Polygon):
polygon = shape
xy_closed = numpy.empty((polygon.vertices.shape[0] + 1, 2), dtype=numpy.int32)
numpy.rint(polygon.vertices + polygon.offset, out=xy_closed[:-1], casting='unsafe')
xy_closed[-1] = xy_closed[0]
boundary = klamath.elements.Boundary(layer=(layer, data_type),
boundary = klamath.elements.Boundary(
layer=(layer, data_type),
xy=xy_closed,
properties=properties)
properties=properties,
)
elements.append(boundary)
else:
for polygon in shape.to_polygons():
xy_closed = numpy.empty((polygon.vertices.shape[0] + 1, 2), dtype=numpy.int32)
numpy.rint(polygon.vertices + polygon.offset, out=xy_closed[:-1], casting='unsafe')
xy_closed[-1] = xy_closed[0]
boundary = klamath.elements.Boundary(layer=(layer, data_type),
boundary = klamath.elements.Boundary(
layer=(layer, data_type),
xy=xy_closed,
properties=properties)
properties=properties,
)
elements.append(boundary)
return elements
def _labels_to_texts(labels: List[Label]) -> List[klamath.elements.Text]:
def _labels_to_texts(labels: dict[layer_t, list[Label]]) -> list[klamath.elements.Text]:
texts = []
for label in labels:
for mlayer, lseq in labels.items():
layer, text_type = _mlayer2gds(mlayer)
for label in lseq:
properties = _annotations_to_properties(label.annotations, 128)
layer, text_type = _mlayer2gds(label.layer)
xy = numpy.round([label.offset]).astype(int)
text = klamath.elements.Text(layer=(layer, text_type),
xy = rint_cast([label.offset])
text = klamath.elements.Text(
layer=(layer, text_type),
xy=xy,
string=label.string.encode('ASCII'),
properties=properties,
presentation=0, # TODO maybe set some of these?
angle_deg=0,
invert_y=False,
width=0,
path_type=0,
mag=1)
presentation=0, # font number & alignment -- unused by us
angle_deg=0, # rotation -- unused by us
invert_y=False, # inversion -- unused by us
width=0, # stroke width -- unused by us
path_type=0, # text path endcaps, unused
mag=1, # size -- unused by us
)
texts.append(text)
return texts
def disambiguate_pattern_names(
patterns: Sequence[Pattern],
max_name_length: int = 32,
suffix_length: int = 6,
dup_warn_filter: Optional[Callable[[str], bool]] = None,
) -> None:
"""
Args:
patterns: List of patterns to disambiguate
max_name_length: Names longer than this will be truncated
suffix_length: Names which get truncated are truncated by this many extra characters. This is to
leave room for a suffix if one is necessary.
dup_warn_filter: (optional) Function for suppressing warnings about cell names changing. Receives
the cell name and returns `False` if the warning should be suppressed and `True` if it should
be displayed. Default displays all warnings.
"""
used_names = []
for pat in set(patterns):
# Shorten names which already exceed max-length
if len(pat.name) > max_name_length:
shortened_name = pat.name[:max_name_length - suffix_length]
logger.warning(f'Pattern name "{pat.name}" is too long ({len(pat.name)}/{max_name_length} chars),\n'
+ f' shortening to "{shortened_name}" before generating suffix')
else:
shortened_name = pat.name
# Remove invalid characters
sanitized_name = re.compile(r'[^A-Za-z0-9_\?\$]').sub('_', shortened_name)
# Add a suffix that makes the name unique
i = 0
suffixed_name = sanitized_name
while suffixed_name in used_names or suffixed_name == '':
suffix = base64.b64encode(struct.pack('>Q', i), b'$?').decode('ASCII')
suffixed_name = sanitized_name + '$' + suffix[:-1].lstrip('A')
i += 1
if sanitized_name == '':
logger.warning(f'Empty pattern name saved as "{suffixed_name}"')
elif suffixed_name != sanitized_name:
if dup_warn_filter is None or dup_warn_filter(pat.name):
logger.warning(f'Pattern name "{pat.name}" ({sanitized_name}) appears multiple times;\n'
+ f' renaming to "{suffixed_name}"')
# Encode into a byte-string and perform some final checks
encoded_name = suffixed_name.encode('ASCII')
if len(encoded_name) == 0:
# Should never happen since zero-length names are replaced
raise PatternError(f'Zero-length name after sanitize+encode,\n originally "{pat.name}"')
if len(encoded_name) > max_name_length:
raise PatternError(f'Pattern name "{encoded_name!r}" length > {max_name_length} after encode,\n'
+ f' originally "{pat.name}"')
pat.name = suffixed_name
used_names.append(suffixed_name)
def load_library(
stream: BinaryIO,
tag: str,
is_secondary: Optional[Callable[[str], bool]] = None,
stream: IO[bytes],
*,
full_load: bool = False,
) -> Tuple[Library, Dict[str, Any]]:
postprocess: Callable[[ILibraryView, str, Pattern], Pattern] | None = None
) -> tuple[LazyLibrary, dict[str, Any]]:
"""
Scan a GDSII stream to determine what structures are present, and create
a library from them. This enables deferred reading of structures
@ -577,33 +524,27 @@ def load_library(
The caller should leave the stream open while the library
is still in use, since the library will need to access it
in order to read the structure contents.
tag: Unique identifier that will be used to identify this data source
is_secondary: Function which takes a structure name and returns
True if the structure should only be used as a subcell
and not appear in the main Library interface.
Default always returns False.
full_load: If True, force all structures to be read immediately rather
than as-needed. Since data is read sequentially from the file,
this will be faster than using the resulting library's
`precache` method.
than as-needed. Since data is read sequentially from the file, this
will be faster than using the resulting library's `precache` method.
postprocess: If given, this function is used to post-process each
pattern *upon first load only*.
Returns:
Library object, allowing for deferred load of structures.
LazyLibrary object, allowing for deferred load of structures.
Additional library info (dict, same format as from `read`).
"""
if is_secondary is None:
def is_secondary(k: str) -> bool:
return False
assert(is_secondary is not None)
stream.seek(0)
lib = Library()
lib = LazyLibrary()
if full_load:
# Full load approach (immediately load everything)
patterns, library_info = read(stream)
for name, pattern in patterns.items():
lib.set_const(name, tag, pattern, secondary=is_secondary(name))
if postprocess is not None:
lib[name] = postprocess(lib, name, pattern)
else:
lib[name] = pattern
return lib, library_info
# Normal approach (scan and defer load)
@ -615,21 +556,23 @@ def load_library(
def mkstruct(pos: int = pos, name: str = name) -> Pattern:
stream.seek(pos)
return read_elements(stream, name, raw_mode=True)
pat = read_elements(stream, raw_mode=True)
if postprocess is not None:
pat = postprocess(lib, name, pat)
return pat
lib.set_value(name, tag, mkstruct, secondary=is_secondary(name))
lib[name] = mkstruct
return lib, library_info
def load_libraryfile(
filename: Union[str, pathlib.Path],
tag: str,
is_secondary: Optional[Callable[[str], bool]] = None,
filename: str | pathlib.Path,
*,
use_mmap: bool = True,
full_load: bool = False,
) -> Tuple[Library, Dict[str, Any]]:
postprocess: Callable[[ILibraryView, str, Pattern], Pattern] | None = None
) -> tuple[LazyLibrary, dict[str, Any]]:
"""
Wrapper for `load_library()` that takes a filename or path instead of a stream.
@ -640,31 +583,65 @@ def load_libraryfile(
Args:
filename: Filename or path to read from
tag: Unique identifier for library, see `load_library`
is_secondary: Function specifying subcells, see `load_library`
use_mmap: If `True`, will attempt to memory-map the file instead
of buffering. In the case of gzipped files, the file
is decompressed into a python `bytes` object in memory
and reopened as an `io.BytesIO` stream.
full_load: If `True`, immediately loads all data. See `load_library`.
postprocess: Passed to `load_library`
Returns:
Library object, allowing for deferred load of structures.
LazyLibrary object, allowing for deferred load of structures.
Additional library info (dict, same format as from `read`).
"""
path = pathlib.Path(filename)
stream: IO[bytes]
if is_gzipped(path):
if mmap:
if use_mmap:
logger.info('Asked to mmap a gzipped file, reading into memory instead...')
base_stream = gzip.open(path, mode='rb')
stream = io.BytesIO(base_stream.read())
gz_stream = gzip.open(path, mode='rb') # noqa: SIM115
stream = io.BytesIO(gz_stream.read()) # type: ignore
else:
base_stream = gzip.open(path, mode='rb')
stream = io.BufferedReader(base_stream)
gz_stream = gzip.open(path, mode='rb') # noqa: SIM115
stream = io.BufferedReader(gz_stream) # type: ignore
else: # noqa: PLR5501
if use_mmap:
base_stream = path.open(mode='rb', buffering=0) # noqa: SIM115
stream = mmap.mmap(base_stream.fileno(), 0, access=mmap.ACCESS_READ) # type: ignore
else:
base_stream = open(path, mode='rb')
if mmap:
stream = mmap.mmap(base_stream.fileno(), 0, access=mmap.ACCESS_READ)
else:
stream = io.BufferedReader(base_stream)
return load_library(stream, tag, is_secondary)
stream = path.open(mode='rb') # noqa: SIM115
return load_library(stream, full_load=full_load, postprocess=postprocess)
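Deferred-load sketch (cell name and postprocess hook are illustrative):

from masque.file import gdsii

lazy, info = gdsii.load_libraryfile('big_chip.gds.gz', use_mmap=True, full_load=False)
top = lazy['TOP']     # this cell is parsed from the stream only at this point
# Optionally transform each cell on first load:
# lazy2, _ = gdsii.load_libraryfile('big_chip.gds.gz',
#                                   postprocess=lambda lib, name, pat: pat)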
def check_valid_names(
names: Iterable[str],
max_length: int = 32,
) -> None:
"""
Check all provided names to see if they're valid GDSII cell names.
Args:
names: Collection of names to check
max_length: Max allowed length
"""
allowed_chars = set(string.ascii_letters + string.digits + '_?$')
bad_chars = [
name for name in names
if not set(name).issubset(allowed_chars)
]
bad_lengths = [
name for name in names
if len(name) > max_length
]
if bad_chars:
logger.error('Names contain invalid characters:\n' + pformat(bad_chars))
if bad_lengths:
logger.error(f'Names too long (>{max_length}):\n' + pformat(bad_lengths))
if bad_chars or bad_lengths:
raise LibraryError('Library contains invalid names, see log above')
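Pre-write checks suggested by the write() docstring above, as a sketch (mylib stands in for any Library; dangling_refs() is used here exactly as the docstring references it):

from masque.file import gdsii

gdsii.check_valid_names(mylib.keys())    # logs offenders and raises LibraryError if any
missing = mylib.dangling_refs()          # refs that point at cells not present in the library
if missing:
    raise RuntimeError(f'refs to missing cells: {missing}')
gdsii.writefile(mylib, 'out.gds', meters_per_unit=1e-9)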

View File

@ -1,2 +0,0 @@
# For backwards compatibility
from .gdsii import *

View File

@ -10,33 +10,36 @@ Note that OASIS references follow the same convention as `masque`,
Scaling, rotation, and mirroring apply to individual instances, not grid
vectors or offsets.
Notes:
* Gzip modification time is set to 0 (start of current epoch, usually 1970-01-01)
"""
from typing import List, Any, Dict, Tuple, Callable, Union, Sequence, Iterable, Optional
import re
import io
import copy
import base64
import struct
from typing import Any, IO, cast
from collections.abc import Sequence, Iterable, Mapping, Callable
import logging
import pathlib
import gzip
import string
from pprint import pformat
import numpy
from numpy.typing import ArrayLike, NDArray
import fatamorgana
import fatamorgana.records as fatrec
from fatamorgana.basic import PathExtensionScheme, AString, NString, PropStringReference
from .utils import clean_pattern_vertices, is_gzipped
from .. import Pattern, SubPattern, PatternError, Label, Shape
from ..shapes import Polygon, Path, Circle
from .utils import is_gzipped, tmpfile
from .. import Pattern, Ref, PatternError, LibraryError, Label, Shape
from ..library import Library, ILibrary
from ..shapes import Path, Circle
from ..repetition import Grid, Arbitrary, Repetition
from ..utils import layer_t, normalize_mirror, annotations_t
from ..utils import layer_t, annotations_t
logger = logging.getLogger(__name__)
logger.warning('OASIS support is experimental and mostly untested!')
logger.warning('OASIS support is experimental!')
path_cap_map = {
@ -45,21 +48,23 @@ path_cap_map = {
PathExtensionScheme.Arbitrary: Path.Cap.SquareCustom,
}
#TODO implement more shape types?
#TODO implement more shape types in OASIS?
def rint_cast(val: ArrayLike) -> NDArray[numpy.int64]:
return numpy.rint(val).astype(numpy.int64)
def build(
patterns: Union[Pattern, Sequence[Pattern]],
library: Mapping[str, Pattern], # NOTE: Pattern here should be treated as immutable!
units_per_micron: int,
layer_map: Optional[Dict[str, Union[int, Tuple[int, int]]]] = None,
layer_map: dict[str, int | tuple[int, int]] | None = None,
*,
modify_originals: bool = False,
disambiguate_func: Optional[Callable[[Iterable[Pattern]], None]] = None,
annotations: Optional[annotations_t] = None,
annotations: annotations_t | None = None,
) -> fatamorgana.OasisLayout:
"""
Convert a `Pattern` or list of patterns to an OASIS stream, writing patterns
as OASIS cells, subpatterns as Placement records, and other shapes and labels
mapped to equivalent record types (Polygon, Path, Circle, Text).
Convert a collection of {name: Pattern} pairs to an OASIS stream, writing patterns
as OASIS cells, refs as Placement records, and mapping other shapes and labels
to equivalent record types (Polygon, Path, Circle, Text).
Other shape types may be converted to polygons if no equivalent
record type exists (or is not implemented here yet).
@ -71,14 +76,17 @@ def build(
If a layer map is provided, layer strings will be converted
automatically, and layer names will be written to the file.
If you want pattern polygonized with non-default arguments, just call `pattern.polygonize()`
prior to calling this function.
Other functions you may want to call:
- `masque.file.oasis.check_valid_names(library.keys())` to check for invalid names
- `library.dangling_refs()` to check for references to missing patterns
- `pattern.polygonize()` for any patterns with shapes other
than `masque.shapes.Polygon`, `masque.shapes.Path`, or `masque.shapes.Circle`
Args:
patterns: A Pattern or list of patterns to convert.
library: A {name: Pattern} mapping of patterns to write.
units_per_micron: Written into the OASIS file, number of grid steps per micrometer.
All distances are assumed to be an integer multiple of the grid step, and are stored as such.
layer_map: Dictionary which translates layer names into layer numbers. If this argument is
layer_map: dictionary which translates layer names into layer numbers. If this argument is
provided, input shapes and labels are allowed to have layer names instead of numbers.
It is assumed that geometry and text share the same layer names, and each name is
assigned only to a single layer (not a range).
@ -86,31 +94,23 @@ def build(
into numbers, omit this argument, and manually generate the required
`fatamorgana.records.LayerName` entries.
Default is an empty dict (no names provided).
modify_originals: If `True`, the original pattern is modified as part of the writing
process. Otherwise, a copy is made and `deepunlock()`-ed.
Default `False`.
disambiguate_func: Function which takes a list of patterns and alters them
to make their names valid and unique. Default is `disambiguate_pattern_names`.
annotations: dictionary of key-value pairs which are saved as library-level properties
Returns:
`fatamorgana.OasisLayout`
"""
if isinstance(patterns, Pattern):
patterns = [patterns]
if not isinstance(library, ILibrary):
if isinstance(library, dict):
library = Library(library)
else:
library = Library(dict(library))
if layer_map is None:
layer_map = {}
if disambiguate_func is None:
disambiguate_func = disambiguate_pattern_names
if annotations is None:
annotations = {}
if not modify_originals:
patterns = [p.deepunlock() for p in copy.deepcopy(patterns)]
# Create library
lib = fatamorgana.OasisLayout(unit=units_per_micron, validation=None)
lib.properties = annotations_to_properties(annotations)
@ -119,44 +119,38 @@ def build(
for name, layer_num in layer_map.items():
layer, data_type = _mlayer2oas(layer_num)
lib.layers += [
fatrec.LayerName(nstring=name,
fatrec.LayerName(
nstring=name,
layer_interval=(layer, layer),
type_interval=(data_type, data_type),
is_textlayer=tt)
is_textlayer=tt,
)
for tt in (True, False)]
def layer2oas(mlayer: layer_t) -> Tuple[int, int]:
assert(layer_map is not None)
def layer2oas(mlayer: layer_t) -> tuple[int, int]:
assert layer_map is not None
layer_num = layer_map[mlayer] if isinstance(mlayer, str) else mlayer
return _mlayer2oas(layer_num)
else:
layer2oas = _mlayer2oas
# Get a dict of id(pattern) -> pattern
patterns_by_id = {id(pattern): pattern for pattern in patterns}
for pattern in patterns:
for i, p in pattern.referenced_patterns_by_id().items():
patterns_by_id[i] = p
disambiguate_func(patterns_by_id.values())
# Now create a structure for each pattern
for pat in patterns_by_id.values():
structure = fatamorgana.Cell(name=pat.name)
for name, pat in library.items():
structure = fatamorgana.Cell(name=name)
lib.cells.append(structure)
structure.properties += annotations_to_properties(pat.annotations)
structure.geometry += _shapes_to_elements(pat.shapes, layer2oas)
structure.geometry += _labels_to_texts(pat.labels, layer2oas)
structure.placements += _subpatterns_to_placements(pat.subpatterns)
structure.placements += _refs_to_placements(pat.refs)
return lib
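OASIS counterpart sketch (mylib stands in for any {name: Pattern} mapping; the unit and layer name are illustrative):

from masque.file import oasis

layout = oasis.build(mylib, units_per_micron=1000,
                     layer_map={'metal1': (1, 0)})    # lets shapes/labels use the name 'metal1'
oasis.writefile(mylib, 'out.oas.gz', units_per_micron=1000, layer_map={'metal1': (1, 0)})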
def write(
patterns: Union[Sequence[Pattern], Pattern],
stream: io.BufferedIOBase,
library: Mapping[str, Pattern], # NOTE: Pattern here should be treated as immutable!
stream: IO[bytes],
*args,
**kwargs,
) -> None:
@ -165,18 +159,18 @@ def write(
for details.
Args:
patterns: A Pattern or list of patterns to write to file.
library: A {name: Pattern} mapping of patterns to write.
stream: Stream to write to.
*args: passed to `oasis.build()`
**kwargs: passed to `oasis.build()`
"""
lib = build(patterns, *args, **kwargs)
lib = build(library, *args, **kwargs)
lib.write(stream)
def writefile(
patterns: Union[Sequence[Pattern], Pattern],
filename: Union[str, pathlib.Path],
library: Mapping[str, Pattern], # NOTE: Pattern here should be treated as immutable!
filename: str | pathlib.Path,
*args,
**kwargs,
) -> None:
@ -186,26 +180,33 @@ def writefile(
Will automatically compress the file if it has a .gz suffix.
Args:
patterns: `Pattern` or list of patterns to save
library: A {name: Pattern} mapping of patterns to write.
filename: Filename to save to.
*args: passed to `oasis.write`
**kwargs: passed to `oasis.write`
"""
path = pathlib.Path(filename)
if path.suffix == '.gz':
open_func: Callable = gzip.open
else:
open_func = open
with io.BufferedWriter(open_func(path, mode='wb')) as stream:
write(patterns, stream, *args, **kwargs)
with tmpfile(path) as base_stream:
streams: tuple[Any, ...] = (base_stream,)
if path.suffix == '.gz':
stream = cast(IO[bytes], gzip.GzipFile(filename='', mtime=0, fileobj=base_stream, mode='wb'))
streams += (stream,)
else:
stream = base_stream
try:
write(library, stream, *args, **kwargs)
finally:
for ss in streams:
ss.close()
def readfile(
filename: Union[str, pathlib.Path],
filename: str | pathlib.Path,
*args,
**kwargs,
) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
) -> tuple[Library, dict[str, Any]]:
"""
Wrapper for `oasis.read()` that takes a filename or path instead of a stream.
@ -222,19 +223,18 @@ def readfile(
else:
open_func = open
with io.BufferedReader(open_func(path, mode='rb')) as stream:
with open_func(path, mode='rb') as stream:
results = read(stream, *args, **kwargs)
return results
def read(
stream: io.BufferedIOBase,
clean_vertices: bool = True,
) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
stream: IO[bytes],
) -> tuple[Library, dict[str, Any]]:
"""
Read an OASIS file and translate it into a dict of Pattern objects. OASIS cells are
translated into Pattern objects; Polygons are translated into polygons, and Placements
are translated into SubPattern objects.
are translated into Ref objects.
Additional library info is returned in a dict, containing:
'units_per_micrometer': number of database units per micrometer (all values are in database units)
@ -243,18 +243,15 @@ def read(
Args:
stream: Stream to read from.
clean_vertices: If `True`, remove any redundant vertices when loading polygons.
The cleaning process removes any polygons with zero area or <3 vertices.
Default `True`.
Returns:
- Dict of `pattern_name`:`Pattern`s generated from OASIS cells
- Dict of OASIS library info
- dict of `pattern_name`:`Pattern`s generated from OASIS cells
- dict of OASIS library info
"""
lib = fatamorgana.OasisLayout.read(stream)
library_info: Dict[str, Any] = {
library_info: dict[str, Any] = {
'units_per_micrometer': lib.unit,
'annotations': properties_to_annotations(lib.properties, lib.propnames, lib.propstrings),
}
@ -264,72 +261,76 @@ def read(
layer_map[str(layer_name.nstring)] = layer_name
library_info['layer_map'] = layer_map
patterns = []
mlib = Library()
for cell in lib.cells:
if isinstance(cell.name, int):
cell_name = lib.cellnames[cell.name].nstring.string
else:
cell_name = cell.name.string
pat = Pattern(name=cell_name)
pat = Pattern()
for element in cell.geometry:
if isinstance(element, fatrec.XElement):
logger.warning('Skipping XElement record')
# note XELEMENT has no repetition
continue
assert(not isinstance(element.repetition, fatamorgana.ReuseRepetition))
assert not isinstance(element.repetition, fatamorgana.ReuseRepetition)
repetition = repetition_fata2masq(element.repetition)
# Switch based on element type:
if isinstance(element, fatrec.Polygon):
vertices = numpy.cumsum(numpy.vstack(((0, 0), element.get_point_list())), axis=0)
# Drop last point (`fatamorgana` returns explicitly closed list; we use implicit close)
# also need `cumsum` to convert from deltas to locations
vertices = numpy.cumsum(numpy.vstack(((0, 0), element.get_point_list()[:-1])), axis=0)
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
poly = Polygon(vertices=vertices,
pat.polygon(
vertices=vertices,
layer=element.get_layer_tuple(),
offset=element.get_xy(),
annotations=annotations,
repetition=repetition)
pat.shapes.append(poly)
repetition=repetition,
)
elif isinstance(element, fatrec.Path):
vertices = numpy.cumsum(numpy.vstack(((0, 0), element.get_point_list())), axis=0)
cap_start = path_cap_map[element.get_extension_start()[0]]
cap_end = path_cap_map[element.get_extension_end()[0]]
if cap_start != cap_end:
raise Exception('masque does not support multiple cap types on a single path.') # TODO handle multiple cap types
raise PatternError('masque does not support multiple cap types on a single path.') # TODO handle multiple cap types
cap = cap_start
path_args: Dict[str, Any] = {}
path_args: dict[str, Any] = {}
if cap == Path.Cap.SquareCustom:
path_args['cap_extensions'] = numpy.array((element.get_extension_start()[1],
element.get_extension_end()[1]))
path_args['cap_extensions'] = numpy.array((
element.get_extension_start()[1],
element.get_extension_end()[1],
))
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
path = Path(vertices=vertices,
pat.path(
vertices=vertices,
layer=element.get_layer_tuple(),
offset=element.get_xy(),
repetition=repetition,
annotations=annotations,
width=element.get_half_width() * 2,
cap=cap,
**path_args)
pat.shapes.append(path)
**path_args,
)
elif isinstance(element, fatrec.Rectangle):
width = element.get_width()
height = element.get_height()
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
rect = Polygon(layer=element.get_layer_tuple(),
pat.polygon(
layer=element.get_layer_tuple(),
offset=element.get_xy(),
repetition=repetition,
vertices=numpy.array(((0, 0), (1, 0), (1, 1), (0, 1))) * (width, height),
annotations=annotations,
)
pat.shapes.append(rect)
elif isinstance(element, fatrec.Trapezoid):
vertices = numpy.array(((0, 0), (1, 0), (1, 1), (0, 1))) * (element.get_width(), element.get_height())
@ -357,13 +358,13 @@ def read(
vertices[2, 0] -= b
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
trapz = Polygon(layer=element.get_layer_tuple(),
pat.polygon(
layer=element.get_layer_tuple(),
offset=element.get_xy(),
repetition=repetition,
vertices=vertices,
annotations=annotations,
)
pat.shapes.append(trapz)
elif isinstance(element, fatrec.CTrapezoid):
cttype = element.get_ctrapezoid_type()
@ -412,22 +413,24 @@ def read(
vertices[0, 1] += width
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
ctrapz = Polygon(layer=element.get_layer_tuple(),
pat.polygon(
layer=element.get_layer_tuple(),
offset=element.get_xy(),
repetition=repetition,
vertices=vertices,
annotations=annotations,
)
pat.shapes.append(ctrapz)
elif isinstance(element, fatrec.Circle):
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
circle = Circle(layer=element.get_layer_tuple(),
layer = element.get_layer_tuple()
circle = Circle(
offset=element.get_xy(),
repetition=repetition,
annotations=annotations,
radius=float(element.get_radius()))
pat.shapes.append(circle)
radius=float(element.get_radius()),
)
pat.shapes[layer].append(circle)
elif isinstance(element, fatrec.Text):
annotations = properties_to_annotations(element.properties, lib.propnames, lib.propstrings)
@ -436,38 +439,30 @@ def read(
string = lib.textstrings[str_or_ref].string
else:
string = str_or_ref.string
label = Label(layer=element.get_layer_tuple(),
pat.label(
layer=element.get_layer_tuple(),
offset=element.get_xy(),
repetition=repetition,
annotations=annotations,
string=string)
pat.labels.append(label)
string=string,
)
else:
logger.warning(f'Skipping record {element} (unimplemented)')
continue
for placement in cell.placements:
pat.subpatterns.append(_placement_to_subpat(placement, lib))
target, ref = _placement_to_ref(placement, lib)
if isinstance(target, int):
target = lib.cellnames[target].nstring.string
pat.refs[target].append(ref)
if clean_vertices:
clean_pattern_vertices(pat)
patterns.append(pat)
mlib[cell_name] = pat
# Create a dict of {pattern.name: pattern, ...}, then fix up all subpattern.pattern entries
# according to the subpattern.identifier (which is deleted after use).
patterns_dict = dict(((p.name, p) for p in patterns))
for p in patterns_dict.values():
for sp in p.subpatterns:
ident = sp.identifier[0]
name = ident if isinstance(ident, str) else lib.cellnames[ident].nstring.string
sp.pattern = patterns_dict[name]
del sp.identifier
return patterns_dict, library_info
return mlib, library_info
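Read-side OASIS sketch (file name is illustrative; the info keys follow the code above):

from masque.file import oasis

mlib, info = oasis.readfile('in.oas')
print(info['units_per_micrometer'])
print(info.get('layer_map', {}))     # LayerName records keyed by layer name, when present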
def _mlayer2oas(mlayer: layer_t) -> Tuple[int, int]:
def _mlayer2oas(mlayer: layer_t) -> tuple[int, int]:
""" Helper to turn a layer tuple-or-int into a layer and datatype"""
if isinstance(mlayer, int):
layer = mlayer
@ -479,97 +474,102 @@ def _mlayer2oas(mlayer: layer_t) -> Tuple[int, int]:
else:
data_type = 0
else:
raise PatternError(f'Invalid layer for OASIS: {layer}. Note that OASIS layers cannot be '
raise PatternError(f'Invalid layer for OASIS: {mlayer}. Note that OASIS layers cannot be '
f'strings unless a layer map is provided.')
return layer, data_type
def _placement_to_subpat(placement: fatrec.Placement, lib: fatamorgana.OasisLayout) -> SubPattern:
def _placement_to_ref(placement: fatrec.Placement, lib: fatamorgana.OasisLayout) -> tuple[int | str, Ref]:
"""
Helper function to create a SubPattern from a placement. Sets subpat.pattern to None
and sets the instance .identifier to (struct_name,).
Helper function to create a Ref from a placement. Also returns the placement name (or id).
"""
assert(not isinstance(placement.repetition, fatamorgana.ReuseRepetition))
assert not isinstance(placement.repetition, fatamorgana.ReuseRepetition)
xy = numpy.array((placement.x, placement.y))
mag = placement.magnification if placement.magnification is not None else 1
pname = placement.get_name()
name = pname if isinstance(pname, int) else pname.string
name: int | str = pname if isinstance(pname, int) else pname.string # TODO deal with referenced names
annotations = properties_to_annotations(placement.properties, lib.propnames, lib.propstrings)
if placement.angle is None:
rotation = 0
else:
rotation = numpy.deg2rad(float(placement.angle))
subpat = SubPattern(offset=xy,
pattern=None,
mirrored=(placement.flip, False),
ref = Ref(
offset=xy,
mirrored=placement.flip,
rotation=rotation,
scale=float(mag),
identifier=(name,),
repetition=repetition_fata2masq(placement.repetition),
annotations=annotations)
return subpat
annotations=annotations,
)
return name, ref
def _subpatterns_to_placements(
subpatterns: List[SubPattern],
) -> List[fatrec.Placement]:
refs = []
for subpat in subpatterns:
if subpat.pattern is None:
def _refs_to_placements(
refs: dict[str | None, list[Ref]],
) -> list[fatrec.Placement]:
placements = []
for target, rseq in refs.items():
if target is None:
continue
for ref in rseq:
# Note: OASIS also mirrors first and rotates second
frep, rep_offset = repetition_masq2fata(ref.repetition)
# Note: OASIS mirrors first and rotates second
mirror_across_x, extra_angle = normalize_mirror(subpat.mirrored)
frep, rep_offset = repetition_masq2fata(subpat.repetition)
offset = numpy.round(subpat.offset + rep_offset).astype(int)
angle = numpy.rad2deg(subpat.rotation + extra_angle) % 360
ref = fatrec.Placement(
name=subpat.pattern.name,
flip=mirror_across_x,
offset = rint_cast(ref.offset + rep_offset)
angle = numpy.rad2deg(ref.rotation) % 360
placement = fatrec.Placement(
name=target,
flip=ref.mirrored,
angle=angle,
magnification=subpat.scale,
properties=annotations_to_properties(subpat.annotations),
magnification=ref.scale,
properties=annotations_to_properties(ref.annotations),
x=offset[0],
y=offset[1],
repetition=frep)
repetition=frep,
)
refs.append(ref)
return refs
placements.append(placement)
return placements
def _shapes_to_elements(
shapes: List[Shape],
layer2oas: Callable[[layer_t], Tuple[int, int]],
) -> List[Union[fatrec.Polygon, fatrec.Path, fatrec.Circle]]:
shapes: dict[layer_t, list[Shape]],
layer2oas: Callable[[layer_t], tuple[int, int]],
) -> list[fatrec.Polygon | fatrec.Path | fatrec.Circle]:
# Add a Polygon record for each shape, and Path elements if necessary
elements: List[Union[fatrec.Polygon, fatrec.Path, fatrec.Circle]] = []
for shape in shapes:
layer, datatype = layer2oas(shape.layer)
elements: list[fatrec.Polygon | fatrec.Path | fatrec.Circle] = []
for mlayer, sseq in shapes.items():
layer, datatype = layer2oas(mlayer)
for shape in sseq:
repetition, rep_offset = repetition_masq2fata(shape.repetition)
properties = annotations_to_properties(shape.annotations)
if isinstance(shape, Circle):
offset = numpy.round(shape.offset + rep_offset).astype(int)
radius = numpy.round(shape.radius).astype(int)
circle = fatrec.Circle(layer=layer,
offset = rint_cast(shape.offset + rep_offset)
radius = rint_cast(shape.radius)
circle = fatrec.Circle(
layer=layer,
datatype=datatype,
radius=radius,
radius=cast(int, radius),
x=offset[0],
y=offset[1],
properties=properties,
repetition=repetition)
repetition=repetition,
)
elements.append(circle)
elif isinstance(shape, Path):
xy = numpy.round(shape.offset + shape.vertices[0] + rep_offset).astype(int)
deltas = numpy.round(numpy.diff(shape.vertices, axis=0)).astype(int)
half_width = numpy.round(shape.width / 2).astype(int)
xy = rint_cast(shape.offset + shape.vertices[0] + rep_offset)
deltas = rint_cast(numpy.diff(shape.vertices, axis=0))
half_width = rint_cast(shape.width / 2)
path_type = next(k for k, v in path_cap_map.items() if v == shape.cap) # reverse lookup
extension_start = (path_type, shape.cap_extensions[0] if shape.cap_extensions is not None else None)
extension_end = (path_type, shape.cap_extensions[1] if shape.cap_extensions is not None else None)
path = fatrec.Path(layer=layer,
path = fatrec.Path(
layer=layer,
datatype=datatype,
point_list=deltas,
half_width=half_width,
point_list=cast(Sequence[Sequence[int]], deltas),
half_width=cast(int, half_width),
x=xy[0],
y=xy[1],
extension_start=extension_start, # TODO implement multiple cap types?
@ -580,81 +580,57 @@ def _shapes_to_elements(
elements.append(path)
else:
for polygon in shape.to_polygons():
xy = numpy.round(polygon.offset + polygon.vertices[0] + rep_offset).astype(int)
points = numpy.round(numpy.diff(polygon.vertices, axis=0)).astype(int)
elements.append(fatrec.Polygon(layer=layer,
xy = rint_cast(polygon.offset + polygon.vertices[0] + rep_offset)
points = rint_cast(numpy.diff(polygon.vertices, axis=0))
elements.append(fatrec.Polygon(
layer=layer,
datatype=datatype,
x=xy[0],
y=xy[1],
point_list=points,
point_list=cast(list[list[int]], points),
properties=properties,
repetition=repetition))
repetition=repetition,
))
return elements
def _labels_to_texts(
labels: List[Label],
layer2oas: Callable[[layer_t], Tuple[int, int]],
) -> List[fatrec.Text]:
labels: dict[layer_t, list[Label]],
layer2oas: Callable[[layer_t], tuple[int, int]],
) -> list[fatrec.Text]:
texts = []
for label in labels:
layer, datatype = layer2oas(label.layer)
for mlayer, lseq in labels.items():
layer, datatype = layer2oas(mlayer)
for label in lseq:
repetition, rep_offset = repetition_masq2fata(label.repetition)
xy = numpy.round(label.offset + rep_offset).astype(int)
xy = rint_cast(label.offset + rep_offset)
properties = annotations_to_properties(label.annotations)
texts.append(fatrec.Text(layer=layer,
texts.append(fatrec.Text(
layer=layer,
datatype=datatype,
x=xy[0],
y=xy[1],
string=label.string,
properties=properties,
repetition=repetition))
repetition=repetition,
))
return texts
def disambiguate_pattern_names(
patterns,
dup_warn_filter: Callable[[str], bool] = None, # If returns False, don't warn about this name
) -> None:
used_names = []
for pat in patterns:
sanitized_name = re.compile(r'[^A-Za-z0-9_\?\$]').sub('_', pat.name)
i = 0
suffixed_name = sanitized_name
while suffixed_name in used_names or suffixed_name == '':
suffix = base64.b64encode(struct.pack('>Q', i), b'$?').decode('ASCII')
suffixed_name = sanitized_name + '$' + suffix[:-1].lstrip('A')
i += 1
if sanitized_name == '':
logger.warning(f'Empty pattern name saved as "{suffixed_name}"')
elif suffixed_name != sanitized_name:
if dup_warn_filter is None or dup_warn_filter(pat.name):
logger.warning(f'Pattern name "{pat.name}" ({sanitized_name}) appears multiple times;\n'
+ f' renaming to "{suffixed_name}"')
if len(suffixed_name) == 0:
# Should never happen since zero-length names are replaced
raise PatternError(f'Zero-length name after sanitize+encode,\n originally "{pat.name}"')
pat.name = suffixed_name
used_names.append(suffixed_name)
def repetition_fata2masq(
rep: Union[fatamorgana.GridRepetition, fatamorgana.ArbitraryRepetition, None],
) -> Optional[Repetition]:
mrep: Optional[Repetition]
rep: fatamorgana.GridRepetition | fatamorgana.ArbitraryRepetition | None,
) -> Repetition | None:
mrep: Repetition | None
if isinstance(rep, fatamorgana.GridRepetition):
mrep = Grid(a_vector=rep.a_vector,
b_vector=rep.b_vector,
a_count=rep.a_count,
b_count=rep.b_count)
elif isinstance(rep, fatamorgana.ArbitraryRepetition):
displacements = numpy.cumsum(numpy.column_stack((rep.x_displacements,
rep.y_displacements)), axis=0)
displacements = numpy.cumsum(numpy.column_stack((
rep.x_displacements,
rep.y_displacements,
)), axis=0)
displacements = numpy.vstack(([0, 0], displacements))
mrep = Arbitrary(displacements)
elif rep is None:
@ -663,37 +639,37 @@ def repetition_fata2masq(
def repetition_masq2fata(
rep: Optional[Repetition],
) -> Tuple[Union[fatamorgana.GridRepetition,
fatamorgana.ArbitraryRepetition,
None],
Tuple[int, int]]:
frep: Union[fatamorgana.GridRepetition, fatamorgana.ArbitraryRepetition, None]
rep: Repetition | None,
) -> tuple[
fatamorgana.GridRepetition | fatamorgana.ArbitraryRepetition | None,
tuple[int, int]
]:
frep: fatamorgana.GridRepetition | fatamorgana.ArbitraryRepetition | None
if isinstance(rep, Grid):
a_vector = rint_cast(rep.a_vector)
b_vector = rint_cast(rep.b_vector) if rep.b_vector is not None else None
a_count = rint_cast(rep.a_count)
b_count = rint_cast(rep.b_count) if rep.b_count is not None else None
frep = fatamorgana.GridRepetition(
a_vector=a_vector,
b_vector=b_vector,
a_count=a_count,
b_count=b_count,
a_vector=cast(list[int], a_vector),
b_vector=cast(list[int] | None, b_vector),
a_count=cast(int, a_count),
b_count=cast(int | None, b_count),
)
offset = (0, 0)
elif isinstance(rep, Arbitrary):
diffs = numpy.diff(rep.displacements, axis=0)
diff_ints = rint_cast(diffs)
frep = fatamorgana.ArbitraryRepetition(diff_ints[:, 0], diff_ints[:, 1])
frep = fatamorgana.ArbitraryRepetition(diff_ints[:, 0], diff_ints[:, 1]) # type: ignore
offset = rep.displacements[0, :]
else:
assert(rep is None)
assert rep is None
frep = None
offset = (0, 0)
return frep, offset
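Round-trip sketch for the repetition converters above (values are illustrative):

from masque.repetition import Grid
from masque.file.oasis import repetition_masq2fata, repetition_fata2masq

grid = Grid(a_vector=(10, 0), b_vector=(0, 20), a_count=3, b_count=2)
frep, offset = repetition_masq2fata(grid)    # fatamorgana.GridRepetition, offset (0, 0)
back = repetition_fata2masq(frep)            # Grid equivalent to the original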
def annotations_to_properties(annotations: annotations_t) -> List[fatrec.Property]:
def annotations_to_properties(annotations: annotations_t) -> list[fatrec.Property]:
#TODO determine is_standard based on key?
properties = []
for key, values in annotations.items():
@ -704,24 +680,24 @@ def annotations_to_properties(annotations: annotations_t) -> List[fatrec.Propert
def properties_to_annotations(
properties: List[fatrec.Property],
propnames: Dict[int, NString],
propstrings: Dict[int, AString],
properties: list[fatrec.Property],
propnames: dict[int, NString],
propstrings: dict[int, AString],
) -> annotations_t:
annotations = {}
for proprec in properties:
assert(proprec.name is not None)
assert proprec.name is not None
if isinstance(proprec.name, int):
key = propnames[proprec.name].string
else:
key = proprec.name.string
values: List[Union[str, float, int]] = []
values: list[str | float | int] = []
assert(proprec.values is not None)
assert proprec.values is not None
for value in proprec.values:
if isinstance(value, (float, int)):
if isinstance(value, float | int):
values.append(value)
elif isinstance(value, (NString, AString)):
elif isinstance(value, NString | AString):
values.append(value.string)
elif isinstance(value, PropStringReference):
values.append(propstrings[value.ref].string) # dereference
@ -735,3 +711,25 @@ def properties_to_annotations(
properties = [fatrec.Property(key, vals, is_standard=False)
for key, vals in annotations.items()]
return properties
def check_valid_names(
names: Iterable[str],
) -> None:
"""
Check all provided names to see if they're valid OASIS cell names.
Args:
names: Collection of names to check
"""
allowed_chars = set(string.ascii_letters + string.digits + string.punctuation + ' ')
bad_chars = [
name for name in names
if not set(name).issubset(allowed_chars)
]
if bad_chars:
raise LibraryError('Names contain invalid characters:\n' + pformat(bad_chars))

View File

@ -1,580 +0,0 @@
"""
GDSII file format readers and writers using python-gdsii
Note that GDSII references follow the same convention as `masque`,
with this order of operations:
1. Mirroring
2. Rotation
3. Scaling
4. Offset and array expansion (no mirroring/rotation/scaling applied to offsets)
Scaling, rotation, and mirroring apply to individual instances, not grid
vectors or offsets.
Notes:
* absolute positioning is not supported
* PLEX is not supported
* ELFLAGS are not supported
* GDS does not support library- or structure-level annotations
"""
from typing import List, Any, Dict, Tuple, Callable, Union, Iterable, Optional
from typing import Sequence
import re
import io
import copy
import base64
import struct
import logging
import pathlib
import gzip
import numpy
from numpy.typing import NDArray, ArrayLike
# python-gdsii
import gdsii.library #type: ignore
import gdsii.structure #type: ignore
import gdsii.elements #type: ignore
from .utils import clean_pattern_vertices, is_gzipped
from .. import Pattern, SubPattern, PatternError, Label, Shape
from ..shapes import Polygon, Path
from ..repetition import Grid
from ..utils import get_bit, set_bit, layer_t, normalize_mirror, annotations_t
logger = logging.getLogger(__name__)
path_cap_map = {
None: Path.Cap.Flush,
0: Path.Cap.Flush,
1: Path.Cap.Circle,
2: Path.Cap.Square,
4: Path.Cap.SquareCustom,
}
def rint_cast(val: ArrayLike) -> NDArray[numpy.int32]:
return numpy.rint(val, dtype=numpy.int32, casting='unsafe')
def build(
patterns: Union[Pattern, Sequence[Pattern]],
meters_per_unit: float,
logical_units_per_unit: float = 1,
library_name: str = 'masque-gdsii-write',
*,
modify_originals: bool = False,
disambiguate_func: Callable[[Iterable[Pattern]], None] = None,
) -> gdsii.library.Library:
"""
Convert a `Pattern` or list of patterns to a GDSII stream, by first calling
`.polygonize()` to change the shapes into polygons, and then writing patterns
as GDSII structures, polygons as boundary elements, and subpatterns as structure
references (sref).
For each shape,
layer is chosen to be equal to `shape.layer` if it is an int,
or `shape.layer[0]` if it is a tuple
datatype is chosen to be `shape.layer[1]` if available,
otherwise `0`
It is often a good idea to run `pattern.subpatternize()` prior to calling this function,
especially if calling `.polygonize()` will result in very many vertices.
If you want pattern polygonized with non-default arguments, just call `pattern.polygonize()`
prior to calling this function.
Args:
patterns: A Pattern or list of patterns to convert.
meters_per_unit: Written into the GDSII file, meters per (database) length unit.
All distances are assumed to be an integer multiple of this unit, and are stored as such.
logical_units_per_unit: Written into the GDSII file. Allows the GDSII to specify a
"logical" unit which is different from the "database" unit, for display purposes.
Default `1`.
library_name: Library name written into the GDSII file.
Default 'masque-gdsii-write'.
modify_originals: If `True`, the original pattern is modified as part of the writing
process. Otherwise, a copy is made and `deepunlock()`-ed.
Default `False`.
disambiguate_func: Function which takes a list of patterns and alters them
to make their names valid and unique. Default is `disambiguate_pattern_names`, which
attempts to adhere to the GDSII standard as well as possible.
WARNING: No additional error checking is performed on the results.
Returns:
`gdsii.library.Library`
"""
if isinstance(patterns, Pattern):
patterns = [patterns]
if disambiguate_func is None:
disambiguate_func = disambiguate_pattern_names # type: ignore
assert(disambiguate_func is not None) # placate mypy
if not modify_originals:
patterns = [p.deepunlock() for p in copy.deepcopy(patterns)]
patterns = [p.wrap_repeated_shapes() for p in patterns]
# Create library
lib = gdsii.library.Library(version=600,
name=library_name.encode('ASCII'),
logical_unit=logical_units_per_unit,
physical_unit=meters_per_unit)
# Get a dict of id(pattern) -> pattern
patterns_by_id = {id(pattern): pattern for pattern in patterns}
for pattern in patterns:
for i, p in pattern.referenced_patterns_by_id().items():
patterns_by_id[i] = p
disambiguate_func(patterns_by_id.values())
# Now create a structure for each pattern, and add in any Boundary and SREF elements
for pat in patterns_by_id.values():
structure = gdsii.structure.Structure(name=pat.name.encode('ASCII'))
lib.append(structure)
structure += _shapes_to_elements(pat.shapes)
structure += _labels_to_texts(pat.labels)
structure += _subpatterns_to_refs(pat.subpatterns)
return lib
def write(
patterns: Union[Pattern, Sequence[Pattern]],
stream: io.BufferedIOBase,
*args,
**kwargs,
) -> None:
"""
Write a `Pattern` or list of patterns to a GDSII file.
See `masque.file.gdsii.build()` for details.
Args:
patterns: A Pattern or list of patterns to write to file.
stream: Stream to write to.
*args: passed to `masque.file.gdsii.build()`
**kwargs: passed to `masque.file.gdsii.build()`
"""
lib = build(patterns, *args, **kwargs)
lib.save(stream)
return
def writefile(
patterns: Union[Sequence[Pattern], Pattern],
filename: Union[str, pathlib.Path],
*args,
**kwargs,
) -> None:
"""
Wrapper for `masque.file.gdsii.write()` that takes a filename or path instead of a stream.
Will automatically compress the file if it has a .gz suffix.
Args:
patterns: `Pattern` or list of patterns to save
filename: Filename to save to.
*args: passed to `masque.file.gdsii.write`
**kwargs: passed to `masque.file.gdsii.write`
"""
path = pathlib.Path(filename)
if path.suffix == '.gz':
open_func: Callable = gzip.open
else:
open_func = open
with io.BufferedWriter(open_func(path, mode='wb')) as stream:
write(patterns, stream, *args, **kwargs)
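# Usage sketch for writefile(); the filenames are placeholders. Compression is chosen
# purely by suffix, so the same call produces gzipped output for '.gz' names.
writefile(my_pattern, 'chip.gds', meters_per_unit=1e-9)      # plain GDSII
writefile(my_pattern, 'chip.gds.gz', meters_per_unit=1e-9)   # gzip-compressed GDSII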
def readfile(
filename: Union[str, pathlib.Path],
*args,
**kwargs,
) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
"""
Wrapper for `masque.file.gdsii.read()` that takes a filename or path instead of a stream.
Will automatically decompress gzipped files.
Args:
filename: Filename to read from.
*args: passed to `masque.file.gdsii.read`
**kwargs: passed to `masque.file.gdsii.read`
"""
path = pathlib.Path(filename)
if is_gzipped(path):
open_func: Callable = gzip.open
else:
open_func = open
with io.BufferedReader(open_func(path, mode='rb')) as stream:
results = read(stream, *args, **kwargs)
return results
def read(
stream: io.BufferedIOBase,
clean_vertices: bool = True,
) -> Tuple[Dict[str, Pattern], Dict[str, Any]]:
"""
Read a gdsii file and translate it into a dict of Pattern objects. GDSII structures are
translated into Pattern objects; boundaries are translated into polygons, and srefs and arefs
are translated into SubPattern objects.
Additional library info is returned in a dict, containing:
'name': name of the library
'meters_per_unit': number of meters per database unit (all values are in database units)
'logical_units_per_unit': number of "logical" units displayed by layout tools (typically microns)
per database unit
Args:
stream: Stream to read from.
clean_vertices: If `True`, remove any redundant vertices when loading polygons.
The cleaning process removes any polygons with zero area or <3 vertices.
Default `True`.
Returns:
- Dict of pattern_name:Patterns generated from GDSII structures
- Dict of GDSII library info
"""
lib = gdsii.library.Library.load(stream)
library_info = {'name': lib.name.decode('ASCII'),
'meters_per_unit': lib.physical_unit,
'logical_units_per_unit': lib.logical_unit,
}
raw_mode = True # Whether to construct shapes in raw mode (less error checking)
patterns = []
for structure in lib:
pat = Pattern(name=structure.name.decode('ASCII'))
for element in structure:
# Switch based on element type:
if isinstance(element, gdsii.elements.Boundary):
poly = _boundary_to_polygon(element, raw_mode)
pat.shapes.append(poly)
elif isinstance(element, gdsii.elements.Path):
path = _gpath_to_mpath(element, raw_mode)
pat.shapes.append(path)
elif isinstance(element, gdsii.elements.Text):
label = Label(offset=element.xy.astype(float),
layer=(element.layer, element.text_type),
string=element.string.decode('ASCII'))
pat.labels.append(label)
elif isinstance(element, (gdsii.elements.SRef, gdsii.elements.ARef)):
pat.subpatterns.append(_ref_to_subpat(element))
if clean_vertices:
clean_pattern_vertices(pat)
patterns.append(pat)
# Create a dict of {pattern.name: pattern, ...}, then fix up all subpattern.pattern entries
# according to the subpattern.identifier (which is deleted after use).
patterns_dict = dict(((p.name, p) for p in patterns))
for p in patterns_dict.values():
for sp in p.subpatterns:
sp.pattern = patterns_dict[sp.identifier[0].decode('ASCII')]
del sp.identifier
return patterns_dict, library_info
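# Usage sketch for readfile()/read(); 'chip.gds.gz' and 'TOP' are placeholder names.
patterns, library_info = readfile('chip.gds.gz')     # gzip is detected automatically
top = patterns['TOP']                                # patterns are keyed by structure name
print(library_info['meters_per_unit'], library_info['logical_units_per_unit'])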
def _mlayer2gds(mlayer: layer_t) -> Tuple[int, int]:
""" Helper to turn a layer tuple-or-int into a layer and datatype"""
if isinstance(mlayer, int):
layer = mlayer
data_type = 0
elif isinstance(mlayer, tuple):
layer = mlayer[0]
if len(mlayer) > 1:
data_type = mlayer[1]
else:
data_type = 0
else:
raise PatternError(f'Invalid layer for gdsii: {mlayer}. Note that gdsii layers cannot be strings.')
return layer, data_type
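# Worked examples for _mlayer2gds(); the values follow directly from the branches above.
assert _mlayer2gds(5) == (5, 0)          # bare int: datatype defaults to 0
assert _mlayer2gds((5, 2)) == (5, 2)     # (layer, datatype) tuple is passed through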
def _ref_to_subpat(
element: Union[gdsii.elements.SRef,
gdsii.elements.ARef]
) -> SubPattern:
"""
Helper function to create a SubPattern from an SREF or AREF. Sets subpat.pattern to None
and sets the instance .identifier to (struct_name,).
NOTE: "Absolute" means not affected by parent elements.
That's not currently supported by masque at all (and not planned).
"""
rotation = 0.0
offset = numpy.array(element.xy[0], dtype=float)
scale = 1.0
mirror_across_x = False
repetition = None
if element.strans is not None:
if element.mag is not None:
scale = element.mag
# Bit 13 means absolute scale
if get_bit(element.strans, 15 - 13):
raise PatternError('Absolute scale is not implemented in masque!')
if element.angle is not None:
rotation = numpy.deg2rad(element.angle)
# Bit 14 means absolute rotation
if get_bit(element.strans, 15 - 14):
raise PatternError('Absolute rotation is not implemented in masque!')
# Bit 0 means mirror x-axis
if get_bit(element.strans, 15 - 0):
mirror_across_x = True
if isinstance(element, gdsii.elements.ARef):
a_count = element.cols
b_count = element.rows
a_vector = (element.xy[1] - offset) / a_count
b_vector = (element.xy[2] - offset) / b_count
repetition = Grid(a_vector=a_vector, b_vector=b_vector,
a_count=a_count, b_count=b_count)
subpat = SubPattern(pattern=None,
offset=offset,
rotation=rotation,
scale=scale,
mirrored=(mirror_across_x, False),
annotations=_properties_to_annotations(element.properties),
repetition=repetition)
subpat.identifier = (element.struct_name,)
return subpat
def _gpath_to_mpath(element: gdsii.elements.Path, raw_mode: bool) -> Path:
if element.path_type in path_cap_map:
cap = path_cap_map[element.path_type]
else:
raise PatternError(f'Unrecognized path type: {element.path_type}')
args = {'vertices': element.xy.astype(float),
'layer': (element.layer, element.data_type),
'width': element.width if element.width is not None else 0.0,
'cap': cap,
'offset': numpy.zeros(2),
'annotations': _properties_to_annotations(element.properties),
'raw': raw_mode,
}
if cap == Path.Cap.SquareCustom:
args['cap_extensions'] = numpy.zeros(2)
if element.bgn_extn is not None:
args['cap_extensions'][0] = element.bgn_extn
if element.end_extn is not None:
args['cap_extensions'][1] = element.end_extn
return Path(**args)
def _boundary_to_polygon(element: gdsii.elements.Boundary, raw_mode: bool) -> Polygon:
args = {'vertices': element.xy[:-1].astype(float),
'layer': (element.layer, element.data_type),
'offset': numpy.zeros(2),
'annotations': _properties_to_annotations(element.properties),
'raw': raw_mode,
}
return Polygon(**args)
def _subpatterns_to_refs(
subpatterns: List[SubPattern],
) -> List[Union[gdsii.elements.ARef, gdsii.elements.SRef]]:
refs = []
for subpat in subpatterns:
if subpat.pattern is None:
continue
encoded_name = subpat.pattern.name.encode('ASCII')
# Note: GDS mirrors first and rotates second
mirror_across_x, extra_angle = normalize_mirror(subpat.mirrored)
rep = subpat.repetition
new_refs: List[Union[gdsii.elements.SRef, gdsii.elements.ARef]]
ref: Union[gdsii.elements.SRef, gdsii.elements.ARef]
if isinstance(rep, Grid):
b_vector = rep.b_vector if rep.b_vector is not None else numpy.zeros(2)
b_count = rep.b_count if rep.b_count is not None else 1
xy: NDArray[numpy.float64] = numpy.array(subpat.offset) + [
[0, 0],
rep.a_vector * rep.a_count,
b_vector * b_count,
]
ref = gdsii.elements.ARef(
struct_name=encoded_name,
xy=rint_cast(xy),
cols=rint_cast(rep.a_count),
rows=rint_cast(rep.b_count),
)
new_refs = [ref]
elif rep is None:
ref = gdsii.elements.SRef(
struct_name=encoded_name,
xy=rint_cast([subpat.offset]),
)
new_refs = [ref]
else:
new_refs = [gdsii.elements.SRef(
struct_name=encoded_name,
xy=rint_cast([subpat.offset + dd]),
)
for dd in rep.displacements]
for ref in new_refs:
ref.angle = numpy.rad2deg(subpat.rotation + extra_angle) % 360
# strans must be non-None for angle and mag to take effect
ref.strans = set_bit(0, 15 - 0, mirror_across_x)
ref.mag = subpat.scale
ref.properties = _annotations_to_properties(subpat.annotations, 512)
refs += new_refs
return refs
def _properties_to_annotations(properties: List[Tuple[int, bytes]]) -> annotations_t:
return {str(k): [v.decode()] for k, v in properties}
def _annotations_to_properties(annotations: annotations_t, max_len: int = 126) -> List[Tuple[int, bytes]]:
cum_len = 0
props = []
for key, vals in annotations.items():
try:
i = int(key)
except ValueError:
raise PatternError(f'Annotation key {key} is not convertible to an integer')
if not (0 < i < 126):
raise PatternError(f'Annotation key {key} converts to {i} (must be in the range [1,125])')
val_strings = ' '.join(str(val) for val in vals)
b = val_strings.encode()
if len(b) > 126:
raise PatternError(f'Annotation value {b!r} is longer than 126 characters!')
cum_len += numpy.ceil(len(b) / 2) * 2 + 2
if cum_len > max_len:
raise PatternError(f'Sum of annotation data will be longer than {max_len} bytes! Generated bytes were {b!r}')
props.append((i, b))
return props
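# Round-trip sketch for the annotation/property helpers above. Multiple values are
# joined with spaces on the way out, so the round trip is not exact.
props = _annotations_to_properties({'2': ['foo', 'bar']})    # -> [(2, b'foo bar')]
annos = _properties_to_annotations(props)                    # -> {'2': ['foo bar']}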
def _shapes_to_elements(
shapes: List[Shape],
polygonize_paths: bool = False,
) -> List[Union[gdsii.elements.Boundary, gdsii.elements.Path]]:
elements: List[Union[gdsii.elements.Boundary, gdsii.elements.Path]] = []
# Add a Boundary element for each shape, and Path elements if necessary
for shape in shapes:
layer, data_type = _mlayer2gds(shape.layer)
properties = _annotations_to_properties(shape.annotations, 128)
if isinstance(shape, Path) and not polygonize_paths:
xy = rint_cast(shape.vertices + shape.offset)
width = rint_cast(shape.width)
path_type = next(k for k, v in path_cap_map.items() if v == shape.cap) # reverse lookup
path = gdsii.elements.Path(layer=layer,
data_type=data_type,
xy=xy)
path.path_type = path_type
path.width = width
path.properties = properties
elements.append(path)
else:
for polygon in shape.to_polygons():
xy_closed = numpy.empty((polygon.vertices.shape[0] + 1, 2), dtype=numpy.int32)
numpy.rint(polygon.vertices + polygon.offset, out=xy_closed[:-1], casting='unsafe')
xy_closed[-1] = xy_closed[0]
boundary = gdsii.elements.Boundary(
layer=layer,
data_type=data_type,
xy=xy_closed,
)
boundary.properties = properties
elements.append(boundary)
return elements
def _labels_to_texts(labels: List[Label]) -> List[gdsii.elements.Text]:
texts = []
for label in labels:
properties = _annotations_to_properties(label.annotations, 128)
layer, text_type = _mlayer2gds(label.layer)
xy = rint_cast([label.offset])
text = gdsii.elements.Text(
layer=layer,
text_type=text_type,
xy=xy,
string=label.string.encode('ASCII'),
)
text.properties = properties
texts.append(text)
return texts
def disambiguate_pattern_names(
patterns: Sequence[Pattern],
max_name_length: int = 32,
suffix_length: int = 6,
dup_warn_filter: Optional[Callable[[str], bool]] = None,
) -> None:
"""
Args:
patterns: List of patterns to disambiguate
max_name_length: Names longer than this will be truncated
suffix_length: Names which get truncated are truncated by this many extra characters. This is to
leave room for a suffix if one is necessary.
dup_warn_filter: (optional) Function for suppressing warnings about cell names changing. Receives
the cell name and returns `False` if the warning should be suppressed and `True` if it should
be displayed. Default displays all warnings.
"""
used_names = []
for pat in set(patterns):
# Shorten names which already exceed max-length
if len(pat.name) > max_name_length:
shortened_name = pat.name[:max_name_length - suffix_length]
logger.warning(f'Pattern name "{pat.name}" is too long ({len(pat.name)}/{max_name_length} chars),\n'
+ f' shortening to "{shortened_name}" before generating suffix')
else:
shortened_name = pat.name
# Remove invalid characters
sanitized_name = re.compile(r'[^A-Za-z0-9_\?\$]').sub('_', shortened_name)
# Add a suffix that makes the name unique
i = 0
suffixed_name = sanitized_name
while suffixed_name in used_names or suffixed_name == '':
suffix = base64.b64encode(struct.pack('>Q', i), b'$?').decode('ASCII')
suffixed_name = sanitized_name + '$' + suffix[:-1].lstrip('A')
i += 1
if sanitized_name == '':
logger.warning(f'Empty pattern name saved as "{suffixed_name}"')
elif suffixed_name != sanitized_name:
if dup_warn_filter is None or dup_warn_filter(pat.name):
logger.warning(f'Pattern name "{pat.name}" ({sanitized_name}) appears multiple times;\n'
+ f' renaming to "{suffixed_name}"')
# Encode into a byte-string and perform some final checks
encoded_name = suffixed_name.encode('ASCII')
if len(encoded_name) == 0:
# Should never happen since zero-length names are replaced
raise PatternError(f'Zero-length name after sanitize+encode,\n originally "{pat.name}"')
if len(encoded_name) > max_name_length:
raise PatternError(f'Pattern name "{encoded_name!r}" length > {max_name_length} after encode,\n'
+ f' originally "{pat.name}"')
pat.name = suffixed_name
used_names.append(suffixed_name)
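# Illustration of the suffix scheme above (hand-checked, not taken from a test suite):
# duplicates of a name become 'name$', 'name$E', 'name$I', ... as `i` increments.
suffix1 = base64.b64encode(struct.pack('>Q', 1), b'$?').decode('ASCII')
assert suffix1[:-1].lstrip('A') == 'E'    # i=1 appends '$E' to the sanitized name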


@@ -1,7 +1,7 @@
"""
SVG file format readers and writers
"""
from typing import Dict, Optional
from collections.abc import Mapping
import warnings
import numpy
@@ -13,22 +13,23 @@ from .. import Pattern
def writefile(
pattern: Pattern,
library: Mapping[str, Pattern],
top: str,
filename: str,
custom_attributes: bool = False,
) -> None:
"""
Write a Pattern to an SVG file, by first calling .polygonize() on it
to change the shapes into polygons, and then writing patterns as SVG
groups (<g>, inside <defs>), polygons as paths (<path>), and subpatterns
groups (<g>, inside <defs>), polygons as paths (<path>), and refs
as <use> elements.
Note that this function modifies the Pattern.
If `custom_attributes` is `True`, non-standard `pattern_layer` and `pattern_dose` attributes
are written to the relevant elements.
If `custom_attributes` is `True`, a non-standard `pattern_layer` attribute
is written to the relevant elements.
It is often a good idea to run `pattern.subpatternize()` on pattern prior to
It is often a good idea to run `pattern.dedup()` on pattern prior to
calling this function, especially if calling `.polygonize()` will result in very
many vertices.
@@ -38,17 +39,18 @@ def writefile(
Args:
pattern: Pattern to write to file. Modified by this function.
filename: Filename to write to.
custom_attributes: Whether to write non-standard `pattern_layer` and
`pattern_dose` attributes to the SVG elements.
custom_attributes: Whether to write non-standard `pattern_layer` attribute to the
SVG elements.
"""
pattern = library[top]
# Polygonize pattern
pattern.polygonize()
bounds = pattern.get_bounds()
bounds = pattern.get_bounds(library=library)
if bounds is None:
bounds_min, bounds_max = numpy.array([[-1, -1], [1, 1]])
warnings.warn('Pattern had no bounds (empty?); setting arbitrary viewbox')
warnings.warn('Pattern had no bounds (empty?); setting arbitrary viewbox', stacklevel=1)
else:
bounds_min, bounds_max = bounds
@@ -59,42 +61,39 @@ def writefile(
svg = svgwrite.Drawing(filename, profile='full', viewBox=viewbox_string,
debug=(not custom_attributes))
# Get a dict of id(pattern) -> pattern
patterns_by_id = {**(pattern.referenced_patterns_by_id()), id(pattern): pattern} # type: Dict[int, Optional[Pattern]]
# Now create a group for each pattern and add in any Boundary and Use elements
for name, pat in library.items():
svg_group = svg.g(id=mangle_name(name), fill='blue', stroke='red')
# Now create a group for each row in sd_table (ie, each pattern + dose combination)
# and add in any Boundary and Use elements
for pat in patterns_by_id.values():
if pat is None:
continue
svg_group = svg.g(id=mangle_name(pat), fill='blue', stroke='red')
for shape in pat.shapes:
for layer, shapes in pat.shapes.items():
for shape in shapes:
for polygon in shape.to_polygons():
path_spec = poly2path(polygon.vertices + polygon.offset)
path = svg.path(d=path_spec)
if custom_attributes:
path['pattern_layer'] = polygon.layer
path['pattern_dose'] = polygon.dose
path['pattern_layer'] = layer
svg_group.add(path)
for subpat in pat.subpatterns:
if subpat.pattern is None:
for target, refs in pat.refs.items():
if target is None:
continue
transform = f'scale({subpat.scale:g}) rotate({subpat.rotation:g}) translate({subpat.offset[0]:g},{subpat.offset[1]:g})'
use = svg.use(href='#' + mangle_name(subpat.pattern), transform=transform)
if custom_attributes:
use['pattern_dose'] = subpat.dose
for ref in refs:
transform = f'scale({ref.scale:g}) rotate({ref.rotation:g}) translate({ref.offset[0]:g},{ref.offset[1]:g})'
use = svg.use(href='#' + mangle_name(target), transform=transform)
svg_group.add(use)
svg.defs.add(svg_group)
svg.add(svg.use(href='#' + mangle_name(pattern)))
svg.add(svg.use(href='#' + mangle_name(top)))
svg.save()
def writefile_inverted(pattern: Pattern, filename: str):
def writefile_inverted(
library: Mapping[str, Pattern],
top: str,
filename: str,
) -> None:
"""
Write an inverted Pattern to an SVG file, by first calling `.polygonize()` and
`.flatten()` on it to change the shapes into polygons, then drawing a bounding
@@ -110,13 +109,15 @@ def writefile_inverted(pattern: Pattern, filename: str):
pattern: Pattern to write to file. Modified by this function.
filename: Filename to write to.
"""
# Polygonize and flatten pattern
pattern.polygonize().flatten()
pattern = library[top]
bounds = pattern.get_bounds()
# Polygonize and flatten pattern
pattern.polygonize().flatten(library)
bounds = pattern.get_bounds(library=library)
if bounds is None:
bounds_min, bounds_max = numpy.array([[-1, -1], [1, 1]])
warnings.warn('Pattern had no bounds (empty?); setting arbitrary viewbox')
warnings.warn('Pattern had no bounds (empty?); setting arbitrary viewbox', stacklevel=1)
else:
bounds_min, bounds_max = bounds
@@ -134,7 +135,8 @@ def writefile_inverted(pattern: Pattern, filename: str):
path_spec = poly2path(slab_edge)
# Draw polygons with reversed vertex order
for shape in pattern.shapes:
for _layer, shapes in pattern.shapes.items():
for shape in shapes:
for polygon in shape.to_polygons():
path_spec += poly2path(polygon.vertices[::-1] + polygon.offset)
@@ -152,9 +154,9 @@ def poly2path(vertices: ArrayLike) -> str:
Returns:
SVG path-string.
"""
verts = numpy.array(vertices, copy=False)
commands = 'M{:g},{:g} '.format(verts[0][0], verts[0][1])
verts = numpy.asarray(vertices)
commands = 'M{:g},{:g} '.format(verts[0][0], verts[0][1]) # noqa: UP032
for vertex in verts[1:]:
commands += 'L{:g},{:g}'.format(vertex[0], vertex[1])
commands += 'L{:g},{:g}'.format(vertex[0], vertex[1]) # noqa: UP032
commands += ' Z '
return commands


@@ -1,29 +1,105 @@
"""
Helper functions for file reading and writing
"""
from typing import Set, Tuple, List
from typing import IO
from collections.abc import Iterator, Mapping
import re
import copy
import pathlib
import logging
import tempfile
import shutil
from collections import defaultdict
from contextlib import contextmanager
from pprint import pformat
from itertools import chain
from .. import Pattern, PatternError
from .. import Pattern, PatternError, Library, LibraryError
from ..shapes import Polygon, Path
def mangle_name(pattern: Pattern, dose_multiplier: float = 1.0) -> str:
logger = logging.getLogger(__name__)
def preflight(
lib: Library,
sort: bool = True,
sort_elements: bool = False,
allow_dangling_refs: bool | None = None,
allow_named_layers: bool = True,
prune_empty_patterns: bool = False,
wrap_repeated_shapes: bool = False,
) -> Library:
"""
Create a name using `pattern.name`, `id(pattern)`, and the dose multiplier.
Run a standard set of useful operations and checks, usually done immediately prior
to writing to a file (or immediately after reading).
Args:
pattern: Pattern whose name we want to mangle.
dose_multiplier: Dose multiplier to mangle with.
sort: Whether to sort the patterns based on their names, and optionally sort the pattern contents.
Default True. Useful for reproducible builds.
sort_elements: Whether to sort the pattern contents. Requires sort=True to run.
allow_dangling_refs: If `None` (default), warns about any refs to patterns that are not
in the provided library. If `True`, no check is performed; if `False`, a `LibraryError`
is raised instead.
allow_named_layers: If `False`, raises a `PatternError` if any layer is referred to by
a string instead of a number (or tuple).
prune_empty_patterns: Runs `Library.prune_empty()`, recursively deleting any empty patterns.
wrap_repeated_shapes: Runs `Library.wrap_repeated_shapes()`, turning repeated shapes into
repeated refs containing non-repeated shapes.
Returns:
`lib` or an equivalent sorted library
"""
if sort:
lib = Library(dict(sorted(
(nn, pp.sort(sort_elements=sort_elements)) for nn, pp in lib.items()
)))
if not allow_dangling_refs:
refs = lib.referenced_patterns()
dangling = refs - set(lib.keys())
if dangling:
msg = 'Dangling refs found: ' + pformat(dangling)
if allow_dangling_refs is None:
logger.warning(msg)
else:
raise LibraryError(msg)
if not allow_named_layers:
named_layers: Mapping[str, set] = defaultdict(set)
for name, pat in lib.items():
for layer in chain(pat.shapes.keys(), pat.labels.keys()):
if isinstance(layer, str):
named_layers[name].add(layer)
named_layers = dict(named_layers)
if named_layers:
raise PatternError('Non-numeric layers found:' + pformat(named_layers))
if prune_empty_patterns:
pruned = lib.prune_empty()
if pruned:
logger.info(f'Preflight pruned {len(pruned)} empty patterns')
logger.debug('Pruned: ' + pformat(pruned))
else:
logger.debug('Preflight found no empty patterns')
if wrap_repeated_shapes:
lib.wrap_repeated_shapes()
return lib
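# Usage sketch for preflight(); `lib` is an assumed, already-populated Library.
lib = preflight(lib, prune_empty_patterns=True, wrap_repeated_shapes=True)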
def mangle_name(name: str) -> str:
"""
Sanitize a name.
Args:
name: Name we want to mangle.
Returns:
Mangled name.
"""
expression = re.compile(r'[^A-Za-z0-9_\?\$]')
full_name = '{}_{}_{}'.format(pattern.name, dose_multiplier, id(pattern))
sanitized_name = expression.sub('_', full_name)
sanitized_name = expression.sub('_', name)
return sanitized_name
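# Worked example for mangle_name(): disallowed characters become underscores.
assert mangle_name('ring resonator #2') == 'ring_resonator__2'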
@@ -38,149 +114,39 @@ def clean_pattern_vertices(pat: Pattern) -> Pattern:
Returns:
pat
"""
for shapes in pat.shapes.values():
remove_inds = []
for ii, shape in enumerate(pat.shapes):
if not isinstance(shape, (Polygon, Path)):
for ii, shape in enumerate(shapes):
if not isinstance(shape, Polygon | Path):
continue
try:
shape.clean_vertices()
except PatternError:
remove_inds.append(ii)
for ii in sorted(remove_inds, reverse=True):
del pat.shapes[ii]
del shapes[ii]
return pat
def make_dose_table(patterns: List[Pattern], dose_multiplier: float = 1.0) -> Set[Tuple[int, float]]:
"""
Create a set containing `(id(pat), written_dose)` for each pattern (including subpatterns)
Args:
patterns: Source Patterns.
dose_multiplier: Multiplier for all written_dose entries.
Returns:
`{(id(subpat.pattern), written_dose), ...}`
"""
dose_table = {(id(pattern), dose_multiplier) for pattern in patterns}
for pattern in patterns:
for subpat in pattern.subpatterns:
if subpat.pattern is None:
continue
subpat_dose_entry = (id(subpat.pattern), subpat.dose * dose_multiplier)
if subpat_dose_entry not in dose_table:
subpat_dose_table = make_dose_table([subpat.pattern], subpat.dose * dose_multiplier)
dose_table = dose_table.union(subpat_dose_table)
return dose_table
def dtype2dose(pattern: Pattern) -> Pattern:
"""
For each shape in the pattern, if the layer is a tuple, set the
layer to the tuple's first element and set the dose to the
tuple's second element.
Generally intended for use with `Pattern.apply()`.
Args:
pattern: Pattern to modify
Returns:
pattern
"""
for shape in pattern.shapes:
if isinstance(shape.layer, tuple):
shape.dose = shape.layer[1]
shape.layer = shape.layer[0]
return pattern
def dose2dtype(
patterns: List[Pattern],
) -> Tuple[List[Pattern], List[float]]:
"""
For each shape in each pattern, set shape.layer to the tuple
(base_layer, datatype), where:
layer is chosen to be equal to the original shape.layer if it is an int,
or shape.layer[0] if it is a tuple. `str` layers raise a PatternError.
datatype is chosen arbitrarily, based on calculated dose for each shape.
Shapes with equal calculated dose will have the same datatype.
A list of doses is returned, providing a mapping between datatype
(list index) and dose (list entry).
Note that this function modifies the input Pattern(s).
Args:
patterns: A `Pattern` or list of patterns to write to file. Modified by this function.
Returns:
(patterns, dose_list)
patterns: modified input patterns
dose_list: A list of doses, providing a mapping between datatype (int, list index)
and dose (float, list entry).
"""
# Get a dict of id(pattern) -> pattern
patterns_by_id = {id(pattern): pattern for pattern in patterns}
for pattern in patterns:
for i, p in pattern.referenced_patterns_by_id().items():
patterns_by_id[i] = p
# Get a table of (id(pat), written_dose) for each pattern and subpattern
sd_table = make_dose_table(patterns)
# Figure out all the unique doses necessary to write this pattern
# This means going through each row in sd_table and adding the dose values needed to write
# that subpattern at that dose level
dose_vals = set()
for pat_id, pat_dose in sd_table:
pat = patterns_by_id[pat_id]
for shape in pat.shapes:
dose_vals.add(shape.dose * pat_dose)
if len(dose_vals) > 256:
raise PatternError('Too many dose values: {}, maximum 256 when using dtypes.'.format(len(dose_vals)))
dose_vals_list = list(dose_vals)
# Create a new pattern for each non-1-dose entry in the dose table
# and update the shapes to reflect their new dose
new_pats = {} # (id, dose) -> new_pattern mapping
for pat_id, pat_dose in sd_table:
if pat_dose == 1:
new_pats[(pat_id, pat_dose)] = patterns_by_id[pat_id]
continue
old_pat = patterns_by_id[pat_id]
pat = old_pat.copy() # keep old subpatterns
pat.shapes = copy.deepcopy(old_pat.shapes)
pat.labels = copy.deepcopy(old_pat.labels)
encoded_name = mangle_name(pat, pat_dose)
if len(encoded_name) == 0:
raise PatternError('Zero-length name after mangle+encode, originally "{}"'.format(pat.name))
pat.name = encoded_name
for shape in pat.shapes:
data_type = dose_vals_list.index(shape.dose * pat_dose)
if isinstance(shape.layer, int):
shape.layer = (shape.layer, data_type)
elif isinstance(shape.layer, tuple):
shape.layer = (shape.layer[0], data_type)
else:
raise PatternError(f'Invalid layer for gdsii: {shape.layer}')
new_pats[(pat_id, pat_dose)] = pat
# Go back through all the dose-specific patterns and fix up their subpattern entries
for (pat_id, pat_dose), pat in new_pats.items():
for subpat in pat.subpatterns:
dose_mult = subpat.dose * pat_dose
subpat.pattern = new_pats[(id(subpat.pattern), dose_mult)]
return patterns, dose_vals_list
def is_gzipped(path: pathlib.Path) -> bool:
with open(path, 'rb') as stream:
with path.open('rb') as stream:
magic_bytes = stream.read(2)
return magic_bytes == b'\x1f\x8b'
@contextmanager
def tmpfile(path: str | pathlib.Path) -> Iterator[IO[bytes]]:
"""
Context manager which allows you to write to a temporary file,
and move that file into its final location only after the write
has finished.
"""
path = pathlib.Path(path)
suffixes = ''.join(path.suffixes)
with tempfile.NamedTemporaryFile(suffix=suffixes, delete=False) as tmp_stream:
yield tmp_stream
try:
shutil.move(tmp_stream.name, path)
finally:
pathlib.Path(tmp_stream.name).unlink(missing_ok=True)
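# Usage sketch for tmpfile(); 'layout.gds.gz' is a placeholder path. The temporary
# file is only moved into place after the block exits without raising.
with tmpfile('layout.gds.gz') as tmp:
    tmp.write(b'...')    # write the (placeholder) file contents here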


@@ -1,31 +1,30 @@
from typing import Tuple, Dict, Optional, TypeVar
from typing import Self, Any
import copy
import functools
import numpy
from numpy.typing import ArrayLike, NDArray
from .repetition import Repetition
from .utils import rotation_matrix_2d, layer_t, AutoSlots, annotations_t
from .traits import PositionableImpl, LayerableImpl, Copyable, Pivotable, LockableImpl, RepeatableImpl
from .utils import rotation_matrix_2d, annotations_t, annotations_eq, annotations_lt, rep2key
from .traits import PositionableImpl, Copyable, Pivotable, RepeatableImpl, Bounded
from .traits import AnnotatableImpl
L = TypeVar('L', bound='Label')
class Label(PositionableImpl, LayerableImpl, LockableImpl, RepeatableImpl, AnnotatableImpl,
Pivotable, Copyable, metaclass=AutoSlots):
@functools.total_ordering
class Label(PositionableImpl, RepeatableImpl, AnnotatableImpl, Bounded, Pivotable, Copyable):
"""
A text annotation with a position and layer (but no size; it is not drawn)
A text annotation with a position (but no size; it is not drawn)
"""
__slots__ = ( '_string', 'identifier')
__slots__ = (
'_string',
# Inherited
'_offset', '_repetition', '_annotations',
)
_string: str
""" Label string """
identifier: Tuple
""" Arbitrary identifier tuple, useful for keeping track of history when flattening """
'''
---- Properties
'''
@@ -46,38 +45,45 @@ class Label(PositionableImpl, LayerableImpl, LockableImpl, RepeatableImpl, Annot
string: str,
*,
offset: ArrayLike = (0.0, 0.0),
layer: layer_t = 0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
identifier: Tuple = (),
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
) -> None:
LockableImpl.unlock(self)
self.identifier = identifier
self.string = string
self.offset = numpy.array(offset, dtype=float, copy=True)
self.layer = layer
self.offset = numpy.array(offset, dtype=float)
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.set_locked(locked)
def __copy__(self: L) -> L:
return type(self)(string=self.string,
def __copy__(self) -> Self:
return type(self)(
string=self.string,
offset=self.offset.copy(),
layer=self.layer,
repetition=self.repetition,
locked=self.locked,
identifier=self.identifier)
)
def __deepcopy__(self: L, memo: Dict = None) -> L:
def __deepcopy__(self, memo: dict | None = None) -> Self:
memo = {} if memo is None else memo
new = copy.copy(self)
LockableImpl.unlock(new)
new._offset = self._offset.copy()
new.set_locked(self.locked)
return new
def rotate_around(self: L, pivot: ArrayLike, rotation: float) -> L:
def __lt__(self, other: 'Label') -> bool:
if self.string != other.string:
return self.string < other.string
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def __eq__(self, other: Any) -> bool:
return (
self.string == other.string
and numpy.array_equal(self.offset, other.offset)
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self:
"""
Rotate the label around a point.
@@ -88,13 +94,13 @@ class Label(PositionableImpl, LayerableImpl, LockableImpl, RepeatableImpl, Annot
Returns:
self
"""
pivot = numpy.array(pivot, dtype=float)
pivot = numpy.asarray(pivot, dtype=float)
self.translate(-pivot)
self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset)
self.translate(+pivot)
return self
def get_bounds(self) -> NDArray[numpy.float64]:
def get_bounds_single(self) -> NDArray[numpy.float64]:
"""
Return the bounds of the label.
@@ -106,17 +112,3 @@ class Label(PositionableImpl, LayerableImpl, LockableImpl, RepeatableImpl, Annot
Bounds [[xmin, xmax], [ymin, ymax]]
"""
return numpy.array([self.offset, self.offset])
def lock(self: L) -> L:
PositionableImpl._lock(self)
LockableImpl.lock(self)
return self
def unlock(self: L) -> L:
LockableImpl.unlock(self)
PositionableImpl._unlock(self)
return self
def __repr__(self) -> str:
locked = ' L' if self.locked else ''
return f'<Label "{self.string}" l{self.layer} o{self.offset}{locked}>'
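# Ordering sketch for the updated Label above: labels sort by string first, then offset.
labels = sorted([Label('waveguide', offset=(5, 0)), Label('anchor', offset=(0, 0))])
assert labels[0].string == 'anchor'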

masque/library.py (new file, 1381 lines)

File diff suppressed because it is too large


@@ -1,2 +0,0 @@
from .library import Library, PatternGenerator
from .device_library import DeviceLibrary, LibDeviceLibrary


@@ -1,298 +0,0 @@
"""
DeviceLibrary class for managing unique name->device mappings and
deferred loading or creation.
"""
from typing import Dict, Callable, TypeVar, TYPE_CHECKING
from typing import Any, Tuple, Union, Iterator
import logging
from pprint import pformat
from ..error import DeviceLibraryError
from ..library import Library
from ..builder import Device
from .. import Pattern
logger = logging.getLogger(__name__)
D = TypeVar('D', bound='DeviceLibrary')
L = TypeVar('L', bound='LibDeviceLibrary')
class DeviceLibrary:
"""
This class maps names to functions which generate or load the
relevant `Device` object.
This class largely functions the same way as `Library`, but
operates on `Device`s rather than `Patterns` and thus has no
need for distinctions between primary/secondary devices (as
there is no inter-`Device` hierarchy).
Each device is cached the first time it is used. The cache can
be disabled by setting the `enable_cache` attribute to `False`.
"""
generators: Dict[str, Callable[[], Device]]
cache: Dict[Union[str, Tuple[str, str]], Device]
enable_cache: bool = True
def __init__(self) -> None:
self.generators = {}
self.cache = {}
def __setitem__(self, key: str, value: Callable[[], Device]) -> None:
self.generators[key] = value
if key in self.cache:
del self.cache[key]
def __delitem__(self, key: str) -> None:
del self.generators[key]
if key in self.cache:
del self.cache[key]
def __getitem__(self, key: str) -> Device:
if self.enable_cache and key in self.cache:
logger.debug(f'found {key} in cache')
return self.cache[key]
logger.debug(f'loading {key}')
dev = self.generators[key]()
self.cache[key] = dev
return dev
def __iter__(self) -> Iterator[str]:
return iter(self.keys())
def __contains__(self, key: str) -> bool:
return key in self.generators
def keys(self) -> Iterator[str]:
return iter(self.generators.keys())
def values(self) -> Iterator[Device]:
return iter(self[key] for key in self.keys())
def items(self) -> Iterator[Tuple[str, Device]]:
return iter((key, self[key]) for key in self.keys())
def __repr__(self) -> str:
return '<DeviceLibrary with keys ' + repr(list(self.generators.keys())) + '>'
def set_const(self, const: Device) -> None:
"""
Convenience function to avoid having to manually wrap
already-generated Device objects into callables.
Args:
const: Pre-generated device object
"""
self.generators[const.pattern.name] = lambda: const
def add(
self: D,
other: D,
use_ours: Callable[[str], bool] = lambda name: False,
use_theirs: Callable[[str], bool] = lambda name: False,
) -> D:
"""
Add keys from another library into this one.
There must be no conflicting keys.
Args:
other: The library to insert keys from
use_ours: Decision function for name conflicts. Will be called with duplicate cell names.
Should return `True` if the value from `self` should be used.
use_theirs: Decision function for name conflicts. Same format as `use_ours`.
Should return `True` if the value from `other` should be used.
`use_ours` takes priority over `use_theirs`.
Returns:
self
"""
duplicates = set(self.keys()) & set(other.keys())
keep_ours = set(name for name in duplicates if use_ours(name))
keep_theirs = set(name for name in duplicates - keep_ours if use_theirs(name))
conflicts = duplicates - keep_ours - keep_theirs
if conflicts:
raise DeviceLibraryError('Duplicate keys encountered in DeviceLibrary merge: '
+ pformat(conflicts))
for name in set(other.generators.keys()) - keep_ours:
self.generators[name] = other.generators[name]
if name in other.cache:
self.cache[name] = other.cache[name]
return self
def clear_cache(self: D) -> D:
"""
Clear the cache of this library.
This is usually used before modifying or deleting cells, e.g. when merging
with another library.
Returns:
self
"""
self.cache = {}
return self
def add_device(
self,
name: str,
fn: Callable[[], Device],
dev2pat: Callable[[Device], Pattern],
prefix: str = '',
) -> None:
"""
Convenience function for adding a device to the library.
- The device is generated with the provided `fn()`
- Port info is written to the pattern using the provided dev2pat
- The pattern is renamed to match the provided `prefix + name`
- If `prefix` is non-empty, a wrapped copy is also added, named
`name` (no prefix). See `wrap_device()` for details.
Adding devices with this function helps to
- Make sure Pattern names are reflective of what the devices are named
- Ensure port info is written into the `Pattern`, so that the `Device`
can be reconstituted from the layout.
- Simplify adding a prefix to all device names, to make it easier to
track their provenance and purpose, while also allowing for
generic device names which can later be swapped out with different
underlying implementations.
Args:
name: Base name for the device. If a prefix is used, this is the
"generic" name (e.g. "L3_cavity" vs "2022_02_02_L3_cavity").
fn: Function which is called to generate the device.
dev2pat: Post-processing function which is called to add the port
info into the device's pattern.
prefix: If present, the actual device is named `prefix + name`, and
a second device with name `name` is also added (containing only
this one).
"""
def build_dev() -> Device:
dev = fn()
dev.pattern = dev2pat(dev)
dev.pattern.rename(prefix + name)
return dev
self[prefix + name] = build_dev
if prefix:
self.wrap_device(name, prefix + name)
def wrap_device(
self,
name: str,
old_name: str,
) -> None:
"""
Create a new device which simply contains an instance of an already-existing device.
This is useful for assigning an alternate name to a device, while still keeping
the original name available for traceability.
Args:
name: Name for the wrapped device.
old_name: Name of the existing device to wrap.
"""
def build_wrapped_dev() -> Device:
old_dev = self[old_name]
wrapper = Pattern(name=name)
wrapper.addsp(old_dev.pattern)
return Device(wrapper, old_dev.ports)
self[name] = build_wrapped_dev
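# Usage sketch for add_device(); make_l3_cavity() and ports_to_pattern() are hypothetical
# user functions (a Device generator and a dev2pat implementation, respectively).
dev_lib = DeviceLibrary()
dev_lib.add_device(name='L3_cavity', fn=make_l3_cavity, dev2pat=ports_to_pattern,
                   prefix='2022_02_02_')
cavity = dev_lib['L3_cavity']    # unprefixed alias; '2022_02_02_L3_cavity' also exists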
class LibDeviceLibrary(DeviceLibrary):
"""
Extends `DeviceLibrary`, enabling it to ingest `Library` objects
(e.g. obtained by loading a GDS file).
Each `Library` object must be accompanied by a `pat2dev` function,
which takes in the `Pattern` and returns a full `Device` (including
port info). This is usually accomplished by scanning the `Pattern` for
port-related geometry, but could also bake in external info.
`Library` objects are ingested into `underlying`, which is a
`Library` which is kept in sync with the `DeviceLibrary` when
devices are removed (or new libraries added via `add_library()`).
"""
underlying: Library
def __init__(self) -> None:
DeviceLibrary.__init__(self)
self.underlying = Library()
def __setitem__(self, key: str, value: Callable[[], Device]) -> None:
self.generators[key] = value
if key in self.cache:
del self.cache[key]
# If any `Library` that has been (or will be) added has an entry for `key`,
# it will be added to `self.underlying` and then returned by it during subpattern
# resolution for other entries, and will conflict with the name for our
# wrapped device. To avoid that, we need to set ourselves as the "true" source of
# the `Pattern` named `key`.
if key in self.underlying:
raise DeviceLibraryError(f'Device name {key} already exists in underlying Library!'
' Demote or delete it first.')
# NOTE that this means the `Device` may be cached without the `Pattern` being in
# the `underlying` cache yet!
self.underlying.set_value(key, '__DeviceLibrary', lambda: self[key].pattern)
def __delitem__(self, key: str) -> None:
DeviceLibrary.__delitem__(self, key)
if key in self.underlying:
del self.underlying[key]
def add_library(
self: L,
lib: Library,
pat2dev: Callable[[Pattern], Device],
use_ours: Callable[[Union[str, Tuple[str, str]]], bool] = lambda name: False,
use_theirs: Callable[[Union[str, Tuple[str, str]]], bool] = lambda name: False,
) -> L:
"""
Add a pattern `Library` into this `LibDeviceLibrary`.
This requires a `pat2dev` function which can transform each `Pattern`
into a `Device`. For example, this can be accomplished by scanning
the `Pattern` data for port location info or by looking up port info
based on the pattern name or other characteristics in a hardcoded or
user-supplied dictionary.
Args:
lib: Pattern library to add.
pat2dev: Function for transforming each `Pattern` object from `lib`
into a `Device` which will be returned by this device library.
use_ours: Decision function for name conflicts. Will be called with
duplicate cell names, and (name, tag) tuples from the underlying library.
Should return `True` if the value from `self` should be used.
use_theirs: Decision function for name conflicts. Same format as `use_ours`.
Should return `True` if the value from `other` should be used.
`use_ours` takes priority over `use_theirs`.
Returns:
self
"""
duplicates = set(lib.keys()) & set(self.keys())
keep_ours = set(name for name in duplicates if use_ours(name))
keep_theirs = set(name for name in duplicates - keep_ours if use_theirs(name))
bad_duplicates = duplicates - keep_ours - keep_theirs
if bad_duplicates:
raise DeviceLibraryError('Duplicate devices (no action specified): ' + pformat(bad_duplicates))
# No 'bad' duplicates, so all duplicates should be overwritten
for name in keep_theirs:
self.underlying.demote(name)
self.underlying.add(lib, use_ours, use_theirs)
for name in lib:
self.generators[name] = lambda name=name: pat2dev(self.underlying[name])
return self
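# Usage sketch for add_library(); `gds_lib` is an assumed Library (e.g. read from GDS)
# and pat2dev_from_labels() is a hypothetical Pattern -> Device converter.
dev_lib = LibDeviceLibrary()
dev_lib.add_library(gds_lib, pat2dev=pat2dev_from_labels)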


@@ -1,355 +0,0 @@
"""
Library class for managing unique name->pattern mappings and
deferred loading or creation.
"""
from typing import Dict, Callable, TypeVar, TYPE_CHECKING
from typing import Any, Tuple, Union, Iterator
import logging
from pprint import pformat
from dataclasses import dataclass
import copy
from ..error import LibraryError
if TYPE_CHECKING:
from ..pattern import Pattern
logger = logging.getLogger(__name__)
@dataclass
class PatternGenerator:
__slots__ = ('tag', 'gen')
tag: str
""" Unique identifier for the source """
gen: Callable[[], 'Pattern']
""" Function which generates a pattern when called """
L = TypeVar('L', bound='Library')
class Library:
"""
This class is usually used to create a library of Patterns by mapping names to
functions which generate or load the relevant `Pattern` object as-needed.
Generated/loaded patterns can have "symbolic" references, where a SubPattern
object `sp` has a `None`-valued `sp.pattern` attribute, in which case the
Library expects `sp.identifier[0]` to contain a string which specifies the
referenced pattern's name.
Patterns can either be "primary" (default) or "secondary". Both get the
same deferred-load behavior, but "secondary" patterns may have conflicting
names and are not accessible through basic []-indexing. They are only used
to fill symbolic references in cases where there is no "primary" pattern
available, and only if both the referencing and referenced pattern-generators'
`tag` values match (i.e., only if they came from the same source).
Primary patterns can be turned into secondary patterns with the `demote`
method, `promote` performs the reverse (secondary -> primary) operation.
The `set_const` and `set_value` methods provide an easy way to transparently
construct PatternGenerator objects and directly create "secondary"
patterns.
The cache can be disabled by setting the `enable_cache` attribute to `False`.
"""
primary: Dict[str, PatternGenerator]
secondary: Dict[Tuple[str, str], PatternGenerator]
cache: Dict[Union[str, Tuple[str, str]], 'Pattern']
enable_cache: bool = True
def __init__(self) -> None:
self.primary = {}
self.secondary = {}
self.cache = {}
def __setitem__(self, key: str, value: PatternGenerator) -> None:
self.primary[key] = value
if key in self.cache:
logger.warning(f'Replaced library item "{key}" & existing cache entry.'
' Previously-generated Pattern will *not* be updated!')
del self.cache[key]
def __delitem__(self, key: str) -> None:
if isinstance(key, str):
del self.primary[key]
elif isinstance(key, tuple):
del self.secondary[key]
if key in self.cache:
logger.warning(f'Deleting library item "{key}" & existing cache entry.'
' Previously-generated Pattern may remain in the wild!')
del self.cache[key]
def __getitem__(self, key: str) -> 'Pattern':
return self.get_primary(key)
def __iter__(self) -> Iterator[str]:
return iter(self.keys())
def __contains__(self, key: str) -> bool:
return key in self.primary
def get_primary(self, key: str) -> 'Pattern':
if self.enable_cache and key in self.cache:
logger.debug(f'found {key} in cache')
return self.cache[key]
logger.debug(f'loading {key}')
pg = self.primary[key]
pat = pg.gen()
self.resolve_subpatterns(pat, pg.tag)
self.cache[key] = pat
return pat
def get_secondary(self, key: str, tag: str) -> 'Pattern':
logger.debug(f'get_secondary({key}, {tag})')
key2 = (key, tag)
if self.enable_cache and key2 in self.cache:
return self.cache[key2]
pg = self.secondary[key2]
pat = pg.gen()
self.resolve_subpatterns(pat, pg.tag)
self.cache[key2] = pat
return pat
def set_secondary(self, key: str, tag: str, value: PatternGenerator) -> None:
self.secondary[(key, tag)] = value
if (key, tag) in self.cache:
logger.warning(f'Replaced library item "{key}" & existing cache entry.'
' Previously-generated Pattern will *not* be updated!')
del self.cache[(key, tag)]
def resolve_subpatterns(self, pat: 'Pattern', tag: str) -> 'Pattern':
logger.debug(f'Resolving subpatterns in {pat.name}')
for sp in pat.subpatterns:
if sp.pattern is not None:
continue
key = sp.identifier[0]
if key in self.primary:
sp.pattern = self.get_primary(key)
continue
if (key, tag) in self.secondary:
sp.pattern = self.get_secondary(key, tag)
continue
raise LibraryError(f'Broken reference to {key} (tag {tag})')
return pat
def keys(self) -> Iterator[str]:
return iter(self.primary.keys())
def values(self) -> Iterator['Pattern']:
return iter(self[key] for key in self.keys())
def items(self) -> Iterator[Tuple[str, 'Pattern']]:
return iter((key, self[key]) for key in self.keys())
def __repr__(self) -> str:
return '<Library with keys ' + repr(list(self.primary.keys())) + '>'
def set_const(
self,
key: str,
tag: Any,
const: 'Pattern',
secondary: bool = False,
) -> None:
"""
Convenience function to avoid having to manually wrap
constant values into callables.
Args:
key: Lookup key, usually the cell/pattern name
tag: Unique tag for the source, used to disambiguate secondary patterns
const: Pattern object to return
secondary: If True, this pattern is not accessible for normal lookup, and is
only used as a sub-component of other patterns if no non-secondary
equivalent is available.
"""
pg = PatternGenerator(tag=tag, gen=lambda: const)
if secondary:
self.secondary[(key, tag)] = pg
else:
self.primary[key] = pg
def set_value(
self,
key: str,
tag: str,
value: Callable[[], 'Pattern'],
secondary: bool = False,
) -> None:
"""
Convenience function to automatically build a PatternGenerator.
Args:
key: Lookup key, usually the cell/pattern name
tag: Unique tag for the source, used to disambiguate secondary patterns
value: Callable which takes no arguments and generates the `Pattern` object
secondary: If True, this pattern is not accessible for normal lookup, and is
only used as a sub-component of other patterns if no non-secondary
equivalent is available.
"""
pg = PatternGenerator(tag=tag, gen=value)
if secondary:
self.secondary[(key, tag)] = pg
else:
self.primary[key] = pg
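# Usage sketch for set_value()/set_const(); make_ring() and frame_pattern are hypothetical
# (a Pattern-returning callable and a pre-built Pattern, respectively).
lib = Library()
lib.set_value('ring', tag='generators', value=make_ring)     # generated on first access
lib.set_const('frame', tag='chip.gds', const=frame_pattern, secondary=True)
ring = lib['ring']                                            # built, cached, and returned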
def precache(self: L) -> L:
"""
Force all patterns into the cache
Returns:
self
"""
for key in self.primary:
_ = self.get_primary(key)
for key2 in self.secondary:
_ = self.get_secondary(*key2)
return self
def add(
self: L,
other: L,
use_ours: Callable[[Union[str, Tuple[str, str]]], bool] = lambda name: False,
use_theirs: Callable[[Union[str, Tuple[str, str]]], bool] = lambda name: False,
) -> L:
"""
Add keys from another library into this one.
Args:
other: The library to insert keys from
use_ours: Decision function for name conflicts.
May be called with cell names and (name, tag) tuples for primary or
secondary cells, respectively.
Should return `True` if the value from `self` should be used.
use_theirs: Decision function for name conflicts. Same format as `use_ours`.
Should return `True` if the value from `other` should be used.
`use_ours` takes priority over `use_theirs`.
Returns:
self
"""
duplicates1 = set(self.primary.keys()) & set(other.primary.keys())
duplicates2 = set(self.secondary.keys()) & set(other.secondary.keys())
keep_ours1 = set(name for name in duplicates1 if use_ours(name))
keep_ours2 = set(name for name in duplicates2 if use_ours(name))
keep_theirs1 = set(name for name in duplicates1 - keep_ours1 if use_theirs(name))
keep_theirs2 = set(name for name in duplicates2 - keep_ours2 if use_theirs(name))
conflicts1 = duplicates1 - keep_ours1 - keep_theirs1
conflicts2 = duplicates2 - keep_ours2 - keep_theirs2
if conflicts1:
raise LibraryError('Unresolved duplicate keys encountered in library merge: ' + pformat(conflicts1))
if conflicts2:
raise LibraryError('Unresolved duplicate secondary keys encountered in library merge: ' + pformat(conflicts2))
for key1 in set(other.primary.keys()) - keep_ours1:
self[key1] = other.primary[key1]
if key1 in other.cache:
self.cache[key1] = other.cache[key1]
for key2 in set(other.secondary.keys()) - keep_ours2:
self.set_secondary(*key2, other.secondary[key2])
if key2 in other.cache:
self.cache[key2] = other.cache[key2]
return self
def demote(self, key: str) -> None:
"""
Turn a primary pattern into a secondary one.
It will no longer be accessible through [] indexing and will only be used
when referenced by other patterns from the same source, and only if no primary
pattern with the same name exists.
Args:
key: Lookup key, usually the cell/pattern name
"""
pg = self.primary[key]
key2 = (key, pg.tag)
self.secondary[key2] = pg
if key in self.cache:
self.cache[key2] = self.cache[key]
del self[key]
def promote(self, key: str, tag: str) -> None:
"""
Turn a secondary pattern into a primary one.
It will become accessible through [] indexing and will be used to satisfy any
reference to a pattern with its key, regardless of tag.
Args:
key: Lookup key, usually the cell/pattern name
tag: Unique tag for identifying the pattern's source, used to disambiguate
secondary patterns
"""
if key in self.primary:
raise LibraryError(f'Promoting ({key}, {tag}), but {key} already exists in primary!')
key2 = (key, tag)
pg = self.secondary[key2]
self.primary[key] = pg
if key2 in self.cache:
self.cache[key] = self.cache[key2]
del self.secondary[key2]
del self.cache[key2]
def copy(self, preserve_cache: bool = False) -> 'Library':
"""
Create a copy of this `Library`.
A shallow copy is made of the contained dicts.
Note that you should probably clear the cache (with `clear_cache()`) after copying.
Returns:
A copy of self
"""
new = Library()
new.primary.update(self.primary)
new.secondary.update(self.secondary)
new.cache.update(self.cache)
return new
def clear_cache(self: L) -> L:
"""
Clear the cache of this library.
This is usually used before modifying or deleting cells, e.g. when merging
with another library.
Returns:
self
"""
self.cache = {}
return self
r"""
# Add a filter for names which aren't added
- Registration:
- scanned files (tag=filename, gen_fn[stream, {name: pos}])
- generator functions (tag='fn?', gen_fn[params])
- merge decision function (based on tag and cell name, can be "neither") ??? neither=keep both, load using same tag!
- Load process:
- file:
- read single cell
- check subpat identifiers, and load stuff recursively based on those. If not present, load from same file??
- function:
- generate cell
- traverse and check if we should load any subcells from elsewhere. replace if so.
* should fn generate subcells at all, or register those separately and have us control flow? maybe ask us and generate itself if not present?
- Scan all GDS files, save name -> (file, position). Keep the streams handy.
- Merge all names. This requires subcell merge because we don't know hierarchy.
- possibly include a "neither" option during merge, to deal with subcells. Means: just use parent's file.
"""

File diff suppressed because it is too large

masque/ports.py (new file, 539 lines)

@@ -0,0 +1,539 @@
from typing import overload, Self, NoReturn, Any
from collections.abc import Iterable, KeysView, ValuesView, Mapping
import warnings
import traceback
import logging
import functools
from collections import Counter
from abc import ABCMeta, abstractmethod
from itertools import chain
import numpy
from numpy import pi
from numpy.typing import ArrayLike, NDArray
from .traits import PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable
from .utils import rotate_offsets_around
from .error import PortError
logger = logging.getLogger(__name__)
@functools.total_ordering
class Port(PositionableImpl, Rotatable, PivotableImpl, Copyable, Mirrorable):
"""
A point at which a `Device` can be snapped to another `Device`.
Each port has an `offset` ((x, y) position) and may also have a
`rotation` (orientation) and a `ptype` (port type).
The `rotation` is an angle, in radians, measured counterclockwise
from the +x axis, pointing inwards into the device which owns the port.
The rotation may be set to `None`, indicating that any orientation is
allowed (e.g. for a DC electrical port). It is stored modulo 2pi.
The `ptype` is an arbitrary string, default of `unk` (unknown).
"""
__slots__ = (
'ptype', '_rotation',
# inherited:
'_offset',
)
_rotation: float | None
""" radians counterclockwise from +x, pointing into device body.
Can be `None` to signify undirected port """
ptype: str
""" Port types must match to be plugged together if both are non-zero """
def __init__(
self,
offset: ArrayLike,
rotation: float | None,
ptype: str = 'unk',
) -> None:
self.offset = offset
self.rotation = rotation
self.ptype = ptype
@property
def rotation(self) -> float | None:
""" Rotation, radians counterclockwise, pointing into device body. Can be None. """
return self._rotation
@rotation.setter
def rotation(self, val: float | None) -> None:
if val is None:
self._rotation = None
else:
if not numpy.size(val) == 1:
raise PortError('Rotation must be a scalar')
self._rotation = val % (2 * pi)
@property
def x(self) -> float:
""" Alias for offset[0] """
return self.offset[0]
@x.setter
def x(self, val: float) -> None:
self.offset[0] = val
@property
def y(self) -> float:
""" Alias for offset[1] """
return self.offset[1]
@y.setter
def y(self, val: float) -> None:
self.offset[1] = val
def copy(self) -> Self:
return self.deepcopy()
def get_bounds(self) -> NDArray[numpy.float64]:
return numpy.vstack((self.offset, self.offset))
def set_ptype(self, ptype: str) -> Self:
""" Chainable setter for `ptype` """
self.ptype = ptype
return self
def mirror(self, axis: int = 0) -> Self:
self.offset[1 - axis] *= -1
if self.rotation is not None:
self.rotation *= -1
self.rotation += axis * pi
return self
def rotate(self, rotation: float) -> Self:
if self.rotation is not None:
self.rotation += rotation
return self
def set_rotation(self, rotation: float | None) -> Self:
self.rotation = rotation
return self
def __repr__(self) -> str:
if self.rotation is None:
rot = 'any'
else:
rot = str(numpy.rad2deg(self.rotation))
return f'<{self.offset}, {rot}, [{self.ptype}]>'
def __lt__(self, other: 'Port') -> bool:
if self.ptype != other.ptype:
return self.ptype < other.ptype
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.rotation != other.rotation:
if self.rotation is None:
return True
if other.rotation is None:
return False
return self.rotation < other.rotation
return False
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and self.ptype == other.ptype
and numpy.array_equal(self.offset, other.offset)
and self.rotation == other.rotation
)
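# Behavior sketch for Port.mirror() and rotation storage, following the code above.
p = Port((10.5, 0.0), rotation=pi / 2, ptype='optical')
p.mirror(axis=0)                                # y -> -y; rotation -> -rotation + axis*pi
assert abs(p.rotation - 3 * pi / 2) < 1e-12     # stored modulo 2*pi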
class PortList(metaclass=ABCMeta):
__slots__ = () # Allow subclasses to use __slots__
@property
@abstractmethod
def ports(self) -> dict[str, Port]:
""" Uniquely-named ports which can be used to snap to other Device instances"""
pass
@ports.setter
@abstractmethod
def ports(self, value: dict[str, Port]) -> None:
pass
@overload
def __getitem__(self, key: str) -> Port:
pass
@overload
def __getitem__(self, key: list[str] | tuple[str, ...] | KeysView[str] | ValuesView[str]) -> dict[str, Port]:
pass
def __getitem__(self, key: str | Iterable[str]) -> Port | dict[str, Port]:
"""
For convenience, ports can be read out using square brackets:
- `pattern['A'] == Port((0, 0), 0)`
- ```
pattern[['A', 'B']] == {
'A': Port((0, 0), 0),
'B': Port((0, 0), pi),
}
```
"""
if isinstance(key, str):
return self.ports[key]
else: # noqa: RET505
return {k: self.ports[k] for k in key}
def __contains__(self, key: str) -> NoReturn:
raise NotImplementedError('PortList.__contains__ is left unimplemented. Use `key in container.ports` instead.')
# NOTE: Didn't add keys(), items(), values(), __contains__(), etc.
# because it's weird on stuff like Pattern that contains other lists
# and because you can just grab .ports and use that instead
def mkport(
self,
name: str,
value: Port,
) -> Self:
"""
Create a port, raising a `PortError` if a port with the same name already exists.
Args:
name: Name for the port. A port with this name should not already exist.
value: The `Port` object to which `name` will refer.
Returns:
self
Raises:
`PortError` if the name already exists.
"""
if name in self.ports:
raise PortError(f'Port {name} already exists.')
assert name not in self.ports
self.ports[name] = value
return self
def rename_ports(
self,
mapping: dict[str, str | None],
overwrite: bool = False,
) -> Self:
"""
Renames ports as specified by `mapping`.
Ports can be explicitly deleted by mapping them to `None`.
Args:
mapping: dict of `{'old_name': 'new_name'}` pairs. Names can be mapped
to `None` to perform an explicit deletion. `'new_name'` can also
overwrite an existing non-renamed port to implicitly delete it if
`overwrite` is set to `True`.
overwrite: Allows implicit deletion of ports if set to `True`; see `mapping`.
Returns:
self
"""
if not overwrite:
duplicates = (set(self.ports.keys()) - set(mapping.keys())) & set(mapping.values())
if duplicates:
raise PortError(f'Unrenamed ports would be overwritten: {duplicates}')
renamed = {vv: self.ports.pop(kk) for kk, vv in mapping.items()}
if None in renamed:
del renamed[None]
self.ports.update(renamed) # type: ignore
return self
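A short sketch of `rename_ports()`, assuming `Pattern` (which provides the `ports` dict required by `PortList`) and `Port` are importable from the top-level `masque` package; port names are hypothetical.

```
from numpy import pi
from masque import Pattern, Port   # assumed imports; Pattern implements PortList

pat = Pattern()
pat.ports['in'] = Port((0, 0), rotation=0)
pat.ports['out'] = Port((100, 0), rotation=pi)
pat.ports['scrap'] = Port((0, 50), rotation=pi / 2)

# 'in' -> 'A', 'out' -> 'B', and 'scrap' is deleted by mapping it to None
pat.rename_ports({'in': 'A', 'out': 'B', 'scrap': None})
print(sorted(pat.ports))    # ['A', 'B']
```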
def add_port_pair(
self,
offset: ArrayLike = (0, 0),
rotation: float = 0.0,
names: tuple[str, str] = ('A', 'B'),
ptype: str = 'unk',
) -> Self:
"""
Add a pair of ports with opposing directions at the specified location.
Args:
offset: Location at which to add the ports
rotation: Orientation of the first port. Radians, counterclockwise.
Default 0.
names: Names for the two ports. Default 'A' and 'B'
ptype: Sets the port type for both ports.
Returns:
self
"""
new_ports = {
names[0]: Port(offset, rotation=rotation, ptype=ptype),
names[1]: Port(offset, rotation=rotation + pi, ptype=ptype),
}
self.check_ports(names)
self.ports.update(new_ports)
return self
def plugged(
self,
connections: dict[str, str],
) -> Self:
"""
Verify that the ports specified by `connections` are coincident and have opposing
rotations, then remove the ports.
This is used when ports have been "manually" aligned as part of some other routing,
but for whatever reason were not eliminated via `plug()`.
Args:
connections: Pairs of ports which "plug" each other (same offset, opposing directions)
Returns:
self
Raises:
`PortError` if the ports are not properly aligned.
"""
a_names, b_names = list(zip(*connections.items(), strict=True))
a_ports = [self.ports[pp] for pp in a_names]
b_ports = [self.ports[pp] for pp in b_names]
a_types = [pp.ptype for pp in a_ports]
b_types = [pp.ptype for pp in b_ports]
type_conflicts = numpy.array([at != bt and 'unk' not in (at, bt)
for at, bt in zip(a_types, b_types, strict=True)])
if type_conflicts.any():
msg = 'Ports have conflicting types:\n'
for nn, (k, v) in enumerate(connections.items()):
if type_conflicts[nn]:
msg += f'{k} | {a_types[nn]}:{b_types[nn]} | {v}\n'
msg = ''.join(traceback.format_stack()) + '\n' + msg
warnings.warn(msg, stacklevel=2)
a_offsets = numpy.array([pp.offset for pp in a_ports])
b_offsets = numpy.array([pp.offset for pp in b_ports])
a_rotations = numpy.array([pp.rotation if pp.rotation is not None else 0 for pp in a_ports])
b_rotations = numpy.array([pp.rotation if pp.rotation is not None else 0 for pp in b_ports])
a_has_rot = numpy.array([pp.rotation is not None for pp in a_ports], dtype=bool)
b_has_rot = numpy.array([pp.rotation is not None for pp in b_ports], dtype=bool)
has_rot = a_has_rot & b_has_rot
if has_rot.any():
rotations = numpy.mod(a_rotations - b_rotations - pi, 2 * pi)
rotations[~has_rot] = rotations[has_rot][0]
if not numpy.allclose(rotations, 0):
rot_deg = numpy.rad2deg(rotations)
msg = 'Port orientations do not match:\n'
for nn, (k, v) in enumerate(connections.items()):
if not numpy.isclose(rot_deg[nn], 0):
msg += f'{k} | {rot_deg[nn]:g} | {v}\n'
raise PortError(msg)
translations = a_offsets - b_offsets
if not numpy.allclose(translations, 0):
msg = 'Port translations do not match:\n'
for nn, (k, v) in enumerate(connections.items()):
if not numpy.allclose(translations[nn], 0):
msg += f'{k} | {translations[nn]} | {v}\n'
raise PortError(msg)
for pp in chain(a_names, b_names):
del self.ports[pp]
return self
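`plugged()` is easiest to see together with `add_port_pair()`: the pair created at a single location already has equal offsets and opposing rotations, so it can be verified and removed in one call. A minimal sketch, with the same assumed imports as above.

```
from masque import Pattern   # assumed import; Pattern implements PortList

pat = Pattern()
pat.add_port_pair(offset=(50, 0), names=('left', 'right'))   # two ports, pi apart
pat.plugged({'left': 'right'})   # coincident + opposing -> both ports removed
assert not pat.ports
```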
def check_ports(
self,
other_names: Iterable[str],
map_in: dict[str, str] | None = None,
map_out: dict[str, str | None] | None = None,
) -> Self:
"""
Given the provided port mappings, check that:
- All of the ports specified in the mappings exist
- There are no duplicate port names after all the mappings are performed
Args:
other_names: List of port names being considered for inclusion into
`self.ports` (before mapping)
map_in: dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
map_out: dict of `{'old_name': 'new_name'}` mappings, specifying
new names for unconnected `other_names` ports.
Returns:
self
Raises:
`PortError` if any ports specified in `map_in` or `map_out` do not
exist in `self.ports` or `other_names`.
`PortError` if there are any duplicate names after `map_in` and `map_out`
are applied.
"""
if map_in is None:
map_in = {}
if map_out is None:
map_out = {}
other = set(other_names)
missing_inkeys = set(map_in.keys()) - set(self.ports.keys())
if missing_inkeys:
raise PortError(f'`map_in` keys not present in device: {missing_inkeys}')
missing_invals = set(map_in.values()) - other
if missing_invals:
raise PortError(f'`map_in` values not present in other device: {missing_invals}')
missing_outkeys = set(map_out.keys()) - other
if missing_outkeys:
raise PortError(f'`map_out` keys not present in other device: {missing_outkeys}')
orig_remaining = set(self.ports.keys()) - set(map_in.keys())
other_remaining = other - set(map_out.keys()) - set(map_in.values())
mapped_vals = set(map_out.values())
mapped_vals.discard(None)
conflicts_final = orig_remaining & (other_remaining | mapped_vals)
if conflicts_final:
raise PortError(f'Device ports conflict with existing ports: {conflicts_final}')
conflicts_partial = other_remaining & mapped_vals
if conflicts_partial:
raise PortError(f'`map_out` targets conflict with non-mapped outputs: {conflicts_partial}')
map_out_counts = Counter(map_out.values())
map_out_counts[None] = 0
conflicts_out = {k for k, v in map_out_counts.items() if v > 1}
if conflicts_out:
raise PortError(f'Duplicate targets in `map_out`: {conflicts_out}')
return self
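The sketch below shows the kind of collision `check_ports()` guards against: an incoming port name that would shadow an existing, unconnected port unless it is consumed by `map_in` or renamed away via `map_out`. Port names are hypothetical, and `PortError` is assumed to live in `masque.error`.

```
from masque import Pattern, Port
from masque.error import PortError   # assumed location of PortError

pat = Pattern()
pat.ports['A'] = Port((0, 0), rotation=0)

try:
    # incoming device also has a port named 'A', and nothing maps it away
    pat.check_ports(other_names=['A', 'B'], map_in={}, map_out={})
except PortError as err:
    print(err)   # Device ports conflict with existing ports: {'A'}

# renaming the incoming 'A' out of the way makes the check pass
pat.check_ports(other_names=['A', 'B'], map_in={}, map_out={'A': 'A2'})
```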
def find_transform(
self,
other: 'PortList',
map_in: dict[str, str],
*,
mirrored: bool = False,
set_rotation: float | None = None,
) -> tuple[NDArray[numpy.float64], float, NDArray[numpy.float64]]:
"""
Given a device `other` and a mapping `map_in` specifying port connections,
find the transform which will correctly align the specified ports.
Args:
other: a device
map_in: dict of `{'self_port': 'other_port'}` mappings, specifying
port connections between the two devices.
mirrored: Mirrors `other` across the x axis prior to
connecting any ports.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `other` should be rotated. Otherwise,
`set_rotation` must remain `None`.
Returns:
- The (x, y) translation (performed last)
- The rotation (radians, counterclockwise)
- The (x, y) pivot point for the rotation
The rotation should be performed before the translation.
"""
s_ports = self[map_in.keys()]
o_ports = other[map_in.values()]
return self.find_port_transform(
s_ports=s_ports,
o_ports=o_ports,
map_in=map_in,
mirrored=mirrored,
set_rotation=set_rotation,
)
@staticmethod
def find_port_transform(
s_ports: Mapping[str, Port],
o_ports: Mapping[str, Port],
map_in: dict[str, str],
*,
mirrored: bool = False,
set_rotation: float | None = None,
) -> tuple[NDArray[numpy.float64], float, NDArray[numpy.float64]]:
"""
Given two sets of ports (s_ports and o_ports) and a mapping `map_in`
specifying port connections, find the transform which will correctly
align the specified o_ports onto their respective s_ports.
Args:
s_ports: A list of stationary ports
o_ports: A list of ports which are to be moved/mirrored.
map_in: dict of `{'s_port': 'o_port'}` mappings, specifying
port connections.
mirrored: Mirrors `o_ports` across the x axis prior to
connecting any ports.
set_rotation: If the necessary rotation cannot be determined from
the ports being connected (i.e. all pairs have at least one
port with `rotation=None`), `set_rotation` must be provided
to indicate how much `o_ports` should be rotated. Otherwise,
`set_rotation` must remain `None`.
Returns:
- The (x, y) translation (performed last)
- The rotation (radians, counterclockwise)
- The (x, y) pivot point for the rotation
The rotation should be performed before the translation.
"""
s_offsets = numpy.array([p.offset for p in s_ports.values()])
o_offsets = numpy.array([p.offset for p in o_ports.values()])
s_types = [p.ptype for p in s_ports.values()]
o_types = [p.ptype for p in o_ports.values()]
s_rotations = numpy.array([p.rotation if p.rotation is not None else 0 for p in s_ports.values()])
o_rotations = numpy.array([p.rotation if p.rotation is not None else 0 for p in o_ports.values()])
s_has_rot = numpy.array([p.rotation is not None for p in s_ports.values()], dtype=bool)
o_has_rot = numpy.array([p.rotation is not None for p in o_ports.values()], dtype=bool)
has_rot = s_has_rot & o_has_rot
if mirrored:
o_offsets[:, 1] *= -1
o_rotations *= -1
type_conflicts = numpy.array([st != ot and 'unk' not in (st, ot)
for st, ot in zip(s_types, o_types, strict=True)])
if type_conflicts.any():
msg = 'Ports have conflicting types:\n'
for nn, (k, v) in enumerate(map_in.items()):
if type_conflicts[nn]:
msg += f'{k} | {s_types[nn]}:{o_types[nn]} | {v}\n'
msg = ''.join(traceback.format_stack()) + '\n' + msg
warnings.warn(msg, stacklevel=2)
rotations = numpy.mod(s_rotations - o_rotations - pi, 2 * pi)
if not has_rot.any():
if set_rotation is None:
raise PortError('Must provide set_rotation if rotation is indeterminate')
rotations[:] = set_rotation
else:
rotations[~has_rot] = rotations[has_rot][0]
if not numpy.allclose(rotations[:1], rotations):
rot_deg = numpy.rad2deg(rotations)
msg = 'Port orientations do not match:\n'
for nn, (k, v) in enumerate(map_in.items()):
msg += f'{k} | {rot_deg[nn]:g} | {v}\n'
raise PortError(msg)
pivot = o_offsets[0].copy()
rotate_offsets_around(o_offsets, pivot, rotations[0])
translations = s_offsets - o_offsets
if not numpy.allclose(translations[:1], translations):
msg = 'Port translations do not match:\n'
for nn, (k, v) in enumerate(map_in.items()):
msg += f'{k} | {translations[nn]} | {v}\n'
raise PortError(msg)
return translations[0], rotations[0], o_offsets[0]
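The returned triple is `(translation, rotation, pivot)`, with the rotation applied about `pivot` before the translation is added. A pure-numpy sketch of applying such a transform to a set of points; `apply_port_transform` is a hypothetical helper, not part of the library.

```
import numpy

def apply_port_transform(points, translation, rotation, pivot):
    """Rotate Nx2 `points` about `pivot`, then translate. Hypothetical helper, not library code."""
    rot = numpy.array([[numpy.cos(rotation), -numpy.sin(rotation)],
                       [numpy.sin(rotation),  numpy.cos(rotation)]])
    return (rot @ (numpy.asarray(points, dtype=float) - pivot).T).T + pivot + translation

# rotate two points 180 degrees about (5, 0), then shift by (10, 0):
print(apply_port_transform([(5, 0), (6, 0)], translation=(10, 0), rotation=numpy.pi, pivot=(5, 0)))
# ~[[15. 0.] [14. 0.]] (up to floating-point noise)
```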

masque/ref.py (new file)

@ -0,0 +1,236 @@
"""
Ref provides basic support for nesting Pattern objects within each other.
It carries offset, rotation, mirroring, and scaling data for each individual instance.
"""
from typing import TYPE_CHECKING, Self, Any
from collections.abc import Mapping
import copy
import functools
import numpy
from numpy import pi
from numpy.typing import NDArray, ArrayLike
from .utils import annotations_t, rotation_matrix_2d, annotations_eq, annotations_lt, rep2key
from .repetition import Repetition
from .traits import (
PositionableImpl, RotatableImpl, ScalableImpl,
Mirrorable, PivotableImpl, Copyable, RepeatableImpl, AnnotatableImpl,
)
if TYPE_CHECKING:
from . import Pattern
@functools.total_ordering
class Ref(
PositionableImpl, RotatableImpl, ScalableImpl, Mirrorable,
PivotableImpl, Copyable, RepeatableImpl, AnnotatableImpl,
):
"""
`Ref` provides basic support for nesting Pattern objects within each other.
It contains the transformation (mirror, rotation, scale, offset, repetition)
and annotations for a single instantiation of a `Pattern`.
Note that the target (i.e. which pattern a `Ref` instantiates) is not stored within the
`Ref` itself, but is specified by the containing `Pattern`.
Order of operations is (mirror, rotate, scale, translate, repeat).
"""
__slots__ = (
'_mirrored',
# inherited
'_offset', '_rotation', 'scale', '_repetition', '_annotations',
)
_mirrored: bool
""" Whether to mirror the instance across the x axis (new_y = -old_y)ubefore rotating. """
# Mirrored property
@property
def mirrored(self) -> bool: # mypy#3004, setter should be SupportsBool
return self._mirrored
@mirrored.setter
def mirrored(self, val: bool) -> None:
self._mirrored = bool(val)
def __init__(
self,
*,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0,
mirrored: bool = False,
scale: float = 1.0,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
) -> None:
"""
Note: Order is (mirror, rotate, scale, translate, repeat)
Args:
offset: (x, y) offset applied to the referenced pattern. Not affected by rotation etc.
rotation: Rotation (radians, counterclockwise) relative to the referenced pattern's (0, 0).
mirrored: Whether to mirror the referenced pattern across its x axis before rotating.
scale: Scaling factor applied to the pattern's geometry.
repetition: `Repetition` object, default `None`
"""
self.offset = offset
self.rotation = rotation
self.scale = scale
self.mirrored = mirrored
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
def __copy__(self) -> 'Ref':
new = Ref(
offset=self.offset.copy(),
rotation=self.rotation,
scale=self.scale,
mirrored=self.mirrored,
repetition=copy.deepcopy(self.repetition),
annotations=copy.deepcopy(self.annotations),
)
return new
def __deepcopy__(self, memo: dict | None = None) -> 'Ref':
memo = {} if memo is None else memo
new = copy.copy(self)
#new.repetition = copy.deepcopy(self.repetition, memo)
#new.annotations = copy.deepcopy(self.annotations, memo)
return new
def __lt__(self, other: 'Ref') -> bool:
if (self.offset != other.offset).any():
return tuple(self.offset) < tuple(other.offset)
if self.mirrored != other.mirrored:
return self.mirrored < other.mirrored
if self.rotation != other.rotation:
return self.rotation < other.rotation
if self.scale != other.scale:
return self.scale < other.scale
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def __eq__(self, other: Any) -> bool:
return (
numpy.array_equal(self.offset, other.offset)
and self.mirrored == other.mirrored
and self.rotation == other.rotation
and self.scale == other.scale
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def as_pattern(
self,
pattern: 'Pattern',
) -> 'Pattern':
"""
Args:
pattern: Pattern object to transform
Returns:
A copy of the referenced Pattern which has been scaled, rotated, etc.
according to this `Ref`'s properties.
"""
pattern = pattern.deepcopy()
if self.scale != 1:
pattern.scale_by(self.scale)
if self.mirrored:
pattern.mirror()
if self.rotation % (2 * pi) != 0:
pattern.rotate_around((0.0, 0.0), self.rotation)
if numpy.any(self.offset):
pattern.translate_elements(self.offset)
if self.repetition is not None:
combined = type(pattern)()
for dd in self.repetition.displacements:
temp_pat = pattern.deepcopy()
temp_pat.ports = {}
temp_pat.translate_elements(dd)
combined.append(temp_pat)
pattern = combined
return pattern
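The order of operations in `as_pattern()` (scale, then mirror, then rotate about (0, 0), then translate) can be checked on a single point with plain numpy; the helper below is illustrative only and not part of the library.

```
import numpy

def place(point, scale, mirrored, rotation, offset):
    """Apply (scale, mirror across x, rotate about origin, translate), mirroring as_pattern()."""
    p = numpy.asarray(point, dtype=float) * scale
    if mirrored:
        p[1] *= -1
    c, s = numpy.cos(rotation), numpy.sin(rotation)
    p = numpy.array([c * p[0] - s * p[1], s * p[0] + c * p[1]])
    return p + offset

print(place((1, 0), scale=2, mirrored=False, rotation=numpy.pi / 2, offset=(10, 0)))
# ~[10. 2.]
```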
def rotate(self, rotation: float) -> Self:
self.rotation += rotation
if self.repetition is not None:
self.repetition.rotate(rotation)
return self
def mirror(self, axis: int = 0) -> Self:
self.mirror_target(axis)
self.rotation *= -1
if self.repetition is not None:
self.repetition.mirror(axis)
return self
def mirror_target(self, axis: int = 0) -> Self:
self.mirrored = not self.mirrored
self.rotation += axis * pi
return self
def mirror2d_target(self, across_x: bool = False, across_y: bool = False) -> Self:
self.mirrored = bool((self.mirrored + across_x + across_y) % 2)
if across_y:
self.rotation += pi
return self
def as_transforms(self) -> NDArray[numpy.float64]:
xys = self.offset[None, :]
if self.repetition is not None:
xys = xys + self.repetition.displacements
transforms = numpy.empty((xys.shape[0], 4))
transforms[:, :2] = xys
transforms[:, 2] = self.rotation
transforms[:, 3] = self.mirrored
return transforms
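`as_transforms()` flattens a `Ref` and its repetition into an N x 4 array with one row per placed instance and columns `[x, y, rotation, mirrored]`. A small numpy-only sketch of consuming that layout (the sample values are made up):

```
import numpy

transforms = numpy.array([
    [  0.0, 0.0, 0.0,          0],   # instance at the origin, unrotated, unmirrored
    [100.0, 0.0, numpy.pi / 2, 1],   # instance at (100, 0), rotated 90 degrees, mirrored
])

offsets = transforms[:, :2]
rotations = transforms[:, 2]
mirrored = transforms[:, 3].astype(bool)
for xy, rot, mir in zip(offsets, rotations, mirrored):
    print(xy, numpy.rad2deg(rot), mir)
```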
def get_bounds_single(
self,
pattern: 'Pattern',
*,
library: Mapping[str, 'Pattern'] | None = None,
) -> NDArray[numpy.float64] | None:
"""
Return a `numpy.ndarray` containing `[[x_min, y_min], [x_max, y_max]]`, corresponding to the
extent of the `Ref` in each dimension.
Returns `None` if the contained `Pattern` is empty.
Args:
library: Name-to-Pattern mapping, used to resolve nested references when computing bounds.
Returns:
`[[x_min, y_min], [x_max, y_max]]` or `None`
"""
if pattern.is_empty():
# no need to run as_pattern()
return None
# if rotation is manhattan, can take pattern's bounds and transform them
if numpy.isclose(self.rotation % (pi / 2), 0):
unrot_bounds = pattern.get_bounds(library)
if unrot_bounds is None:
return None
if self.mirrored:
unrot_bounds[:, 1] *= -1
corners = (rotation_matrix_2d(self.rotation) @ unrot_bounds.T).T
bounds = numpy.vstack((numpy.min(corners, axis=0),
numpy.max(corners, axis=0))) * self.scale + [self.offset]
return bounds
return self.as_pattern(pattern=pattern).get_bounds(library)
def __repr__(self) -> str:
rotation = f' r{numpy.rad2deg(self.rotation):g}' if self.rotation != 0 else ''
scale = f' d{self.scale:g}' if self.scale != 1 else ''
mirrored = ' m' if self.mirrored else ''
return f'<Ref {self.offset}{rotation}{scale}{mirrored}>'

masque/repetition.py

@ -2,24 +2,28 @@
Repetitions provide support for efficiently representing multiple identical
instances of an object.
"""
from typing import Union, Dict, Optional, Sequence, Any, Type
from typing import Any, Self, TypeVar, cast
import copy
import functools
from abc import ABCMeta, abstractmethod
import numpy
from numpy.typing import ArrayLike, NDArray
from .traits import Copyable, Scalable, Rotatable, Mirrorable, Bounded
from .error import PatternError
from .utils import rotation_matrix_2d, AutoSlots
from .traits import LockableImpl, Copyable, Scalable, Rotatable, Mirrorable
from .utils import rotation_matrix_2d
class Repetition(Copyable, Rotatable, Mirrorable, Scalable, metaclass=ABCMeta):
GG = TypeVar('GG', bound='Grid')
@functools.total_ordering
class Repetition(Copyable, Rotatable, Mirrorable, Scalable, Bounded, metaclass=ABCMeta):
"""
Interface common to all objects which specify repetitions
"""
__slots__ = ()
__slots__ = () # Allow subclasses to use __slots__
@property
@abstractmethod
@ -29,8 +33,16 @@ class Repetition(Copyable, Rotatable, Mirrorable, Scalable, metaclass=ABCMeta):
"""
pass
@abstractmethod
def __le__(self, other: 'Repetition') -> bool:
pass
class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
@abstractmethod
def __eq__(self, other: Any) -> bool:
pass
class Grid(Repetition):
"""
`Grid` describes a 2D grid formed by two basis vectors and two 'counts' (sizes).
@ -39,10 +51,10 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
Note that the offsets in either the 2D or 1D grids do not have to be axis-aligned.
"""
__slots__ = ('_a_vector',
'_b_vector',
'_a_count',
'_b_count')
__slots__ = (
'_a_vector', '_b_vector',
'_a_count', '_b_count',
)
_a_vector: NDArray[numpy.float64]
""" Vector `[x, y]` specifying the first lattice vector of the grid.
@ -52,7 +64,7 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
_a_count: int
""" Number of instances along the direction specified by the `a_vector` """
_b_vector: Optional[NDArray[numpy.float64]]
_b_vector: NDArray[numpy.float64] | None
""" Vector `[x, y]` specifying a second lattice vector for the grid.
Specifies center-to-center spacing between adjacent elements.
Can be `None` for a 1D array.
@ -65,9 +77,8 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
self,
a_vector: ArrayLike,
a_count: int,
b_vector: Optional[ArrayLike] = None,
b_count: Optional[int] = 1,
locked: bool = False,
b_vector: ArrayLike | None = None,
b_count: int | None = 1,
) -> None:
"""
Args:
@ -79,7 +90,6 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
Can be omitted when specifying a 1D array.
b_count: Number of elements in the `b_vector` direction.
Should be omitted if `b_vector` was omitted.
locked: Whether the `Grid` is locked after initialization.
Raises:
PatternError if `b_*` inputs conflict with each other
@ -91,7 +101,6 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
if b_vector is None:
if b_count > 1:
raise PatternError('Repetition has b_count > 1 but no b_vector')
else:
b_vector = numpy.array([0.0, 0.0])
if a_count < 1:
@ -99,21 +108,19 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
if b_count < 1:
raise PatternError(f'Repetition has too-small b_count: {b_count}')
object.__setattr__(self, 'locked', False)
self.a_vector = a_vector # type: ignore # setter handles type conversion
self.b_vector = b_vector # type: ignore # setter handles type conversion
self.a_count = a_count
self.b_count = b_count
self.locked = locked
@classmethod
def aligned(
cls: Type,
cls: type[GG],
x: float,
y: float,
x_count: int,
y_count: int,
) -> 'Grid':
) -> GG:
"""
Simple constructor for an axis-aligned 2D grid
@ -129,18 +136,17 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
return cls(a_vector=(x, 0), b_vector=(0, y), a_count=x_count, b_count=y_count)
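For reference, the lattice described by `Grid.aligned(x=10, y=20, x_count=3, y_count=2)` can be reproduced with plain numpy; the final expression below mirrors the `displacements` return visible further down. This is an illustrative sketch, not library code.

```
import numpy

# what Grid.aligned(x=10, y=20, x_count=3, y_count=2) represents:
a_vector, a_count = numpy.array([10.0, 0.0]), 3
b_vector, b_count = numpy.array([0.0, 20.0]), 2

aa, bb = numpy.meshgrid(numpy.arange(a_count), numpy.arange(b_count), indexing='ij')
displacements = (aa.flatten()[:, None] * a_vector[None, :]
                 + bb.flatten()[:, None] * b_vector[None, :])
print(displacements)   # six lattice points: (0,0), (0,20), (10,0), (10,20), (20,0), (20,20)
```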
def __copy__(self) -> 'Grid':
new = Grid(a_vector=self.a_vector.copy(),
new = Grid(
a_vector=self.a_vector.copy(),
b_vector=copy.copy(self.b_vector),
a_count=self.a_count,
b_count=self.b_count,
locked=self.locked)
)
return new
def __deepcopy__(self, memo: Dict = None) -> 'Grid':
def __deepcopy__(self, memo: dict | None = None) -> Self:
memo = {} if memo is None else memo
new = copy.copy(self)
LockableImpl.unlock(new)
new.locked = self.locked
return new
# a_vector property
@ -150,22 +156,20 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
@a_vector.setter
def a_vector(self, val: ArrayLike) -> None:
if not isinstance(val, numpy.ndarray):
val = numpy.array(val, dtype=float)
if val.size != 2:
raise PatternError('a_vector must be convertible to size-2 ndarray')
self._a_vector = val.flatten().astype(float)
self._a_vector = val.flatten()
# b_vector property
@property
def b_vector(self) -> Optional[NDArray[numpy.float64]]:
def b_vector(self) -> NDArray[numpy.float64] | None:
return self._b_vector
@b_vector.setter
def b_vector(self, val: ArrayLike) -> None:
if not isinstance(val, numpy.ndarray):
val = numpy.array(val, dtype=float, copy=True)
val = numpy.array(val, dtype=float)
if val.size != 2:
raise PatternError('b_vector must be convertible to size-2 ndarray')
@ -202,7 +206,7 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
return (aa.flatten()[:, None] * self.a_vector[None, :]
+ bb.flatten()[:, None] * self.b_vector[None, :]) # noqa
def rotate(self, rotation: float) -> 'Grid':
def rotate(self, rotation: float) -> Self:
"""
Rotate lattice vectors (around (0, 0))
@ -217,7 +221,7 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
self.b_vector = numpy.dot(rotation_matrix_2d(rotation), self.b_vector)
return self
def mirror(self, axis: int) -> 'Grid':
def mirror(self, axis: int = 0) -> Self:
"""
Mirror the Grid across an axis.
@ -233,7 +237,7 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
self.b_vector[1 - axis] *= -1
return self
def get_bounds(self) -> Optional[NDArray[numpy.float64]]:
def get_bounds(self) -> NDArray[numpy.float64] | None:
"""
Return a `numpy.ndarray` containing `[[x_min, y_min], [x_max, y_max]]`, corresponding to the
extent of the `Grid` in each dimension.
@ -241,15 +245,19 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
Returns:
`[[x_min, y_min], [x_max, y_max]]` or `None`
"""
a_extent = self.a_vector * self.a_count
b_extent = self.b_vector * self.b_count if (self.b_vector is not None) else 0 # type: Union[NDArray[numpy.float64], float]
a_extent = self.a_vector * (self.a_count - 1)
if self.b_count is None:
b_extent = numpy.zeros(2)
else:
assert self.b_vector is not None
b_extent = self.b_vector * (self.b_count - 1)
corners = ((0, 0), a_extent, b_extent, a_extent + b_extent)
corners = numpy.stack(((0, 0), a_extent, b_extent, a_extent + b_extent))
xy_min = numpy.min(corners, axis=0)
xy_max = numpy.max(corners, axis=0)
return numpy.array((xy_min, xy_max))
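Worked example of the bounds computed above, for a 3 x 2 grid with a 10-unit x pitch and a 20-unit y pitch. Note these are the bounds of the displacement lattice itself (center to center); they do not include the extent of the repeated geometry.

```
import numpy

# 3x2 grid, 10-unit x pitch, 20-unit y pitch
a_extent = numpy.array([10.0, 0.0]) * (3 - 1)    # a_vector * (a_count - 1)
b_extent = numpy.array([0.0, 20.0]) * (2 - 1)    # b_vector * (b_count - 1)
corners = numpy.stack(((0, 0), a_extent, b_extent, a_extent + b_extent))
print(numpy.vstack((corners.min(axis=0), corners.max(axis=0))))
# [[ 0.  0.]
#  [20. 20.]]
```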
def scale_by(self, c: float) -> 'Grid':
def scale_by(self, c: float) -> Self:
"""
Scale the Grid by a factor
@ -264,39 +272,12 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
self.b_vector *= c
return self
def lock(self) -> 'Grid':
"""
Lock the `Grid`, disallowing changes.
Returns:
self
"""
self.a_vector.flags.writeable = False
if self.b_vector is not None:
self.b_vector.flags.writeable = False
LockableImpl.lock(self)
return self
def unlock(self) -> 'Grid':
"""
Unlock the `Grid`
Returns:
self
"""
self.a_vector.flags.writeable = True
if self.b_vector is not None:
self.b_vector.flags.writeable = True
LockableImpl.unlock(self)
return self
def __repr__(self) -> str:
locked = ' L' if self.locked else ''
bv = f', {self.b_vector}' if self.b_vector is not None else ''
return (f'<Grid {self.a_count}x{self.b_count} ({self.a_vector}{bv}){locked}>')
return (f'<Grid {self.a_count}x{self.b_count} ({self.a_vector}{bv})>')
def __eq__(self, other: Any) -> bool:
if not isinstance(other, type(self)):
if type(other) is not type(self):
return False
if self.a_count != other.a_count or self.b_count != other.b_count:
return False
@ -306,14 +287,30 @@ class Grid(LockableImpl, Repetition, metaclass=AutoSlots):
return True
if self.b_vector is None or other.b_vector is None:
return False
if any(self.b_vector[ii] != other.b_vector[ii] for ii in range(2)):
return False
if self.locked != other.locked:
if any(self.b_vector[ii] != other.b_vector[ii] for ii in range(2)): # noqa: SIM103
return False
return True
def __le__(self, other: Repetition) -> bool:
if type(self) is not type(other):
return repr(type(self)) < repr(type(other))
other = cast(Grid, other)
if self.a_count != other.a_count:
return self.a_count < other.a_count
if self.b_count != other.b_count:
return self.b_count < other.b_count
if not numpy.array_equal(self.a_vector, other.a_vector):
return tuple(self.a_vector) < tuple(other.a_vector)
if self.b_vector is None:
return other.b_vector is not None
if other.b_vector is None:
return False
if not numpy.array_equal(self.b_vector, other.b_vector):
return tuple(self.b_vector) < tuple(other.b_vector)
return False
class Arbitrary(LockableImpl, Repetition, metaclass=AutoSlots):
class Arbitrary(Repetition):
"""
`Arbitrary` is a simple list of (absolute) displacements for instances.
@ -330,63 +327,47 @@ class Arbitrary(LockableImpl, Repetition, metaclass=AutoSlots):
"""
@property
def displacements(self) -> Any: # TODO: mypy#3004 NDArray[numpy.float64]:
def displacements(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
return self._displacements
@displacements.setter
def displacements(self, val: ArrayLike) -> None:
vala: NDArray[numpy.float64] = numpy.array(val, dtype=float)
vala = numpy.sort(vala.view([('', vala.dtype)] * vala.shape[1]), 0).view(vala.dtype) # sort rows
self._displacements = vala
vala = numpy.array(val, dtype=float)
order = numpy.lexsort(vala.T[::-1]) # sortrows
self._displacements = vala[order]
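The `lexsort` call above sorts the displacement rows by x first and then y (lexsort treats its *last* key as the primary one, hence the `[::-1]`). A quick standalone check:

```
import numpy

vala = numpy.array([[10., 5.], [0., 7.], [10., 1.]])
order = numpy.lexsort(vala.T[::-1])   # primary key: column 0 (x), secondary: column 1 (y)
print(vala[order])
# [[ 0.  7.]
#  [10.  1.]
#  [10.  5.]]
```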
def __init__(
self,
displacements: ArrayLike,
locked: bool = False,
) -> None:
"""
Args:
displacements: List of vectors (Nx2 ndarray) specifying displacements.
locked: Whether the object is locked after initialization.
"""
object.__setattr__(self, 'locked', False)
self.displacements = displacements
self.locked = locked
def lock(self) -> 'Arbitrary':
"""
Lock the object, disallowing changes.
Returns:
self
"""
self._displacements.flags.writeable = False
LockableImpl.lock(self)
return self
def unlock(self) -> 'Arbitrary':
"""
Unlock the object
Returns:
self
"""
self._displacements.flags.writeable = True
LockableImpl.unlock(self)
return self
def __repr__(self) -> str:
locked = ' L' if self.locked else ''
return (f'<Arbitrary {len(self.displacements)}pts {locked}>')
return (f'<Arbitrary {len(self.displacements)}pts >')
def __eq__(self, other: Any) -> bool:
if not isinstance(other, type(self)):
return False
if self.locked != other.locked:
if type(other) is not type(self):
return False
return numpy.array_equal(self.displacements, other.displacements)
def rotate(self, rotation: float) -> 'Arbitrary':
def __le__(self, other: Repetition) -> bool:
if type(self) is not type(other):
return repr(type(self)) < repr(type(other))
other = cast(Arbitrary, other)
if self.displacements.size != other.displacements.size:
return self.displacements.size < other.displacements.size
neq = (self.displacements != other.displacements)
if neq.any():
return self.displacements[neq][0] < other.displacements[neq][0]
return False
def rotate(self, rotation: float) -> Self:
"""
Rotate displacements (around (0, 0))
@ -399,7 +380,7 @@ class Arbitrary(LockableImpl, Repetition, metaclass=AutoSlots):
self.displacements = numpy.dot(rotation_matrix_2d(rotation), self.displacements.T).T
return self
def mirror(self, axis: int) -> 'Arbitrary':
def mirror(self, axis: int = 0) -> Self:
"""
Mirror the displacements across an axis.
@ -413,7 +394,7 @@ class Arbitrary(LockableImpl, Repetition, metaclass=AutoSlots):
self.displacements[1 - axis] *= -1
return self
def get_bounds(self) -> Optional[NDArray[numpy.float64]]:
def get_bounds(self) -> NDArray[numpy.float64] | None:
"""
Return a `numpy.ndarray` containing `[[x_min, y_min], [x_max, y_max]]`, corresponding to the
extent of the `displacements` in each dimension.
@ -425,7 +406,7 @@ class Arbitrary(LockableImpl, Repetition, metaclass=AutoSlots):
xy_max = numpy.max(self.displacements, axis=0)
return numpy.array((xy_min, xy_max))
def scale_by(self, c: float) -> 'Arbitrary':
def scale_by(self, c: float) -> Self:
"""
Scale the displacements by a factor

masque/shapes/__init__.py

@ -3,11 +3,15 @@ Shapes for use with the Pattern class, as well as the Shape abstract class from
which they are derived.
"""
from .shape import Shape, normalized_shape_tuple, DEFAULT_POLY_NUM_POINTS
from .shape import (
Shape as Shape,
normalized_shape_tuple as normalized_shape_tuple,
DEFAULT_POLY_NUM_VERTICES as DEFAULT_POLY_NUM_VERTICES,
)
from .polygon import Polygon
from .circle import Circle
from .ellipse import Ellipse
from .arc import Arc
from .text import Text
from .path import Path
from .polygon import Polygon as Polygon
from .circle import Circle as Circle
from .ellipse import Ellipse as Ellipse
from .arc import Arc as Arc
from .text import Text as Text
from .path import Path as Path

masque/shapes/arc.py

@ -1,19 +1,19 @@
from typing import List, Dict, Optional, Sequence, Any
from typing import Any, cast
import copy
import math
import functools
import numpy
from numpy import pi
from numpy.typing import NDArray, ArrayLike
from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_POINTS
from .. import PatternError
from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_VERTICES
from ..error import PatternError
from ..repetition import Repetition
from ..utils import is_scalar, layer_t, AutoSlots, annotations_t
from ..traits import LockableImpl
from ..utils import is_scalar, annotations_t, annotations_lt, annotations_eq, rep2key
class Arc(Shape, metaclass=AutoSlots):
@functools.total_ordering
class Arc(Shape):
"""
An elliptical arc, formed by cutting off an elliptical ring with two rays which exit from its
center. It has a position, two radii, a start and stop angle, a rotation, and a width.
@ -22,8 +22,11 @@ class Arc(Shape, metaclass=AutoSlots):
The rotation gives the angle from x-axis, counterclockwise, to the first (x) radius.
The start and stop angle are measured counterclockwise from the first (x) radius.
"""
__slots__ = ('_radii', '_angles', '_width', '_rotation',
'poly_num_points', 'poly_max_arclen')
__slots__ = (
'_radii', '_angles', '_width', '_rotation',
# Inherited
'_offset', '_repetition', '_annotations',
)
_radii: NDArray[numpy.float64]
""" Two radii for defining an ellipse """
@ -37,15 +40,9 @@ class Arc(Shape, metaclass=AutoSlots):
_width: float
""" Width of the arc """
poly_num_points: Optional[int]
""" Sets the default number of points for `.polygonize()` """
poly_max_arclen: Optional[float]
""" Sets the default max segement length for `.polygonize()` """
# radius properties
@property
def radii(self) -> Any: #TODO mypy#3004 NDArray[numpy.float64]:
def radii(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
"""
Return the radii `[rx, ry]`
"""
@ -82,7 +79,7 @@ class Arc(Shape, metaclass=AutoSlots):
# arc start/stop angle properties
@property
def angles(self) -> Any: #TODO mypy#3004 NDArray[numpy.float64]:
def angles(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
"""
Return the start and stop angles `[a_start, a_stop]`.
Angles are measured from x-axis after rotation
@ -157,24 +154,16 @@ class Arc(Shape, metaclass=AutoSlots):
angles: ArrayLike,
width: float,
*,
poly_num_points: Optional[int] = DEFAULT_POLY_NUM_POINTS,
poly_max_arclen: Optional[float] = None,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0,
mirrored: Sequence[bool] = (False, False),
layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
) -> None:
LockableImpl.unlock(self)
self.identifier = ()
if raw:
assert(isinstance(radii, numpy.ndarray))
assert(isinstance(angles, numpy.ndarray))
assert(isinstance(offset, numpy.ndarray))
assert isinstance(radii, numpy.ndarray)
assert isinstance(angles, numpy.ndarray)
assert isinstance(offset, numpy.ndarray)
self._radii = radii
self._angles = angles
self._width = width
@ -182,8 +171,6 @@ class Arc(Shape, metaclass=AutoSlots):
self._rotation = rotation
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
self._layer = layer
self._dose = dose
else:
self.radii = radii
self.angles = angles
@ -192,35 +179,54 @@ class Arc(Shape, metaclass=AutoSlots):
self.rotation = rotation
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.layer = layer
self.dose = dose
self.poly_num_points = poly_num_points
self.poly_max_arclen = poly_max_arclen
[self.mirror(a) for a, do in enumerate(mirrored) if do]
self.set_locked(locked)
def __deepcopy__(self, memo: Dict = None) -> 'Arc':
def __deepcopy__(self, memo: dict | None = None) -> 'Arc':
memo = {} if memo is None else memo
new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy()
new._radii = self._radii.copy()
new._angles = self._angles.copy()
new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and numpy.array_equal(self.radii, other.radii)
and numpy.array_equal(self.angles, other.angles)
and self.width == other.width
and self.rotation == other.rotation
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Arc, other)
if self.width != other.width:
return self.width < other.width
if not numpy.array_equal(self.radii, other.radii):
return tuple(self.radii) < tuple(other.radii)
if not numpy.array_equal(self.angles, other.angles):
return tuple(self.angles) < tuple(other.angles)
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.rotation != other.rotation:
return self.rotation < other.rotation
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def to_polygons(
self,
poly_num_points: Optional[int] = None,
poly_max_arclen: Optional[float] = None,
) -> List[Polygon]:
if poly_num_points is None:
poly_num_points = self.poly_num_points
if poly_max_arclen is None:
poly_max_arclen = self.poly_max_arclen
if (poly_num_points is None) and (poly_max_arclen is None):
num_vertices: int | None = DEFAULT_POLY_NUM_VERTICES,
max_arclen: float | None = None,
) -> list[Polygon]:
if (num_vertices is None) and (max_arclen is None):
raise PatternError('Max number of points and arclength left unspecified'
+ ' (default was also overridden)')
@ -229,27 +235,62 @@ class Arc(Shape, metaclass=AutoSlots):
# Convert from polar angle to ellipse parameter (for [rx*cos(t), ry*sin(t)] representation)
a_ranges = self._angles_to_parameters()
# Approximate perimeter
# Ramanujan, S., "Modular Equations and Approximations to π,"
# Quart. J. Pure. Appl. Math., vol. 45 (1913-1914), pp. 350-372
a0, a1 = a_ranges[1] # use outer arc
h = ((r1 - r0) / (r1 + r0)) ** 2
ellipse_perimeter = pi * (r1 + r0) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))
perimeter = abs(a0 - a1) / (2 * pi) * ellipse_perimeter # TODO: make this more accurate
# Approximate perimeter via numerical integration
n = []
if poly_num_points is not None:
n += [poly_num_points]
if poly_max_arclen is not None:
n += [perimeter / poly_max_arclen]
num_points = int(round(max(n)))
#perimeter1 = numpy.trapz(numpy.sqrt(r0sin * r0sin + r1cos * r1cos), dx=dt)
#from scipy.special import ellipeinc
#m = 1 - (r1 / r0) ** 2
#t1 = ellipeinc(a1 - pi / 2, m)
#t0 = ellipeinc(a0 - pi / 2, m)
#perimeter2 = r0 * (t1 - t0)
def get_arclens(n_pts: int, a0: float, a1: float, dr: float) -> tuple[NDArray[numpy.float64], NDArray[numpy.float64]]:
""" Get `n_pts` arclengths """
t, dt = numpy.linspace(a0, a1, n_pts, retstep=True) # NOTE: could probably use an adaptive number of points
r0sin = (r0 + dr) * numpy.sin(t)
r1cos = (r1 + dr) * numpy.cos(t)
arc_dl = numpy.sqrt(r0sin * r0sin + r1cos * r1cos)
#arc_lengths = numpy.diff(t) * (arc_dl[1:] + arc_dl[:-1]) / 2
arc_lengths = (arc_dl[1:] + arc_dl[:-1]) * numpy.abs(dt) / 2
return arc_lengths, t
wh = self.width / 2.0
if wh == r0 or wh == r1:
if num_vertices is not None:
n_pts = numpy.ceil(max(self.radii + wh) / min(self.radii) * num_vertices * 100).astype(int)
perimeter_inner = get_arclens(n_pts, *a_ranges[0], dr=-wh)[0].sum()
perimeter_outer = get_arclens(n_pts, *a_ranges[1], dr= wh)[0].sum()
implied_arclen = (perimeter_outer + perimeter_inner + self.width * 2) / num_vertices
max_arclen = min(implied_arclen, max_arclen if max_arclen is not None else numpy.inf)
assert max_arclen is not None
def get_thetas(inner: bool) -> NDArray[numpy.float64]:
""" Figure out the parameter values at which we should place vertices to meet the arclength constraint"""
dr = -wh if inner else wh
n_pts = numpy.ceil(2 * pi * max(self.radii + dr) / max_arclen).astype(int)
arc_lengths, thetas = get_arclens(n_pts, *a_ranges[0 if inner else 1], dr=dr)
keep = [0]
removable = (numpy.cumsum(arc_lengths) <= max_arclen)
start = 1
while start < arc_lengths.size:
next_to_keep = start + numpy.where(removable)[0][-1] # TODO: any chance we haven't sampled finely enough?
keep.append(next_to_keep)
removable = (numpy.cumsum(arc_lengths[next_to_keep + 1:]) <= max_arclen)
start = next_to_keep + 1
if keep[-1] != thetas.size - 1:
keep.append(thetas.size - 1)
thetas = thetas[keep]
if inner:
thetas = thetas[::-1]
return thetas
if wh in (r0, r1):
thetas_inner = numpy.zeros(1) # Don't generate multiple vertices if we're at the origin
else:
thetas_inner = numpy.linspace(a_ranges[0][1], a_ranges[0][0], num_points, endpoint=True)
thetas_outer = numpy.linspace(a_ranges[1][0], a_ranges[1][1], num_points, endpoint=True)
thetas_inner = get_thetas(inner=True)
thetas_outer = get_thetas(inner=False)
sin_th_i, cos_th_i = (numpy.sin(thetas_inner), numpy.cos(thetas_inner))
sin_th_o, cos_th_o = (numpy.sin(thetas_outer), numpy.cos(thetas_outer))
@ -263,11 +304,11 @@ class Arc(Shape, metaclass=AutoSlots):
ys = numpy.hstack((ys1, ys2))
xys = numpy.vstack((xs, ys)).T
poly = Polygon(xys, dose=self.dose, layer=self.layer, offset=self.offset, rotation=self.rotation)
poly = Polygon(xys, offset=self.offset, rotation=self.rotation)
return [poly]
def get_bounds(self) -> NDArray[numpy.float64]:
'''
def get_bounds_single(self) -> NDArray[numpy.float64]:
"""
Equation for rotated ellipse is
`x = x0 + a * cos(t) * cos(rot) - b * sin(t) * sin(rot)`
`y = y0 + a * cos(t) * sin(rot) + b * sin(t) * cos(rot)`
@ -278,12 +319,12 @@ class Arc(Shape, metaclass=AutoSlots):
where -+ is for x, y cases, so that's where the extrema are.
If the extrema are inaccessible due to arc constraints, check the arc endpoints instead.
'''
"""
a_ranges = self._angles_to_parameters()
mins = []
maxs = []
for a, sgn in zip(a_ranges, (-1, +1)):
for a, sgn in zip(a_ranges, (-1, +1), strict=True):
wh = sgn * self.width / 2
rx = self.radius_x + wh
ry = self.radius_y + wh
@ -340,7 +381,7 @@ class Arc(Shape, metaclass=AutoSlots):
self.rotation += theta
return self
def mirror(self, axis: int) -> 'Arc':
def mirror(self, axis: int = 0) -> 'Arc':
self.offset[axis - 1] *= -1
self.rotation *= -1
self.rotation += axis * pi
@ -374,23 +415,27 @@ class Arc(Shape, metaclass=AutoSlots):
rotation %= 2 * pi
width = self.width
return ((type(self), radii, angles, width / norm_value, self.layer),
(self.offset, scale / norm_value, rotation, False, self.dose),
lambda: Arc(radii=radii * norm_value, angles=angles, width=width * norm_value, layer=self.layer))
return ((type(self), radii, angles, width / norm_value),
(self.offset, scale / norm_value, rotation, False),
lambda: Arc(
radii=radii * norm_value,
angles=angles,
width=width * norm_value,
))
def get_cap_edges(self) -> NDArray[numpy.float64]:
'''
"""
Returns:
```
[[[x0, y0], [x1, y1]], array of 4 points, specifying the two cuts which
[[x2, y2], [x3, y3]]], would create this arc from its corresponding ellipse.
```
'''
"""
a_ranges = self._angles_to_parameters()
mins = []
maxs = []
for a, sgn in zip(a_ranges, (-1, +1)):
for a, sgn in zip(a_ranges, (-1, +1), strict=True):
wh = sgn * self.width / 2
rx = self.radius_x + wh
ry = self.radius_y + wh
@ -409,41 +454,28 @@ class Arc(Shape, metaclass=AutoSlots):
return numpy.array([mins, maxs]) + self.offset
def _angles_to_parameters(self) -> NDArray[numpy.float64]:
'''
"""
Convert from polar angle to ellipse parameter (for [rx*cos(t), ry*sin(t)] representation)
Returns:
"Eccentric anomaly" parameter ranges for the inner and outer edges, in the form
`[[a_min_inner, a_max_inner], [a_min_outer, a_max_outer]]`
'''
"""
a = []
for sgn in (-1, +1):
wh = sgn * self.width / 2
wh = sgn * self.width / 2.0
rx = self.radius_x + wh
ry = self.radius_y + wh
# create parameter 'a' for the parametrized ellipse
a0, a1 = (numpy.arctan2(rx * numpy.sin(a), ry * numpy.cos(a)) for a in self.angles)
sign = numpy.sign(self.angles[1] - self.angles[0])
if sign != numpy.sign(a1 - a0):
a1 += sign * 2 * pi
a.append((a0, a1))
return numpy.array(a)
def lock(self) -> 'Arc':
self.radii.flags.writeable = False
self.angles.flags.writeable = False
Shape.lock(self)
return self
def unlock(self) -> 'Arc':
Shape.unlock(self)
self.radii.flags.writeable = True
self.angles.flags.writeable = True
return self
return numpy.array(a, dtype=float)
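The conversion above maps a polar angle theta to the parameter t of the `[rx*cos(t), ry*sin(t)]` parametrization via `t = arctan2(rx*sin(theta), ry*cos(theta))`. A small numeric check (values chosen arbitrarily):

```
import numpy

rx, ry = 2.0, 1.0
theta = numpy.deg2rad(45)                                      # polar angle on the ellipse
t = numpy.arctan2(rx * numpy.sin(theta), ry * numpy.cos(theta))
x, y = rx * numpy.cos(t), ry * numpy.sin(t)
print(numpy.rad2deg(t), numpy.rad2deg(numpy.arctan2(y, x)))    # ~63.4, 45.0
```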
def __repr__(self) -> str:
angles = f'{numpy.rad2deg(self.angles)}'
rotation = f'{numpy.rad2deg(self.rotation):g}' if self.rotation != 0 else ''
dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<Arc l{self.layer} o{self.offset} r{self.radii}{angles} w{self.width:g}{rotation}{dose}{locked}>'
return f'<Arc o{self.offset} r{self.radii}{angles} w{self.width:g}{rotation}>'

masque/shapes/circle.py

@ -1,32 +1,31 @@
from typing import List, Dict, Optional
from typing import Any, cast
import copy
import functools
import numpy
from numpy import pi
from numpy.typing import NDArray, ArrayLike
from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_POINTS
from .. import PatternError
from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_VERTICES
from ..error import PatternError
from ..repetition import Repetition
from ..utils import is_scalar, layer_t, AutoSlots, annotations_t
from ..traits import LockableImpl
from ..utils import is_scalar, annotations_t, annotations_lt, annotations_eq, rep2key
class Circle(Shape, metaclass=AutoSlots):
@functools.total_ordering
class Circle(Shape):
"""
A circle, which has a position and radius.
"""
__slots__ = ('_radius', 'poly_num_points', 'poly_max_arclen')
__slots__ = (
'_radius',
# Inherited
'_offset', '_repetition', '_annotations',
)
_radius: float
""" Circle radius """
poly_num_points: Optional[int]
""" Sets the default number of points for `.polygonize()` """
poly_max_arclen: Optional[float]
""" Sets the default max segement length for `.polygonize()` """
# radius property
@property
def radius(self) -> float:
@ -47,81 +46,83 @@ class Circle(Shape, metaclass=AutoSlots):
self,
radius: float,
*,
poly_num_points: Optional[int] = DEFAULT_POLY_NUM_POINTS,
poly_max_arclen: Optional[float] = None,
offset: ArrayLike = (0.0, 0.0),
layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
) -> None:
LockableImpl.unlock(self)
self.identifier = ()
if raw:
assert(isinstance(offset, numpy.ndarray))
assert isinstance(offset, numpy.ndarray)
self._radius = radius
self._offset = offset
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
self._layer = layer
self._dose = dose
else:
self.radius = radius
self.offset = offset
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.layer = layer
self.dose = dose
self.poly_num_points = poly_num_points
self.poly_max_arclen = poly_max_arclen
self.set_locked(locked)
def __deepcopy__(self, memo: Dict = None) -> 'Circle':
def __deepcopy__(self, memo: dict | None = None) -> 'Circle':
memo = {} if memo is None else memo
new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy()
new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and self.radius == other.radius
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Circle, other)
if not self.radius == other.radius:
return self.radius < other.radius
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def to_polygons(
self,
poly_num_points: Optional[int] = None,
poly_max_arclen: Optional[float] = None,
) -> List[Polygon]:
if poly_num_points is None:
poly_num_points = self.poly_num_points
if poly_max_arclen is None:
poly_max_arclen = self.poly_max_arclen
if (poly_num_points is None) and (poly_max_arclen is None):
num_vertices: int | None = DEFAULT_POLY_NUM_VERTICES,
max_arclen: float | None = None,
) -> list[Polygon]:
if (num_vertices is None) and (max_arclen is None):
raise PatternError('Number of points and arclength left '
'unspecified (default was also overridden)')
n: List[float] = []
if poly_num_points is not None:
n += [poly_num_points]
if poly_max_arclen is not None:
n += [2 * pi * self.radius / poly_max_arclen]
num_points = int(round(max(n)))
thetas = numpy.linspace(2 * pi, 0, num_points, endpoint=False)
n: list[float] = []
if num_vertices is not None:
n += [num_vertices]
if max_arclen is not None:
n += [2 * pi * self.radius / max_arclen]
num_vertices = int(round(max(n)))
thetas = numpy.linspace(2 * pi, 0, num_vertices, endpoint=False)
xs = numpy.cos(thetas) * self.radius
ys = numpy.sin(thetas) * self.radius
xys = numpy.vstack((xs, ys)).T
return [Polygon(xys, offset=self.offset, dose=self.dose, layer=self.layer)]
return [Polygon(xys, offset=self.offset)]
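The vertex count chosen above is simply the stricter of the two constraints: at least `num_vertices` points, and segments short enough that no arc exceeds `max_arclen`. A standalone sketch of that choice:

```
import numpy
from numpy import pi

radius, num_vertices, max_arclen = 50.0, 24, 5.0
n = [num_vertices, 2 * pi * radius / max_arclen]   # [24, ~62.8]
num_points = int(round(max(n)))                    # 63: the arclen limit wins here
thetas = numpy.linspace(2 * pi, 0, num_points, endpoint=False)
print(num_points, 2 * pi * radius / num_points)    # 63, ~4.99 (below max_arclen)
```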
def get_bounds(self) -> NDArray[numpy.float64]:
def get_bounds_single(self) -> NDArray[numpy.float64]:
return numpy.vstack((self.offset - self.radius,
self.offset + self.radius))
def rotate(self, theta: float) -> 'Circle':
def rotate(self, theta: float) -> 'Circle': # noqa: ARG002 (theta unused)
return self
def mirror(self, axis: int) -> 'Circle':
def mirror(self, axis: int = 0) -> 'Circle':
self.offset[axis - 1] *= -1
return self
@ -129,14 +130,12 @@ class Circle(Shape, metaclass=AutoSlots):
self.radius *= c
return self
def normalized_form(self, norm_value) -> normalized_shape_tuple:
def normalized_form(self, norm_value: float) -> normalized_shape_tuple:
rotation = 0.0
magnitude = self.radius / norm_value
return ((type(self), self.layer),
(self.offset, magnitude, rotation, False, self.dose),
lambda: Circle(radius=norm_value, layer=self.layer))
return ((type(self),),
(self.offset, magnitude, rotation, False),
lambda: Circle(radius=norm_value))
def __repr__(self) -> str:
dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<Circle l{self.layer} o{self.offset} r{self.radius:g}{dose}{locked}>'
return f'<Circle o{self.offset} r{self.radius:g}>'

masque/shapes/ellipse.py

@ -1,25 +1,29 @@
from typing import List, Dict, Sequence, Optional, Any
from typing import Any, Self, cast
import copy
import math
import functools
import numpy
from numpy import pi
from numpy.typing import ArrayLike, NDArray
from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_POINTS
from .. import PatternError
from . import Shape, Polygon, normalized_shape_tuple, DEFAULT_POLY_NUM_VERTICES
from ..error import PatternError
from ..repetition import Repetition
from ..utils import is_scalar, rotation_matrix_2d, layer_t, AutoSlots, annotations_t
from ..traits import LockableImpl
from ..utils import is_scalar, rotation_matrix_2d, annotations_t, annotations_lt, annotations_eq, rep2key
class Ellipse(Shape, metaclass=AutoSlots):
@functools.total_ordering
class Ellipse(Shape):
"""
An ellipse, which has a position, two radii, and a rotation.
The rotation gives the angle from x-axis, counterclockwise, to the first (x) radius.
"""
__slots__ = ('_radii', '_rotation',
'poly_num_points', 'poly_max_arclen')
__slots__ = (
'_radii', '_rotation',
# Inherited
'_offset', '_repetition', '_annotations',
)
_radii: NDArray[numpy.float64]
""" Ellipse radii """
@ -27,15 +31,9 @@ class Ellipse(Shape, metaclass=AutoSlots):
_rotation: float
""" Angle from x-axis to first radius (ccw, radians) """
poly_num_points: Optional[int]
""" Sets the default number of points for `.polygonize()` """
poly_max_arclen: Optional[float]
""" Sets the default max segement length for `.polygonize()` """
# radius properties
@property
def radii(self) -> Any: #TODO mypy#3004 NDArray[numpy.float64]:
def radii(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
"""
Return the radii `[rx, ry]`
"""
@ -92,64 +90,67 @@ class Ellipse(Shape, metaclass=AutoSlots):
self,
radii: ArrayLike,
*,
poly_num_points: Optional[int] = DEFAULT_POLY_NUM_POINTS,
poly_max_arclen: Optional[float] = None,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0,
mirrored: Sequence[bool] = (False, False),
layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
) -> None:
LockableImpl.unlock(self)
self.identifier = ()
if raw:
assert(isinstance(radii, numpy.ndarray))
assert(isinstance(offset, numpy.ndarray))
assert isinstance(radii, numpy.ndarray)
assert isinstance(offset, numpy.ndarray)
self._radii = radii
self._offset = offset
self._rotation = rotation
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
self._layer = layer
self._dose = dose
else:
self.radii = radii
self.offset = offset
self.rotation = rotation
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.layer = layer
self.dose = dose
[self.mirror(a) for a, do in enumerate(mirrored) if do]
self.poly_num_points = poly_num_points
self.poly_max_arclen = poly_max_arclen
self.set_locked(locked)
def __deepcopy__(self, memo: Dict = None) -> 'Ellipse':
def __deepcopy__(self, memo: dict | None = None) -> Self:
memo = {} if memo is None else memo
new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy()
new._radii = self._radii.copy()
new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and numpy.array_equal(self.radii, other.radii)
and self.rotation == other.rotation
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Ellipse, other)
if not numpy.array_equal(self.radii, other.radii):
return tuple(self.radii) < tuple(other.radii)
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.rotation != other.rotation:
return self.rotation < other.rotation
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def to_polygons(
self,
poly_num_points: Optional[int] = None,
poly_max_arclen: Optional[float] = None,
) -> List[Polygon]:
if poly_num_points is None:
poly_num_points = self.poly_num_points
if poly_max_arclen is None:
poly_max_arclen = self.poly_max_arclen
if (poly_num_points is None) and (poly_max_arclen is None):
num_vertices: int | None = DEFAULT_POLY_NUM_VERTICES,
max_arclen: float | None = None,
) -> list[Polygon]:
if (num_vertices is None) and (max_arclen is None):
raise PatternError('Number of points and arclength left unspecified'
' (default was also overridden)')
@ -162,37 +163,37 @@ class Ellipse(Shape, metaclass=AutoSlots):
perimeter = pi * (r1 + r0) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))
n = []
if poly_num_points is not None:
n += [poly_num_points]
if poly_max_arclen is not None:
n += [perimeter / poly_max_arclen]
num_points = int(round(max(n)))
thetas = numpy.linspace(2 * pi, 0, num_points, endpoint=False)
if num_vertices is not None:
n += [num_vertices]
if max_arclen is not None:
n += [perimeter / max_arclen]
num_vertices = int(round(max(n)))
thetas = numpy.linspace(2 * pi, 0, num_vertices, endpoint=False)
sin_th, cos_th = (numpy.sin(thetas), numpy.cos(thetas))
xs = r0 * cos_th
ys = r1 * sin_th
xys = numpy.vstack((xs, ys)).T
poly = Polygon(xys, dose=self.dose, layer=self.layer, offset=self.offset, rotation=self.rotation)
poly = Polygon(xys, offset=self.offset, rotation=self.rotation)
return [poly]
def get_bounds(self) -> NDArray[numpy.float64]:
def get_bounds_single(self) -> NDArray[numpy.float64]:
rot_radii = numpy.dot(rotation_matrix_2d(self.rotation), self.radii)
return numpy.vstack((self.offset - rot_radii[0],
self.offset + rot_radii[1]))
def rotate(self, theta: float) -> 'Ellipse':
def rotate(self, theta: float) -> Self:
self.rotation += theta
return self
def mirror(self, axis: int) -> 'Ellipse':
def mirror(self, axis: int = 0) -> Self:
self.offset[axis - 1] *= -1
self.rotation *= -1
self.rotation += axis * pi
return self
def scale_by(self, c: float) -> 'Ellipse':
def scale_by(self, c: float) -> Self:
self.radii *= c
return self
@ -205,22 +206,10 @@ class Ellipse(Shape, metaclass=AutoSlots):
radii = self.radii[::-1] / self.radius_y
scale = self.radius_y
angle = (self.rotation + pi / 2) % pi
return ((type(self), radii, self.layer),
(self.offset, scale / norm_value, angle, False, self.dose),
lambda: Ellipse(radii=radii * norm_value, layer=self.layer))
def lock(self) -> 'Ellipse':
self.radii.flags.writeable = False
Shape.lock(self)
return self
def unlock(self) -> 'Ellipse':
Shape.unlock(self)
self.radii.flags.writeable = True
return self
return ((type(self), radii),
(self.offset, scale / norm_value, angle, False),
lambda: Ellipse(radii=radii * norm_value))
def __repr__(self) -> str:
rotation = f' r{self.rotation*180/pi:g}' if self.rotation != 0 else ''
dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<Ellipse l{self.layer} o{self.offset} r{self.radii}{rotation}{dose}{locked}>'
rotation = f' r{numpy.rad2deg(self.rotation):g}' if self.rotation != 0 else ''
return f'<Ellipse o{self.offset} r{self.radii}{rotation}>'


@ -1,5 +1,7 @@
from typing import List, Tuple, Dict, Optional, Sequence, Any
from typing import Any, cast
from collections.abc import Sequence
import copy
import functools
from enum import Enum
import numpy
@ -7,13 +9,13 @@ from numpy import pi, inf
from numpy.typing import NDArray, ArrayLike
from . import Shape, normalized_shape_tuple, Polygon, Circle
from .. import PatternError
from ..error import PatternError
from ..repetition import Repetition
from ..utils import is_scalar, rotation_matrix_2d, layer_t, AutoSlots
from ..utils import is_scalar, rotation_matrix_2d, annotations_lt, annotations_eq, rep2key
from ..utils import remove_colinear_vertices, remove_duplicate_vertices, annotations_t
from ..traits import LockableImpl
@functools.total_ordering
class PathCap(Enum):
Flush = 0 # Path ends at final vertices
Circle = 1 # Path extends past final vertices with a semicircle of radius width/2
@ -21,19 +23,29 @@ class PathCap(Enum):
SquareCustom = 4 # Path extends past final vertices with a rectangle of length
# # defined by path.cap_extensions
def __lt__(self, other: Any) -> bool:
return self.value < other.value
class Path(Shape, metaclass=AutoSlots):
@functools.total_ordering
class Path(Shape):
"""
A path, consisting of a bunch of vertices (Nx2 ndarray), a width, an end-cap shape,
and an offset.
Note that the setter for `Path.vertices` will create a copy of the passed vertex coordinates.
A normalized_form(...) is available, but can be quite slow with lots of vertices.
"""
__slots__ = ('_vertices', '_width', '_cap', '_cap_extensions')
__slots__ = (
'_vertices', '_width', '_cap', '_cap_extensions',
# Inherited
'_offset', '_repetition', '_annotations',
)
_vertices: NDArray[numpy.float64]
_width: float
_cap: PathCap
_cap_extensions: Optional[NDArray[numpy.float64]]
_cap_extensions: NDArray[numpy.float64] | None
Cap = PathCap
@ -58,12 +70,14 @@ class Path(Shape, metaclass=AutoSlots):
def cap(self) -> PathCap:
"""
Path end-cap
Note that `cap_extensions` will be reset to default values if
`cap` is changed away from `PathCap.SquareCustom`.
"""
return self._cap
@cap.setter
def cap(self, val: PathCap) -> None:
# TODO: Document that setting cap can change cap_extensions
self._cap = PathCap(val)
if self.cap != PathCap.SquareCustom:
self.cap_extensions = None
@ -73,38 +87,43 @@ class Path(Shape, metaclass=AutoSlots):
# cap_extensions property
@property
def cap_extensions(self) -> Optional[Any]: #TODO mypy#3004 NDArray[numpy.float64]]:
def cap_extensions(self) -> Any | None: # mypy#3004 NDArray[numpy.float64]]:
"""
Path end-cap extension
Note that `cap_extensions` will be reset to default values if
`cap` is changed away from `PathCap.SquareCustom`.
Returns:
2-element ndarray or `None`
"""
return self._cap_extensions
@cap_extensions.setter
def cap_extensions(self, vals: Optional[ArrayLike]) -> None:
def cap_extensions(self, vals: ArrayLike | None) -> None:
custom_caps = (PathCap.SquareCustom,)
if self.cap in custom_caps:
if vals is None:
raise Exception('Tried to set cap extensions to None on path with custom cap type')
raise PatternError('Tried to set cap extensions to None on path with custom cap type')
self._cap_extensions = numpy.array(vals, dtype=float)
else:
if vals is not None:
raise Exception('Tried to set custom cap extensions on path with non-custom cap type')
raise PatternError('Tried to set custom cap extensions on path with non-custom cap type')
self._cap_extensions = vals
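A minimal sketch of the cap / cap_extensions interplay described above (illustrative only; the import path `masque.shapes` is assumed):

    from masque.shapes import Path, PathCap

    p = Path(vertices=[[0, 0], [10, 0]], width=2.0,
             cap=PathCap.SquareCustom, cap_extensions=(1.0, 2.5))
    p.cap = PathCap.Flush          # switching away from SquareCustom clears the extensions
    assert p.cap_extensions is None
    # p.cap_extensions = (1.0, 2.5)   # would now raise PatternError (non-custom cap)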
# vertices property
@property
def vertices(self) -> Any: #TODO mypy#3004 NDArray[numpy.float64]]:
def vertices(self) -> Any: # mypy#3004 NDArray[numpy.float64]]:
"""
Vertices of the path (Nx2 ndarray: `[[x0, y0], [x1, y1], ...]`)
Vertices of the path (Nx2 ndarray: `[[x0, y0], [x1, y1], ...]`)
When setting, note that a copy of the provided vertices will be made.
"""
return self._vertices
@vertices.setter
def vertices(self, val: ArrayLike) -> None:
val = numpy.array(val, dtype=float) # TODO document that these might not be copied
val = numpy.array(val, dtype=float)
if len(val.shape) < 2 or val.shape[1] != 2:
raise PatternError('Vertices must be an Nx2 array')
if val.shape[0] < 2:
@ -147,31 +166,23 @@ class Path(Shape, metaclass=AutoSlots):
width: float = 0.0,
*,
cap: PathCap = PathCap.Flush,
cap_extensions: Optional[ArrayLike] = None,
cap_extensions: ArrayLike | None = None,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0,
mirrored: Sequence[bool] = (False, False),
layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
) -> None:
LockableImpl.unlock(self)
self._cap_extensions = None # Since .cap setter might access it
self.identifier = ()
if raw:
assert(isinstance(vertices, numpy.ndarray))
assert(isinstance(offset, numpy.ndarray))
assert(isinstance(cap_extensions, numpy.ndarray) or cap_extensions is None)
assert isinstance(vertices, numpy.ndarray)
assert isinstance(offset, numpy.ndarray)
assert isinstance(cap_extensions, numpy.ndarray) or cap_extensions is None
self._vertices = vertices
self._offset = offset
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
self._layer = layer
self._dose = dose
self._width = width
self._cap = cap
self._cap_extensions = cap_extensions
@ -180,38 +191,63 @@ class Path(Shape, metaclass=AutoSlots):
self.offset = offset
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.layer = layer
self.dose = dose
self.width = width
self.cap = cap
self.cap_extensions = cap_extensions
self.rotate(rotation)
[self.mirror(a) for a, do in enumerate(mirrored) if do]
self.set_locked(locked)
def __deepcopy__(self, memo: Dict = None) -> 'Path':
def __deepcopy__(self, memo: dict | None = None) -> 'Path':
memo = {} if memo is None else memo
new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy()
new._vertices = self._vertices.copy()
new._cap = copy.deepcopy(self._cap, memo)
new._cap_extensions = copy.deepcopy(self._cap_extensions, memo)
new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and numpy.array_equal(self.vertices, other.vertices)
and self.width == other.width
and self.cap == other.cap
and numpy.array_equal(self.cap_extensions, other.cap_extensions) # type: ignore
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Path, other)
if self.width != other.width:
return self.width < other.width
if self.cap != other.cap:
return self.cap < other.cap
if not numpy.array_equal(self.cap_extensions, other.cap_extensions): # type: ignore
if other.cap_extensions is None:
return False
if self.cap_extensions is None:
return True
return tuple(self.cap_extensions) < tuple(other.cap_extensions)
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
@staticmethod
def travel(
travel_pairs: Sequence[Tuple[float, float]],
travel_pairs: Sequence[tuple[float, float]],
width: float = 0.0,
cap: PathCap = PathCap.Flush,
cap_extensions: Optional[Tuple[float, float]] = None,
cap_extensions: tuple[float, float] | None = None,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0,
mirrored: Sequence[bool] = (False, False),
layer: layer_t = 0,
dose: float = 1.0,
) -> 'Path':
"""
Build a path by specifying the turn angles and travel distances
@ -228,16 +264,11 @@ class Path(Shape, metaclass=AutoSlots):
Default `(0, 0)` or `None`, depending on cap type
offset: Offset, default `(0, 0)`
rotation: Rotation counterclockwise, in radians. Default `0`
mirrored: Whether to mirror across the x or y axes. For example,
`mirrored=(True, False)` results in a reflection across the x-axis,
multiplying the path's y-coordinates by -1. Default `(False, False)`
layer: Layer, default `0`
dose: Dose, default `1.0`
Returns:
The resulting Path object
"""
#TODO: needs testing
# TODO: Path.travel() needs testing
direction = numpy.array([1, 0])
verts = [numpy.zeros(2)]
@ -246,14 +277,13 @@ class Path(Shape, metaclass=AutoSlots):
verts.append(verts[-1] + direction * distance)
return Path(vertices=verts, width=width, cap=cap, cap_extensions=cap_extensions,
offset=offset, rotation=rotation, mirrored=mirrored,
layer=layer, dose=dose)
offset=offset, rotation=rotation)
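A usage sketch for `Path.travel()` (the method is marked as needing tests above; each pair is assumed to be `(turn_angle_radians, travel_distance)`, and the import path is assumed):

    from numpy import pi
    from masque.shapes import Path

    # Head +x for 10 units, turn left 90 degrees and go 5, then turn right 90 degrees and go 10.
    p = Path.travel([(0, 10), (pi / 2, 5), (-pi / 2, 10)], width=2.0)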
def to_polygons(
self,
poly_num_points: int = None,
poly_max_arclen: float = None,
) -> List['Polygon']:
num_vertices: int | None = None,
max_arclen: float | None = None,
) -> list['Polygon']:
extensions = self._calculate_cap_extensions()
v = remove_colinear_vertices(self.vertices, closed_path=False)
@ -262,7 +292,7 @@ class Path(Shape, metaclass=AutoSlots):
if self.width == 0:
verts = numpy.vstack((v, v[::-1]))
return [Polygon(offset=self.offset, vertices=verts, dose=self.dose, layer=self.layer)]
return [Polygon(offset=self.offset, vertices=verts)]
perp = dvdir[:, ::-1] * [[1, -1]] * self.width / 2
@ -313,31 +343,33 @@ class Path(Shape, metaclass=AutoSlots):
o1.append(v[-1] - perp[-1])
verts = numpy.vstack((o0, o1[::-1]))
polys = [Polygon(offset=self.offset, vertices=verts, dose=self.dose, layer=self.layer)]
polys = [Polygon(offset=self.offset, vertices=verts)]
if self.cap == PathCap.Circle:
#for vert in v: # not sure if every vertex, or just ends?
for vert in [v[0], v[-1]]:
circ = Circle(offset=vert, radius=self.width / 2, dose=self.dose, layer=self.layer)
polys += circ.to_polygons(poly_num_points=poly_num_points, poly_max_arclen=poly_max_arclen)
circ = Circle(offset=vert, radius=self.width / 2)
polys += circ.to_polygons(num_vertices=num_vertices, max_arclen=max_arclen)
return polys
def get_bounds(self) -> NDArray[numpy.float64]:
def get_bounds_single(self) -> NDArray[numpy.float64]:
if self.cap == PathCap.Circle:
bounds = self.offset + numpy.vstack((numpy.min(self.vertices, axis=0) - self.width / 2,
numpy.max(self.vertices, axis=0) + self.width / 2))
elif self.cap in (PathCap.Flush,
elif self.cap in (
PathCap.Flush,
PathCap.Square,
PathCap.SquareCustom):
PathCap.SquareCustom,
):
bounds = numpy.array([[+inf, +inf], [-inf, -inf]])
polys = self.to_polygons()
for poly in polys:
poly_bounds = poly.get_bounds_nonempty()
poly_bounds = poly.get_bounds_single_nonempty()
bounds[0, :] = numpy.minimum(bounds[0, :], poly_bounds[0, :])
bounds[1, :] = numpy.maximum(bounds[1, :], poly_bounds[1, :])
else:
raise PatternError(f'get_bounds() not implemented for endcaps: {self.cap}')
raise PatternError(f'get_bounds_single() not implemented for endcaps: {self.cap}')
return bounds
@ -346,7 +378,7 @@ class Path(Shape, metaclass=AutoSlots):
self.vertices = numpy.dot(rotation_matrix_2d(theta), self.vertices.T).T
return self
def mirror(self, axis: int) -> 'Path':
def mirror(self, axis: int = 0) -> 'Path':
self.vertices[:, axis - 1] *= -1
return self
@ -373,15 +405,18 @@ class Path(Shape, metaclass=AutoSlots):
x_min = rotated_vertices[:, 0].argmin()
if not is_scalar(x_min):
y_min = rotated_vertices[x_min, 1].argmin()
x_min = x_min[y_min]
x_min = cast(Sequence, x_min)[y_min]
reordered_vertices = numpy.roll(rotated_vertices, -x_min, axis=0)
width0 = self.width / norm_value
return ((type(self), reordered_vertices.data.tobytes(), width0, self.cap, self.layer),
(offset, scale / norm_value, rotation, False, self.dose),
lambda: Path(reordered_vertices * norm_value, width=self.width * norm_value,
cap=self.cap, layer=self.layer))
return ((type(self), reordered_vertices.data.tobytes(), width0, self.cap),
(offset, scale / norm_value, rotation, False),
lambda: Path(
reordered_vertices * norm_value,
width=self.width * norm_value,
cap=self.cap,
))
def clean_vertices(self) -> 'Path':
"""
@ -394,22 +429,22 @@ class Path(Shape, metaclass=AutoSlots):
return self
def remove_duplicate_vertices(self) -> 'Path':
'''
"""
Removes all consecutive duplicate (repeated) vertices.
Returns:
self
'''
"""
self.vertices = remove_duplicate_vertices(self.vertices, closed_path=False)
return self
def remove_colinear_vertices(self) -> 'Path':
'''
"""
Removes consecutive co-linear vertices.
Returns:
self
'''
"""
self.vertices = remove_colinear_vertices(self.vertices, closed_path=False)
return self
@ -417,29 +452,13 @@ class Path(Shape, metaclass=AutoSlots):
if self.cap == PathCap.Square:
extensions = numpy.full(2, self.width / 2)
elif self.cap == PathCap.SquareCustom:
assert(isinstance(self.cap_extensions, numpy.ndarray))
assert isinstance(self.cap_extensions, numpy.ndarray)
extensions = self.cap_extensions
else:
# Flush or Circle
extensions = numpy.zeros(2)
return extensions
def lock(self) -> 'Path':
self.vertices.flags.writeable = False
if self.cap_extensions is not None:
self.cap_extensions.flags.writeable = False
Shape.lock(self)
return self
def unlock(self) -> 'Path':
Shape.unlock(self)
self.vertices.flags.writeable = True
if self.cap_extensions is not None:
self.cap_extensions.flags.writeable = True
return self
def __repr__(self) -> str:
centroid = self.offset + self.vertices.mean(axis=0)
dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<Path l{self.layer} centroid {centroid} v{len(self.vertices)} w{self.width} c{self.cap}{dose}{locked}>'
return f'<Path centroid {centroid} v{len(self.vertices)} w{self.width} c{self.cap}>'


@ -1,41 +1,52 @@
from typing import List, Dict, Optional, Sequence, Any
from typing import Any, cast
from collections.abc import Sequence
import copy
import functools
import numpy
from numpy import pi
from numpy.typing import NDArray, ArrayLike
from . import Shape, normalized_shape_tuple
from .. import PatternError
from ..error import PatternError
from ..repetition import Repetition
from ..utils import is_scalar, rotation_matrix_2d, layer_t, AutoSlots
from ..utils import is_scalar, rotation_matrix_2d, annotations_lt, annotations_eq, rep2key
from ..utils import remove_colinear_vertices, remove_duplicate_vertices, annotations_t
from ..traits import LockableImpl
class Polygon(Shape, metaclass=AutoSlots):
@functools.total_ordering
class Polygon(Shape):
"""
A polygon, consisting of a bunch of vertices (Nx2 ndarray) which specify an
implicitly-closed boundary, and an offset.
Note that the setter for `Polygon.vertices` creates a copy of the
passed vertex coordinates.
A `normalized_form(...)` is available, but can be quite slow with lots of vertices.
"""
__slots__ = ('_vertices',)
__slots__ = (
'_vertices',
# Inherited
'_offset', '_repetition', '_annotations',
)
_vertices: NDArray[numpy.float64]
""" Nx2 ndarray of vertices `[[x0, y0], [x1, y1], ...]` """
# vertices property
@property
def vertices(self) -> Any: #TODO mypy#3004 NDArray[numpy.float64]:
def vertices(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
"""
Vertices of the polygon (Nx2 ndarray: `[[x0, y0], [x1, y1], ...]`)
When setting, note that a copy of the provided vertices will be made.
"""
return self._vertices
@vertices.setter
def vertices(self, val: ArrayLike) -> None:
val = numpy.array(val, dtype=float) # TODO document that these might not be copied
val = numpy.array(val, dtype=float)
if len(val.shape) < 2 or val.shape[1] != 2:
raise PatternError('Vertices must be an Nx2 array')
if val.shape[0] < 3:
@ -78,55 +89,68 @@ class Polygon(Shape, metaclass=AutoSlots):
*,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0,
mirrored: Sequence[bool] = (False, False),
layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
) -> None:
LockableImpl.unlock(self)
self.identifier = ()
if raw:
assert(isinstance(vertices, numpy.ndarray))
assert(isinstance(offset, numpy.ndarray))
assert isinstance(vertices, numpy.ndarray)
assert isinstance(offset, numpy.ndarray)
self._vertices = vertices
self._offset = offset
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
self._layer = layer
self._dose = dose
else:
self.vertices = vertices
self.offset = offset
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.layer = layer
self.dose = dose
self.rotate(rotation)
[self.mirror(a) for a, do in enumerate(mirrored) if do]
self.set_locked(locked)
def __deepcopy__(self, memo: Optional[Dict] = None) -> 'Polygon':
def __deepcopy__(self, memo: dict | None = None) -> 'Polygon':
memo = {} if memo is None else memo
new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy()
new._vertices = self._vertices.copy()
new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and numpy.array_equal(self.vertices, other.vertices)
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Polygon, other)
if not numpy.array_equal(self.vertices, other.vertices):
min_len = min(self.vertices.shape[0], other.vertices.shape[0])
eq_mask = self.vertices[:min_len] != other.vertices[:min_len]
eq_lt = self.vertices[:min_len] < other.vertices[:min_len]
eq_lt_masked = eq_lt[eq_mask]
if eq_lt_masked.size > 0:
return eq_lt_masked.flat[0]
return self.vertices.shape[0] < other.vertices.shape[0]
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
@staticmethod
def square(
side_length: float,
*,
rotation: float = 0.0,
offset: ArrayLike = (0.0, 0.0),
layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
repetition: Repetition | None = None,
) -> 'Polygon':
"""
Draw a square given side_length, centered on the origin.
@ -135,8 +159,6 @@ class Polygon(Shape, metaclass=AutoSlots):
side_length: Length of one side
rotation: Rotation counterclockwise, in radians
offset: Offset, default `(0, 0)`
layer: Layer, default `0`
dose: Dose, default `1.0`
repetition: `Repetition` object, default `None`
Returns:
@ -147,8 +169,7 @@ class Polygon(Shape, metaclass=AutoSlots):
[+1, +1],
[+1, -1]], dtype=float)
vertices = 0.5 * side_length * norm_square
poly = Polygon(vertices, offset=offset, layer=layer, dose=dose,
repetition=repetition)
poly = Polygon(vertices, offset=offset, repetition=repetition)
poly.rotate(rotation)
return poly
@ -159,9 +180,7 @@ class Polygon(Shape, metaclass=AutoSlots):
*,
rotation: float = 0,
offset: ArrayLike = (0.0, 0.0),
layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
repetition: Repetition | None = None,
) -> 'Polygon':
"""
Draw a rectangle with side lengths lx and ly, centered on the origin.
@ -171,8 +190,6 @@ class Polygon(Shape, metaclass=AutoSlots):
ly: Length along y (before rotation)
rotation: Rotation counterclockwise, in radians
offset: Offset, default `(0, 0)`
layer: Layer, default `0`
dose: Dose, default `1.0`
repetition: `Repetition` object, default `None`
Returns:
@ -182,25 +199,22 @@ class Polygon(Shape, metaclass=AutoSlots):
[-lx, +ly],
[+lx, +ly],
[+lx, -ly]], dtype=float)
poly = Polygon(vertices, offset=offset, layer=layer, dose=dose,
repetition=repetition)
poly = Polygon(vertices, offset=offset, repetition=repetition)
poly.rotate(rotation)
return poly
@staticmethod
def rect(
*,
xmin: Optional[float] = None,
xctr: Optional[float] = None,
xmax: Optional[float] = None,
lx: Optional[float] = None,
ymin: Optional[float] = None,
yctr: Optional[float] = None,
ymax: Optional[float] = None,
ly: Optional[float] = None,
layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
xmin: float | None = None,
xctr: float | None = None,
xmax: float | None = None,
lx: float | None = None,
ymin: float | None = None,
yctr: float | None = None,
ymax: float | None = None,
ly: float | None = None,
repetition: Repetition | None = None,
) -> 'Polygon':
"""
Draw a rectangle by specifying side/center positions.
@ -217,8 +231,6 @@ class Polygon(Shape, metaclass=AutoSlots):
yctr: Center y coordinate
ymax: Maximum y coordinate
ly: Length along y direction
layer: Layer, default `0`
dose: Dose, default `1.0`
repetition: `Repetition` object, default `None`
Returns:
@ -226,79 +238,76 @@ class Polygon(Shape, metaclass=AutoSlots):
"""
if lx is None:
if xctr is None:
assert(xmin is not None)
assert(xmax is not None)
assert xmin is not None
assert xmax is not None
xctr = 0.5 * (xmax + xmin)
lx = xmax - xmin
elif xmax is None:
assert(xmin is not None)
assert(xctr is not None)
assert xmin is not None
assert xctr is not None
lx = 2 * (xctr - xmin)
elif xmin is None:
assert(xctr is not None)
assert(xmax is not None)
assert xctr is not None
assert xmax is not None
lx = 2 * (xmax - xctr)
else:
raise PatternError('Two of xmin, xctr, xmax, lx must be None!')
else:
else: # noqa: PLR5501
if xctr is not None:
pass
elif xmax is None:
assert(xmin is not None)
assert(lx is not None)
assert xmin is not None
assert lx is not None
xctr = xmin + 0.5 * lx
elif xmin is None:
assert(xmax is not None)
assert(lx is not None)
assert xmax is not None
assert lx is not None
xctr = xmax - 0.5 * lx
else:
raise PatternError('Two of xmin, xctr, xmax, lx must be None!')
if ly is None:
if yctr is None:
assert(ymin is not None)
assert(ymax is not None)
assert ymin is not None
assert ymax is not None
yctr = 0.5 * (ymax + ymin)
ly = ymax - ymin
elif ymax is None:
assert(ymin is not None)
assert(yctr is not None)
assert ymin is not None
assert yctr is not None
ly = 2 * (yctr - ymin)
elif ymin is None:
assert(yctr is not None)
assert(ymax is not None)
assert yctr is not None
assert ymax is not None
ly = 2 * (ymax - yctr)
else:
raise PatternError('Two of ymin, yctr, ymax, ly must be None!')
else:
else: # noqa: PLR5501
if yctr is not None:
pass
elif ymax is None:
assert(ymin is not None)
assert(ly is not None)
assert ymin is not None
assert ly is not None
yctr = ymin + 0.5 * ly
elif ymin is None:
assert(ly is not None)
assert(ymax is not None)
assert ly is not None
assert ymax is not None
yctr = ymax - 0.5 * ly
else:
raise PatternError('Two of ymin, yctr, ymax, ly must be None!')
poly = Polygon.rectangle(lx, ly, offset=(xctr, yctr),
layer=layer, dose=dose, repetition=repetition)
poly = Polygon.rectangle(lx, ly, offset=(xctr, yctr), repetition=repetition)
return poly
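As the assertions above show, `Polygon.rect()` wants exactly two of `(xmin, xctr, xmax, lx)` and two of `(ymin, yctr, ymax, ly)`; the rest are derived. A quick sketch (import path assumed):

    from masque.shapes import Polygon

    p1 = Polygon.rect(xmin=0, xmax=10, ymin=0, ly=4)   # x from its limits, y from min + length
    p2 = Polygon.rect(xctr=5, lx=10, yctr=2, ly=4)     # the same rectangle, centered form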
@staticmethod
def octagon(
*,
side_length: Optional[float] = None,
inner_radius: Optional[float] = None,
side_length: float | None = None,
inner_radius: float | None = None,
regular: bool = True,
center: ArrayLike = (0.0, 0.0),
rotation: float = 0.0,
layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
repetition: Repetition | None = None,
) -> 'Polygon':
"""
Draw an octagon given one of (side length, inradius, circumradius).
@ -316,17 +325,12 @@ class Polygon(Shape, metaclass=AutoSlots):
rotation: Rotation counterclockwise, in radians.
`0` results in four axis-aligned sides (the long sides of the
irregular octagon).
layer: Layer, default `0`
dose: Dose, default `1.0`
repetition: `Repetition` object, default `None`
Returns:
A Polygon object containing the requested octagon
"""
if regular:
s = 1 + numpy.sqrt(2)
else:
s = 2
s = (1 + numpy.sqrt(2)) if regular else 2
norm_oct = numpy.array([
[-1, -s],
@ -344,19 +348,18 @@ class Polygon(Shape, metaclass=AutoSlots):
side_length = 2 * inner_radius / s
vertices = 0.5 * side_length * norm_oct
poly = Polygon(vertices, offset=center, layer=layer, dose=dose, repetition=repetition)
poly = Polygon(vertices, offset=center, repetition=repetition)
poly.rotate(rotation)
return poly
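A brief sketch of `Polygon.octagon()`, which accepts either `side_length` or `inner_radius` (import path assumed):

    from numpy import pi
    from masque.shapes import Polygon

    oct_a = Polygon.octagon(inner_radius=5, regular=True)     # regular octagon, inradius 5
    oct_b = Polygon.octagon(side_length=2, rotation=pi / 8)   # rotated by 22.5 degrees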
def to_polygons(
self,
poly_num_points: int = None, # unused
poly_max_arclen: float = None, # unused
) -> List['Polygon']:
num_vertices: int | None = None, # unused # noqa: ARG002
max_arclen: float | None = None, # unused # noqa: ARG002
) -> list['Polygon']:
return [copy.deepcopy(self)]
def get_bounds(self) -> NDArray[numpy.float64]:
def get_bounds_single(self) -> NDArray[numpy.float64]: # TODO note shape get_bounds doesn't include repetition
return numpy.vstack((self.offset + numpy.min(self.vertices, axis=0),
self.offset + numpy.max(self.vertices, axis=0)))
@ -365,7 +368,7 @@ class Polygon(Shape, metaclass=AutoSlots):
self.vertices = numpy.dot(rotation_matrix_2d(theta), self.vertices.T).T
return self
def mirror(self, axis: int) -> 'Polygon':
def mirror(self, axis: int = 0) -> 'Polygon':
self.vertices[:, axis - 1] *= -1
return self
@ -376,8 +379,9 @@ class Polygon(Shape, metaclass=AutoSlots):
def normalized_form(self, norm_value: float) -> normalized_shape_tuple:
# Note: this function is going to be pretty slow for many-vertexed polygons, relative to
# other shapes
offset = self.vertices.mean(axis=0) + self.offset
zeroed_vertices = self.vertices - offset
meanv = self.vertices.mean(axis=0)
zeroed_vertices = self.vertices - meanv
offset = meanv + self.offset
scale = zeroed_vertices.std()
normed_vertices = zeroed_vertices / scale
@ -391,14 +395,14 @@ class Polygon(Shape, metaclass=AutoSlots):
x_min = rotated_vertices[:, 0].argmin()
if not is_scalar(x_min):
y_min = rotated_vertices[x_min, 1].argmin()
x_min = x_min[y_min]
x_min = cast(Sequence, x_min)[y_min]
reordered_vertices = numpy.roll(rotated_vertices, -x_min, axis=0)
# TODO: normalize mirroring?
return ((type(self), reordered_vertices.data.tobytes(), self.layer),
(offset, scale / norm_value, rotation, False, self.dose),
lambda: Polygon(reordered_vertices * norm_value, layer=self.layer))
return ((type(self), reordered_vertices.data.tobytes()),
(offset, scale / norm_value, rotation, False),
lambda: Polygon(reordered_vertices * norm_value))
def clean_vertices(self) -> 'Polygon':
"""
@ -411,37 +415,25 @@ class Polygon(Shape, metaclass=AutoSlots):
return self
def remove_duplicate_vertices(self) -> 'Polygon':
'''
"""
Removes all consecutive duplicate (repeated) vertices.
Returns:
self
'''
"""
self.vertices = remove_duplicate_vertices(self.vertices, closed_path=True)
return self
def remove_colinear_vertices(self) -> 'Polygon':
'''
"""
Removes consecutive co-linear vertices.
Returns:
self
'''
"""
self.vertices = remove_colinear_vertices(self.vertices, closed_path=True)
return self
def lock(self) -> 'Polygon':
self.vertices.flags.writeable = False
Shape.lock(self)
return self
def unlock(self) -> 'Polygon':
Shape.unlock(self)
self.vertices.flags.writeable = True
return self
def __repr__(self) -> str:
centroid = self.offset + self.vertices.mean(axis=0)
dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<Polygon l{self.layer} centroid {centroid} v{len(self.vertices)}{dose}{locked}>'
return f'<Polygon centroid {centroid} v{len(self.vertices)}>'


@ -1,57 +1,62 @@
from typing import List, Tuple, Callable, TypeVar, Optional, TYPE_CHECKING
from typing import TYPE_CHECKING, Any
from collections.abc import Callable
from abc import ABCMeta, abstractmethod
import numpy
from numpy.typing import NDArray, ArrayLike
from ..traits import (PositionableImpl, LayerableImpl, DoseableImpl,
from ..traits import (
Rotatable, Mirrorable, Copyable, Scalable,
PivotableImpl, LockableImpl, RepeatableImpl,
AnnotatableImpl)
PositionableImpl, PivotableImpl, RepeatableImpl, AnnotatableImpl,
)
if TYPE_CHECKING:
from . import Polygon
# Type definitions
normalized_shape_tuple = Tuple[Tuple,
Tuple[NDArray[numpy.float64], float, float, bool, float],
Callable[[], 'Shape']]
normalized_shape_tuple = tuple[
tuple,
tuple[NDArray[numpy.float64], float, float, bool],
Callable[[], 'Shape'],
]
# ## Module-wide defaults
# Default number of points per polygon for shapes
DEFAULT_POLY_NUM_POINTS = 24
DEFAULT_POLY_NUM_VERTICES = 24
T = TypeVar('T', bound='Shape')
class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable, Copyable, Scalable,
PivotableImpl, RepeatableImpl, LockableImpl, AnnotatableImpl, metaclass=ABCMeta):
class Shape(PositionableImpl, Rotatable, Mirrorable, Copyable, Scalable,
PivotableImpl, RepeatableImpl, AnnotatableImpl, metaclass=ABCMeta):
"""
Abstract class specifying functions common to all shapes.
Class specifying functions common to all shapes.
"""
__slots__ = () # Children should use AutoSlots
__slots__ = () # Children should use AutoSlots or set slots themselves
identifier: Tuple
""" An arbitrary identifier for the shape, usually empty but used by `Pattern.flatten()` """
#def __copy__(self) -> Self:
# cls = self.__class__
# new = cls.__new__(cls)
# for name in self.__slots__: # type: str
# object.__setattr__(new, name, getattr(self, name))
# return new
def __copy__(self) -> 'Shape':
cls = self.__class__
new = cls.__new__(cls)
for name in self.__slots__: # type: str
object.__setattr__(new, name, getattr(self, name))
return new
#
# Methods (abstract)
#
@abstractmethod
def __eq__(self, other: Any) -> bool:
pass
@abstractmethod
def __lt__(self, other: 'Shape') -> bool:
pass
'''
--- Abstract methods
'''
@abstractmethod
def to_polygons(
self,
num_vertices: Optional[int] = None,
max_arclen: Optional[float] = None,
) -> List['Polygon']:
num_vertices: int | None = None,
max_arclen: float | None = None,
) -> list['Polygon']:
"""
Returns a list of polygons which approximate the shape.
@ -68,9 +73,9 @@ class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable
pass
@abstractmethod
def normalized_form(self: T, norm_value: int) -> normalized_shape_tuple:
def normalized_form(self, norm_value: int) -> normalized_shape_tuple:
"""
Writes the shape in a standardized notation, with offset, scale, rotation, and dose
Writes the shape in a standardized notation, with offset, scale, and rotation
information separated out from the remaining values.
Args:
@ -85,20 +90,20 @@ class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable
`(intrinsic, extrinsic, constructor)`. These are further broken down as:
`intrinsic`: A tuple of basic types containing all information about the instance that
is not contained in 'extrinsic'. Usually, `intrinsic[0] == type(self)`.
`extrinsic`: `([x_offset, y_offset], scale, rotation, mirror_across_x_axis, dose)`
`extrinsic`: `([x_offset, y_offset], scale, rotation, mirror_across_x_axis)`
`constructor`: A callable (no arguments) which returns an instance of `type(self)` with
internal state equivalent to `intrinsic`.
"""
pass
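A hedged sketch of how the `(intrinsic, extrinsic, constructor)` tuple might be used to deduplicate shapes (illustrative only; `shapes` and the chosen `norm_value` are hypothetical):

    unique_shapes: dict = {}
    placements = []
    for shape in shapes:
        intrinsic, extrinsic, constructor = shape.normalized_form(norm_value=1000)
        if intrinsic not in unique_shapes:
            unique_shapes[intrinsic] = constructor()   # one canonical copy per distinct form
        placements.append((intrinsic, extrinsic))      # (which canonical shape, where/how it sits)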
'''
---- Non-abstract methods
'''
#
# Non-abstract methods
#
def manhattanize_fast(
self,
grid_x: ArrayLike,
grid_y: ArrayLike,
) -> List['Polygon']:
) -> list['Polygon']:
"""
Returns a list of polygons with grid-aligned ("Manhattan") edges approximating the shape.
@ -122,7 +127,7 @@ class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable
polygon_contours = []
for polygon in self.to_polygons():
bounds = polygon.get_bounds()
bounds = polygon.get_bounds_single()
if bounds is None:
continue
@ -130,7 +135,7 @@ class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable
vertex_lists = []
p_verts = polygon.vertices + polygon.offset
for v, v_next in zip(p_verts, numpy.roll(p_verts, -1, axis=0)):
for v, v_next in zip(p_verts, numpy.roll(p_verts, -1, axis=0), strict=True):
dv = v_next - v
# Find x-index bounds for the line # TODO: fix this and err_xmin/xmax for grids smaller than the line / shape
@ -160,7 +165,7 @@ class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable
m = dv[1] / dv[0]
def get_grid_inds(xes: ArrayLike) -> NDArray[numpy.float64]:
def get_grid_inds(xes: ArrayLike, m: float = m, v: NDArray = v) -> NDArray[numpy.float64]:
ys = m * (xes - v[0]) + v[1]
# (inds - 1) is the index of the y-grid line below the edge's intersection with the x-grid
@ -175,14 +180,14 @@ class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable
return inds
# Find the y indices on all x gridlines
xs = gx[gxi_min:gxi_max]
xs = gx[int(gxi_min):int(gxi_max)]
inds = get_grid_inds(xs)
# Find y-intersections for x-midpoints
xs2 = (xs[:-1] + xs[1:]) / 2
inds2 = get_grid_inds(xs2)
xinds = numpy.rint(numpy.arange(gxi_min, gxi_max - 0.99, 1 / 3), dtype=numpy.int64, casting='unsafe')
xinds = numpy.rint(numpy.arange(gxi_min, gxi_max - 0.99, 1 / 3)).astype(numpy.int64)
# interleave the results
yinds = xinds.copy()
@ -197,12 +202,7 @@ class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable
vertex_lists.append(vlist)
polygon_contours.append(numpy.vstack(vertex_lists))
manhattan_polygons = []
for contour in polygon_contours:
manhattan_polygons.append(Polygon(
vertices=contour,
layer=self.layer,
dose=self.dose))
manhattan_polygons = [Polygon(vertices=contour) for contour in polygon_contours]
return manhattan_polygons
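A minimal call sketch, assuming `shape` is any concrete `Shape` instance:

    import numpy

    grid = numpy.arange(-50, 51, 5)    # 5-unit grid lines in both directions
    manhattan_polys = shape.manhattanize_fast(grid_x=grid, grid_y=grid)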
@ -210,7 +210,7 @@ class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable
self,
grid_x: ArrayLike,
grid_y: ArrayLike,
) -> List['Polygon']:
) -> list['Polygon']:
"""
Returns a list of polygons with grid-aligned ("Manhattan") edges approximating the shape.
@ -259,18 +259,19 @@ class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable
polygon_contours = []
for polygon in self.to_polygons():
# Get rid of unused gridlines (anything not within 2 lines of the polygon bounds)
bounds = polygon.get_bounds()
bounds = polygon.get_bounds_single()
if bounds is None:
continue
mins, maxs = bounds
keep_x = numpy.logical_and(grx > mins[0], grx < maxs[0])
keep_y = numpy.logical_and(gry > mins[1], gry < maxs[1])
for k in (keep_x, keep_y):
for s in (1, 2):
k[s:] += k[:-s]
k[:-s] += k[s:]
k = k > 0
# Flood left & rightwards by 2 cells
for kk in (keep_x, keep_y):
for ss in (1, 2):
kk[ss:] += kk[:-ss]
kk[:-ss] += kk[ss:]
kk[:] = kk > 0
gx = grx[keep_x]
gy = gry[keep_y]
@ -293,23 +294,10 @@ class Shape(PositionableImpl, LayerableImpl, DoseableImpl, Rotatable, Mirrorable
for contour in contours:
# /2 deals with supersampling
# +.5 deals with the fact that our 0-edge becomes -.5 in the super-sampled contour output
snapped_contour = numpy.rint((contour + .5) / 2, dtype=numpy.int64, casting='unsafe')
snapped_contour = numpy.rint((contour + .5) / 2).astype(numpy.int64)
vertices = numpy.hstack((grx[snapped_contour[:, None, 0] + offset_i[0]],
gry[snapped_contour[:, None, 1] + offset_i[1]]))
manhattan_polygons.append(Polygon(
vertices=vertices,
layer=self.layer,
dose=self.dose))
manhattan_polygons.append(Polygon(vertices=vertices))
return manhattan_polygons
def lock(self: T) -> T:
PositionableImpl._lock(self)
LockableImpl.lock(self)
return self
def unlock(self: T) -> T:
LockableImpl.unlock(self)
PositionableImpl._unlock(self)
return self


@ -1,33 +1,37 @@
from typing import List, Tuple, Dict, Sequence, Optional, Any
from typing import Self, Any, cast
import copy
import functools
import numpy
from numpy import pi, inf
from numpy import pi, nan
from numpy.typing import NDArray, ArrayLike
from . import Shape, Polygon, normalized_shape_tuple
from .. import PatternError
from ..error import PatternError
from ..repetition import Repetition
from ..traits import RotatableImpl
from ..utils import is_scalar, get_bit, normalize_mirror, layer_t, AutoSlots
from ..utils import annotations_t
from ..traits import LockableImpl
from ..utils import is_scalar, get_bit, annotations_t, annotations_lt, annotations_eq, rep2key
# Loaded on use:
# from freetype import Face
# from matplotlib.path import Path
class Text(RotatableImpl, Shape, metaclass=AutoSlots):
@functools.total_ordering
class Text(RotatableImpl, Shape):
"""
Text (to be printed e.g. as a set of polygons).
This is distinct from non-printed Label objects.
"""
__slots__ = ('_string', '_height', '_mirrored', 'font_path')
__slots__ = (
'_string', '_height', '_mirrored', 'font_path',
# Inherited
'_offset', '_repetition', '_annotations', '_rotation',
)
_string: str
_height: float
_mirrored: NDArray[numpy.bool_]
_mirrored: bool
font_path: str
# vertices property
@ -50,16 +54,13 @@ class Text(RotatableImpl, Shape, metaclass=AutoSlots):
raise PatternError('Height must be a scalar')
self._height = val
# Mirrored property
@property
def mirrored(self) -> Any: #TODO mypy#3004 NDArray[numpy.bool_]:
def mirrored(self) -> bool: # mypy#3004, should be bool
return self._mirrored
@mirrored.setter
def mirrored(self, val: Sequence[bool]) -> None:
if is_scalar(val):
raise PatternError('Mirrored must be a 2-element list of booleans')
self._mirrored = numpy.array(val, dtype=bool, copy=True)
def mirrored(self, val: bool) -> None:
self._mirrored = bool(val)
def __init__(
self,
@ -69,56 +70,71 @@ class Text(RotatableImpl, Shape, metaclass=AutoSlots):
*,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0,
mirrored: ArrayLike = (False, False),
layer: layer_t = 0,
dose: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
repetition: Repetition | None = None,
annotations: annotations_t | None = None,
raw: bool = False,
) -> None:
LockableImpl.unlock(self)
self.identifier = ()
if raw:
assert(isinstance(offset, numpy.ndarray))
assert(isinstance(mirrored, numpy.ndarray))
assert isinstance(offset, numpy.ndarray)
self._offset = offset
self._layer = layer
self._dose = dose
self._string = string
self._height = height
self._rotation = rotation
self._mirrored = mirrored
self._repetition = repetition
self._annotations = annotations if annotations is not None else {}
else:
self.offset = offset
self.layer = layer
self.dose = dose
self.string = string
self.height = height
self.rotation = rotation
self.mirrored = mirrored
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.font_path = font_path
self.set_locked(locked)
def __deepcopy__(self, memo: Dict = None) -> 'Text':
def __deepcopy__(self, memo: dict | None = None) -> Self:
memo = {} if memo is None else memo
new = copy.copy(self)
Shape.unlock(new)
new._offset = self._offset.copy()
new._mirrored = copy.deepcopy(self._mirrored, memo)
new._annotations = copy.deepcopy(self._annotations)
new.set_locked(self.locked)
return new
def __eq__(self, other: Any) -> bool:
return (
type(self) is type(other)
and numpy.array_equal(self.offset, other.offset)
and self.string == other.string
and self.height == other.height
and self.font_path == other.font_path
and self.rotation == other.rotation
and self.repetition == other.repetition
and annotations_eq(self.annotations, other.annotations)
)
def __lt__(self, other: Shape) -> bool:
if type(self) is not type(other):
if repr(type(self)) != repr(type(other)):
return repr(type(self)) < repr(type(other))
return id(type(self)) < id(type(other))
other = cast(Text, other)
if not self.height == other.height:
return self.height < other.height
if not self.string == other.string:
return self.string < other.string
if not self.font_path == other.font_path:
return self.font_path < other.font_path
if not numpy.array_equal(self.offset, other.offset):
return tuple(self.offset) < tuple(other.offset)
if self.rotation != other.rotation:
return self.rotation < other.rotation
if self.repetition != other.repetition:
return rep2key(self.repetition) < rep2key(other.repetition)
return annotations_lt(self.annotations, other.annotations)
def to_polygons(
self,
poly_num_points: Optional[int] = None, # unused
poly_max_arclen: Optional[float] = None, # unused
) -> List[Polygon]:
num_vertices: int | None = None, # unused # noqa: ARG002
max_arclen: float | None = None, # unused # noqa: ARG002
) -> list[Polygon]:
all_polygons = []
total_advance = 0.0
for char in self.string:
@ -126,8 +142,9 @@ class Text(RotatableImpl, Shape, metaclass=AutoSlots):
# Move these polygons to the right of the previous letter
for xys in raw_polys:
poly = Polygon(xys, dose=self.dose, layer=self.layer)
poly.mirror2d(self.mirrored)
poly = Polygon(xys)
if self.mirrored:
poly.mirror()
poly.scale_by(self.height)
poly.offset = self.offset + [total_advance, 0]
poly.rotate_around(self.offset, self.rotation)
@ -138,45 +155,53 @@ class Text(RotatableImpl, Shape, metaclass=AutoSlots):
return all_polygons
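A minimal construction sketch (the font path is hypothetical, the import path is assumed, and freetype/matplotlib must be installed):

    from masque.shapes import Text

    t = Text(string='ABC', height=10.0, font_path='/path/to/font.ttf')
    letter_polys = t.to_polygons()   # one or more Polygons per character, already placed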
def mirror(self, axis: int) -> 'Text':
self.mirrored[axis] = not self.mirrored[axis]
def mirror(self, axis: int = 0) -> Self:
self.mirrored = not self.mirrored
if axis == 1:
self.rotation += pi
return self
def scale_by(self, c: float) -> 'Text':
def scale_by(self, c: float) -> Self:
self.height *= c
return self
def normalized_form(self, norm_value: float) -> normalized_shape_tuple:
mirror_x, rotation = normalize_mirror(self.mirrored)
rotation += self.rotation
rotation %= 2 * pi
return ((type(self), self.string, self.font_path, self.layer),
(self.offset, self.height / norm_value, rotation, mirror_x, self.dose),
lambda: Text(string=self.string,
rotation = self.rotation % (2 * pi)
return ((type(self), self.string, self.font_path),
(self.offset, self.height / norm_value, rotation, bool(self.mirrored)),
lambda: Text(
string=self.string,
height=self.height * norm_value,
font_path=self.font_path,
rotation=rotation,
mirrored=(mirror_x, False),
layer=self.layer))
).mirror2d(across_x=self.mirrored),
)
def get_bounds(self) -> NDArray[numpy.float64]:
def get_bounds_single(self) -> NDArray[numpy.float64]:
# rotation makes this a huge pain when using slot.advance and glyph.bbox(), so
# just convert to polygons instead
bounds = numpy.array([[+inf, +inf], [-inf, -inf]])
polys = self.to_polygons()
for poly in polys:
poly_bounds = poly.get_bounds()
bounds[0, :] = numpy.minimum(bounds[0, :], poly_bounds[0, :])
bounds[1, :] = numpy.maximum(bounds[1, :], poly_bounds[1, :])
pbounds = numpy.full((len(polys), 2, 2), nan)
for pp, poly in enumerate(polys):
pbounds[pp] = poly.get_bounds_nonempty()
bounds = numpy.vstack((
numpy.min(pbounds[:, 0, :], axis=0),
numpy.max(pbounds[:, 1, :], axis=0),
))
return bounds
def __repr__(self) -> str:
rotation = f' r{numpy.rad2deg(self.rotation):g}' if self.rotation != 0 else ''
mirrored = f' m{self.mirrored:d}' if self.mirrored else ''
return f'<TextShape "{self.string}" o{self.offset} h{self.height:g}{rotation}{mirrored}>'
def get_char_as_polygons(
font_path: str,
char: str,
resolution: float = 48 * 64,
) -> Tuple[List[List[List[float]]], float]:
) -> tuple[list[list[list[float]]], float]:
from freetype import Face # type: ignore
from matplotlib.path import Path # type: ignore
@ -196,7 +221,7 @@ def get_char_as_polygons(
'advance' distance (distance from the start of this glyph to the start of the next one)
"""
if len(char) != 1:
raise Exception('get_char_as_polygons called with non-char')
raise PatternError('get_char_as_polygons called with non-char')
face = Face(font_path)
face.set_char_size(resolution)
@ -205,7 +230,8 @@ def get_char_as_polygons(
outline = slot.outline
start = 0
all_verts_list, all_codes = [], []
all_verts_list = []
all_codes = []
for end in outline.contours:
points = outline.points[start:end + 1]
points.append(points[0])
@ -213,7 +239,7 @@ def get_char_as_polygons(
tags = outline.tags[start:end + 1]
tags.append(tags[0])
segments: List[List[List[float]]] = []
segments: list[list[list[float]]] = []
for j, point in enumerate(points):
# If we already have a segment, add this point to it
if j > 0:
@ -258,20 +284,3 @@ def get_char_as_polygons(
polygons = path.to_polygons()
return polygons, advance
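Illustrative use of `get_char_as_polygons()` (font path hypothetical; requires the freetype and matplotlib imports above):

    polys, advance = get_char_as_polygons('/path/to/font.ttf', 'A')
    # `polys` holds the glyph's closed contours (lists of [x, y] pairs);
    # `advance` is the distance to the start of the next glyph, in the same units.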
def lock(self) -> 'Text':
self.mirrored.flags.writeable = False
Shape.lock(self)
return self
def unlock(self) -> 'Text':
Shape.unlock(self)
self.mirrored.flags.writeable = True
return self
def __repr__(self) -> str:
rotation = f'{self.rotation*180/pi:g}' if self.rotation != 0 else ''
dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
mirrored = ' m{:d}{:d}'.format(*self.mirrored) if self.mirrored.any() else ''
return f'<TextShape "{self.string}" l{self.layer} o{self.offset} h{self.height:g}{rotation}{mirrored}{dose}{locked}>'


@ -1,248 +0,0 @@
"""
SubPattern provides basic support for nesting Pattern objects within each other, by adding
offset, rotation, scaling, and other such properties to the reference.
"""
#TODO more top-level documentation
from typing import Dict, Tuple, Optional, Sequence, TYPE_CHECKING, Any, TypeVar
import copy
import numpy
from numpy import pi
from numpy.typing import NDArray, ArrayLike
from .error import PatternError
from .utils import is_scalar, AutoSlots, annotations_t
from .repetition import Repetition
from .traits import (PositionableImpl, DoseableImpl, RotatableImpl, ScalableImpl,
Mirrorable, PivotableImpl, Copyable, LockableImpl, RepeatableImpl,
AnnotatableImpl)
if TYPE_CHECKING:
from . import Pattern
S = TypeVar('S', bound='SubPattern')
class SubPattern(PositionableImpl, DoseableImpl, RotatableImpl, ScalableImpl, Mirrorable,
PivotableImpl, Copyable, RepeatableImpl, LockableImpl, AnnotatableImpl,
metaclass=AutoSlots):
"""
SubPattern provides basic support for nesting Pattern objects within each other, by adding
offset, rotation, scaling, and associated methods.
"""
__slots__ = ('_pattern',
'_mirrored',
'identifier',
)
_pattern: Optional['Pattern']
""" The `Pattern` being instanced """
_mirrored: NDArray[numpy.bool_]
""" Whether to mirror the instance across the x and/or y axes. """
identifier: Tuple[Any, ...]
""" Arbitrary identifier, used internally by some `masque` functions. """
def __init__(
self,
pattern: Optional['Pattern'],
*,
offset: ArrayLike = (0.0, 0.0),
rotation: float = 0.0,
mirrored: Optional[Sequence[bool]] = None,
dose: float = 1.0,
scale: float = 1.0,
repetition: Optional[Repetition] = None,
annotations: Optional[annotations_t] = None,
locked: bool = False,
identifier: Tuple[Any, ...] = (),
) -> None:
"""
Args:
pattern: Pattern to reference.
offset: (x, y) offset applied to the referenced pattern. Not affected by rotation etc.
rotation: Rotation (radians, counterclockwise) relative to the referenced pattern's (0, 0).
mirrored: Whether to mirror the referenced pattern across its x and y axes.
dose: Scaling factor applied to the dose.
scale: Scaling factor applied to the pattern's geometry.
repetition: TODO
locked: Whether the `SubPattern` is locked after initialization.
identifier: Arbitrary tuple, used internally by some `masque` functions.
"""
LockableImpl.unlock(self)
self.identifier = identifier
self.pattern = pattern
self.offset = offset
self.rotation = rotation
self.dose = dose
self.scale = scale
if mirrored is None:
mirrored = (False, False)
self.mirrored = mirrored
self.repetition = repetition
self.annotations = annotations if annotations is not None else {}
self.set_locked(locked)
def __copy__(self) -> 'SubPattern':
new = SubPattern(pattern=self.pattern,
offset=self.offset.copy(),
rotation=self.rotation,
dose=self.dose,
scale=self.scale,
mirrored=self.mirrored.copy(),
repetition=copy.deepcopy(self.repetition),
annotations=copy.deepcopy(self.annotations),
locked=self.locked)
return new
def __deepcopy__(self, memo: Dict = None) -> 'SubPattern':
memo = {} if memo is None else memo
new = copy.copy(self)
LockableImpl.unlock(new)
new.pattern = copy.deepcopy(self.pattern, memo)
new.repetition = copy.deepcopy(self.repetition, memo)
new.annotations = copy.deepcopy(self.annotations, memo)
new.set_locked(self.locked)
return new
# pattern property
@property
def pattern(self) -> Optional['Pattern']:
return self._pattern
@pattern.setter
def pattern(self, val: Optional['Pattern']) -> None:
from .pattern import Pattern
if val is not None and not isinstance(val, Pattern):
raise PatternError(f'Provided pattern {val} is not a Pattern object or None!')
self._pattern = val
# Mirrored property
@property
def mirrored(self) -> Any: #TODO mypy#3004 NDArray[numpy.bool_]:
return self._mirrored
@mirrored.setter
def mirrored(self, val: ArrayLike) -> None:
if is_scalar(val):
raise PatternError('Mirrored must be a 2-element list of booleans')
self._mirrored = numpy.array(val, dtype=bool, copy=True)
def as_pattern(self) -> 'Pattern':
"""
Returns:
A copy of self.pattern which has been scaled, rotated, etc. according to this
`SubPattern`'s properties.
"""
assert(self.pattern is not None)
pattern = self.pattern.deepcopy().deepunlock()
if self.scale != 1:
pattern.scale_by(self.scale)
if numpy.any(self.mirrored):
pattern.mirror2d(self.mirrored)
if self.rotation % (2 * pi) != 0:
pattern.rotate_around((0.0, 0.0), self.rotation)
if numpy.any(self.offset):
pattern.translate_elements(self.offset)
if self.dose != 1:
pattern.scale_element_doses(self.dose)
if self.repetition is not None:
combined = type(pattern)(name='__repetition__')
for dd in self.repetition.displacements:
temp_pat = pattern.deepcopy()
temp_pat.translate_elements(dd)
combined.append(temp_pat)
pattern = combined
return pattern
def rotate(self: S, rotation: float) -> S:
self.rotation += rotation
if self.repetition is not None:
self.repetition.rotate(rotation)
return self
def mirror(self: S, axis: int) -> S:
self.mirrored[axis] = not self.mirrored[axis]
self.rotation *= -1
if self.repetition is not None:
self.repetition.mirror(axis)
return self
def get_bounds(self) -> Optional[NDArray[numpy.float64]]:
"""
Return a `numpy.ndarray` containing `[[x_min, y_min], [x_max, y_max]]`, corresponding to the
extent of the `SubPattern` in each dimension.
Returns `None` if the contained `Pattern` is empty.
Returns:
`[[x_min, y_min], [x_max, y_max]]` or `None`
"""
if self.pattern is None:
return None
return self.as_pattern().get_bounds()
def lock(self: S) -> S:
"""
Lock the SubPattern, disallowing changes
Returns:
self
"""
self.mirrored.flags.writeable = False
PositionableImpl._lock(self)
LockableImpl.lock(self)
return self
def unlock(self: S) -> S:
"""
Unlock the SubPattern
Returns:
self
"""
LockableImpl.unlock(self)
PositionableImpl._unlock(self)
self.mirrored.flags.writeable = True
return self
def deeplock(self: S) -> S:
"""
Recursively lock the SubPattern and its contained pattern
Returns:
self
"""
assert(self.pattern is not None)
self.lock()
self.pattern.deeplock()
return self
def deepunlock(self: S) -> S:
"""
Recursively unlock the SubPattern and its contained pattern
This is dangerous unless you have just performed a deepcopy, since
the subpattern and its components may be used in more than one place!
Returns:
self
"""
assert(self.pattern is not None)
self.unlock()
self.pattern.deepunlock()
return self
def __repr__(self) -> str:
name = self.pattern.name if self.pattern is not None else None
rotation = f' r{self.rotation*180/pi:g}' if self.rotation != 0 else ''
scale = f' d{self.scale:g}' if self.scale != 1 else ''
mirrored = ' m{:d}{:d}'.format(*self.mirrored) if self.mirrored.any() else ''
dose = f' d{self.dose:g}' if self.dose != 1 else ''
locked = ' L' if self.locked else ''
return f'<SubPattern "{name}" at {self.offset}{rotation}{scale}{mirrored}{dose}{locked}>'


@ -1,13 +1,34 @@
"""
Traits (mixins) and default implementations
Traits and mixins should set `__slots__ = ()` to enable use of `__slots__` in subclasses.
"""
from .positionable import Positionable, PositionableImpl
from .layerable import Layerable, LayerableImpl
from .doseable import Doseable, DoseableImpl
from .rotatable import Rotatable, RotatableImpl, Pivotable, PivotableImpl
from .repeatable import Repeatable, RepeatableImpl
from .scalable import Scalable, ScalableImpl
from .mirrorable import Mirrorable
from .copyable import Copyable
from .lockable import Lockable, LockableImpl
from .annotatable import Annotatable, AnnotatableImpl
from .positionable import (
Positionable as Positionable,
PositionableImpl as PositionableImpl,
Bounded as Bounded,
)
from .layerable import (
Layerable as Layerable,
LayerableImpl as LayerableImpl,
)
from .rotatable import (
Rotatable as Rotatable,
RotatableImpl as RotatableImpl,
Pivotable as Pivotable,
PivotableImpl as PivotableImpl,
)
from .repeatable import (
Repeatable as Repeatable,
RepeatableImpl as RepeatableImpl,
)
from .scalable import (
Scalable as Scalable,
ScalableImpl as ScalableImpl,
)
from .mirrorable import Mirrorable as Mirrorable
from .copyable import Copyable as Copyable
from .annotatable import (
Annotatable as Annotatable,
AnnotatableImpl as AnnotatableImpl,
)


@ -1,4 +1,3 @@
from typing import TypeVar
#from types import MappingProxyType
from abc import ABCMeta, abstractmethod
@ -6,20 +5,19 @@ from ..utils import annotations_t
from ..error import MasqueError
T = TypeVar('T', bound='Annotatable')
I = TypeVar('I', bound='AnnotatableImpl')
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass
class Annotatable(metaclass=ABCMeta):
"""
Abstract class for all annotatable entities
Trait class for all annotatable entities
Annotations correspond to GDS/OASIS "properties"
"""
__slots__ = ()
'''
---- Properties
'''
#
# Properties
#
@property
@abstractmethod
def annotations(self) -> annotations_t:
@ -33,23 +31,20 @@ class AnnotatableImpl(Annotatable, metaclass=ABCMeta):
"""
Simple implementation of `Annotatable`.
"""
__slots__ = ()
__slots__ = _empty_slots
_annotations: annotations_t
""" Dictionary storing annotation name/value pairs """
'''
---- Non-abstract properties
'''
#
# Non-abstract properties
#
@property
def annotations(self) -> annotations_t:
return self._annotations
# # TODO: Find a way to make sure the subclass implements Lockable without dealing with diamond inheritance or this extra hasattr
# if hasattr(self, 'is_locked') and self.is_locked():
# return MappingProxyType(self._annotations)
@annotations.setter
def annotations(self, annotations: annotations_t):
def annotations(self, annotations: annotations_t) -> None:
if not isinstance(annotations, dict):
raise MasqueError(f'annotations expected dict, got {type(annotations)}')
self._annotations = annotations
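A one-line sketch of setting annotations (names and values are hypothetical; masque annotation values are assumed to be lists of ints/floats/strings):

    some_shape.annotations = {'customer_id': ['foo'], 'rev': [3]}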


@ -1,21 +1,17 @@
from typing import TypeVar
from abc import ABCMeta
from typing import Self
import copy
T = TypeVar('T', bound='Copyable')
class Copyable(metaclass=ABCMeta):
class Copyable:
"""
Abstract class which adds .copy() and .deepcopy()
Trait class which adds .copy() and .deepcopy()
"""
__slots__ = ()
'''
---- Non-abstract methods
'''
def copy(self: T) -> T:
#
# Non-abstract methods
#
def copy(self) -> Self:
"""
Return a shallow copy of the object.
@ -24,7 +20,7 @@ class Copyable(metaclass=ABCMeta):
"""
return copy.copy(self)
def deepcopy(self: T) -> T:
def deepcopy(self) -> Self:
"""
Return a deep copy of the object.


@ -1,76 +0,0 @@
from typing import TypeVar
from abc import ABCMeta, abstractmethod
from ..error import MasqueError
T = TypeVar('T', bound='Doseable')
I = TypeVar('I', bound='DoseableImpl')
class Doseable(metaclass=ABCMeta):
"""
Abstract class for all doseable entities
"""
__slots__ = ()
'''
---- Properties
'''
@property
@abstractmethod
def dose(self) -> float:
"""
Dose (float >= 0)
"""
pass
# @dose.setter
# @abstractmethod
# def dose(self, val: float):
# pass
'''
---- Methods
'''
def set_dose(self: T, dose: float) -> T:
"""
Set the dose
Args:
dose: new value for dose
Returns:
self
"""
pass
class DoseableImpl(Doseable, metaclass=ABCMeta):
"""
Simple implementation of Doseable
"""
__slots__ = ()
_dose: float
""" Dose """
'''
---- Non-abstract properties
'''
@property
def dose(self) -> float:
return self._dose
@dose.setter
def dose(self, val: float):
if not val >= 0:
raise MasqueError('Dose must be non-negative')
self._dose = val
'''
---- Non-abstract methods
'''
def set_dose(self: I, dose: float) -> I:
self.dose = dose
return self


@ -1,21 +1,21 @@
from typing import TypeVar
from typing import Self
from abc import ABCMeta, abstractmethod
from ..utils import layer_t
T = TypeVar('T', bound='Layerable')
I = TypeVar('I', bound='LayerableImpl')
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass
class Layerable(metaclass=ABCMeta):
"""
Abstract class for all layerable entities
Trait class for all layerable entities
"""
__slots__ = ()
'''
---- Properties
'''
#
# Properties
#
@property
@abstractmethod
def layer(self) -> layer_t:
@ -29,10 +29,11 @@ class Layerable(metaclass=ABCMeta):
# def layer(self, val: layer_t):
# pass
'''
---- Methods
'''
def set_layer(self: T, layer: layer_t) -> T:
#
# Methods
#
@abstractmethod
def set_layer(self, layer: layer_t) -> Self:
"""
Set the layer
@ -49,25 +50,25 @@ class LayerableImpl(Layerable, metaclass=ABCMeta):
"""
Simple implementation of Layerable
"""
__slots__ = ()
__slots__ = _empty_slots
_layer: layer_t
""" Layer number, pair, or name """
'''
---- Non-abstract properties
'''
#
# Non-abstract properties
#
@property
def layer(self) -> layer_t:
return self._layer
@layer.setter
def layer(self, val: layer_t):
def layer(self, val: layer_t) -> None:
self._layer = val
'''
---- Non-abstract methods
'''
def set_layer(self: I, layer: layer_t) -> I:
#
# Non-abstract methods
#
def set_layer(self, layer: layer_t) -> Self:
self.layer = layer
return self

View File

@ -1,103 +0,0 @@
from typing import TypeVar, Dict, Tuple, Any
from abc import ABCMeta, abstractmethod
from ..error import PatternLockedError
T = TypeVar('T', bound='Lockable')
I = TypeVar('I', bound='LockableImpl')
class Lockable(metaclass=ABCMeta):
"""
Abstract class for all lockable entities
"""
__slots__ = () # type: Tuple[str, ...]
'''
---- Methods
'''
@abstractmethod
def lock(self: T) -> T:
"""
Lock the object, disallowing further changes
Returns:
self
"""
pass
@abstractmethod
def unlock(self: T) -> T:
"""
Unlock the object, reallowing changes
Returns:
self
"""
pass
@abstractmethod
def is_locked(self) -> bool:
"""
Returns:
True if the object is locked
"""
pass
def set_locked(self: T, locked: bool) -> T:
"""
Locks or unlocks based on the argument.
No action if already in the requested state.
Args:
locked: State to set.
Returns:
self
"""
if locked != self.is_locked():
if locked:
self.lock()
else:
self.unlock()
return self
class LockableImpl(Lockable, metaclass=ABCMeta):
"""
Simple implementation of Lockable
"""
__slots__ = () # type: Tuple[str, ...]
locked: bool
""" If `True`, disallows changes to the object """
'''
---- Non-abstract methods
'''
def __setattr__(self, name, value):
if self.locked and name != 'locked':
raise PatternLockedError()
object.__setattr__(self, name, value)
def __getstate__(self) -> Dict[str, Any]:
if hasattr(self, '__slots__'):
return {key: getattr(self, key) for key in self.__slots__}
else:
return self.__dict__
def __setstate__(self, state: Dict[str, Any]) -> None:
for k, v in state.items():
object.__setattr__(self, k, v)
def lock(self: I) -> I:
object.__setattr__(self, 'locked', True)
return self
def unlock(self: I) -> I:
object.__setattr__(self, 'locked', False)
return self
def is_locked(self) -> bool:
return self.locked

View File

@ -1,22 +1,15 @@
from typing import TypeVar, Tuple
from typing import Self
from abc import ABCMeta, abstractmethod
T = TypeVar('T', bound='Mirrorable')
#I = TypeVar('I', bound='MirrorableImpl')
class Mirrorable(metaclass=ABCMeta):
"""
Abstract class for all mirrorable entities
Trait class for all mirrorable entities
"""
__slots__ = ()
'''
---- Abstract methods
'''
@abstractmethod
def mirror(self: T, axis: int) -> T:
def mirror(self, axis: int = 0) -> Self:
"""
Mirror the entity across an axis.
@ -28,7 +21,7 @@ class Mirrorable(metaclass=ABCMeta):
"""
pass
def mirror2d(self: T, axes: Tuple[bool, bool]) -> T:
def mirror2d(self, across_x: bool = False, across_y: bool = False) -> Self:
"""
Optionally mirror the entity across both axes
@ -38,9 +31,9 @@ class Mirrorable(metaclass=ABCMeta):
Returns:
self
"""
if axes[0]:
if across_x:
self.mirror(0)
if axes[1]:
if across_y:
self.mirror(1)
return self
@ -51,24 +44,24 @@ class Mirrorable(metaclass=ABCMeta):
# """
# __slots__ = ()
#
# _mirrored: numpy.ndarray # ndarray[bool]
# _mirrored: NDArray[numpy.bool]
# """ Whether to mirror the instance across the x and/or y axes. """
#
# '''
# ---- Properties
# '''
# #
# # Properties
# #
# # Mirrored property
# @property
# def mirrored(self) -> numpy.ndarray: # ndarray[bool]
# def mirrored(self) -> NDArray[numpy.bool]:
# """ Whether to mirror across the [x, y] axes, respectively """
# return self._mirrored
#
# @mirrored.setter
# def mirrored(self, val: Sequence[bool]):
# def mirrored(self, val: Sequence[bool]) -> None:
# if is_scalar(val):
# raise MasqueError('Mirrored must be a 2-element list of booleans')
# self._mirrored = numpy.array(val, dtype=bool, copy=True)
# self._mirrored = numpy.array(val, dtype=bool)
#
# '''
# ---- Methods
# '''
# #
# # Methods
# #
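As a concrete illustration of the Mirrorable interface above, here is a minimal toy entity (not part of masque; the `Dot` class is invented and the `masque.traits` import path is assumed from the repository layout). Implementing `mirror()` is enough for the inherited `mirror2d()` to work:

import numpy
from masque.traits import Mirrorable   # import path assumed

class Dot(Mirrorable):
    """Toy entity: a single point, mirrorable across either axis."""
    __slots__ = ('xy',)

    def __init__(self, x: float, y: float) -> None:
        self.xy = numpy.array([x, y], dtype=float)

    def mirror(self, axis: int = 0) -> 'Dot':
        # axis=0 mirrors across the x axis (flips y); axis=1 flips x
        self.xy[1 - axis] *= -1
        return self

dot = Dot(1, 2).mirror2d(across_x=True, across_y=True)
print(dot.xy)    # [-1. -2.]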

View File

@ -1,6 +1,4 @@
# TODO top-level comment about how traits should set __slots__ = (), and how to use AutoSlots
from typing import TypeVar, Any, Optional
from typing import Self, Any
from abc import ABCMeta, abstractmethod
import numpy
@ -9,19 +7,18 @@ from numpy.typing import NDArray, ArrayLike
from ..error import MasqueError
T = TypeVar('T', bound='Positionable')
I = TypeVar('I', bound='PositionableImpl')
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass
class Positionable(metaclass=ABCMeta):
"""
Abstract class for all positionable entities
Trait class for all positionable entities
"""
__slots__ = ()
'''
---- Abstract properties
'''
#
# Properties
#
@property
@abstractmethod
def offset(self) -> NDArray[numpy.float64]:
@ -30,13 +27,13 @@ class Positionable(metaclass=ABCMeta):
"""
pass
# @offset.setter
# @abstractmethod
# def offset(self, val: ArrayLike):
# pass
@offset.setter
@abstractmethod
def offset(self, val: ArrayLike) -> None:
pass
@abstractmethod
def set_offset(self: T, offset: ArrayLike) -> T:
def set_offset(self, offset: ArrayLike) -> Self:
"""
Set the offset
@ -49,7 +46,7 @@ class Positionable(metaclass=ABCMeta):
pass
@abstractmethod
def translate(self: T, offset: ArrayLike) -> T:
def translate(self, offset: ArrayLike) -> Self:
"""
Translate the entity by the given offset
@ -61,41 +58,22 @@ class Positionable(metaclass=ABCMeta):
"""
pass
@abstractmethod
def get_bounds(self) -> Optional[NDArray[numpy.float64]]:
"""
Returns `[[x_min, y_min], [x_max, y_max]]` which specify a minimal bounding box for the entity.
Returns `None` for an empty entity.
"""
pass
def get_bounds_nonempty(self) -> NDArray[numpy.float64]:
"""
Returns `[[x_min, y_min], [x_max, y_max]]` which specify a minimal bounding box for the entity.
Asserts that the entity is non-empty (i.e., `get_bounds()` does not return None).
This is handy for destructuring like `xy_min, xy_max = entity.get_bounds_nonempty()`
"""
bounds = self.get_bounds()
assert(bounds is not None)
return bounds
class PositionableImpl(Positionable, metaclass=ABCMeta):
"""
Simple implementation of Positionable
"""
__slots__ = ()
__slots__ = _empty_slots
_offset: NDArray[numpy.float64]
""" `[x_offset, y_offset]` """
'''
---- Properties
'''
#
# Properties
#
# offset property
@property
def offset(self) -> Any: #TODO mypy#3003 NDArray[numpy.float64]:
def offset(self) -> Any: # mypy#3004 NDArray[numpy.float64]:
"""
[x, y] offset
"""
@ -103,40 +81,42 @@ class PositionableImpl(Positionable, metaclass=ABCMeta):
@offset.setter
def offset(self, val: ArrayLike) -> None:
if not isinstance(val, numpy.ndarray) or val.dtype != numpy.float64:
val = numpy.array(val, dtype=float)
if val.size != 2:
raise MasqueError('Offset must be convertible to size-2 ndarray')
self._offset = val.flatten()
'''
---- Methods
'''
def set_offset(self: I, offset: ArrayLike) -> I:
#
# Methods
#
def set_offset(self, offset: ArrayLike) -> Self:
self.offset = offset
return self
def translate(self: I, offset: ArrayLike) -> I:
def translate(self, offset: ArrayLike) -> Self:
self._offset += offset # type: ignore # NDArray += ArrayLike should be fine??
return self
def _lock(self: I) -> I:
"""
Lock the entity, disallowing further changes
Returns:
self
class Bounded(metaclass=ABCMeta):
@abstractmethod
def get_bounds(self, *args, **kwargs) -> NDArray[numpy.float64] | None:
"""
self._offset.flags.writeable = False
return self
Returns `[[x_min, y_min], [x_max, y_max]]` which specify a minimal bounding box for the entity.
Returns `None` for an empty entity.
"""
pass
def _unlock(self: I) -> I:
def get_bounds_nonempty(self, *args, **kwargs) -> NDArray[numpy.float64]:
"""
Unlock the entity
Returns `[[x_min, y_min], [x_max, y_max]]` which specify a minimal bounding box for the entity.
Asserts that the entity is non-empty (i.e., `get_bounds()` does not return None).
Returns:
self
This is handy for destructuring like `xy_min, xy_max = entity.get_bounds_nonempty()`
"""
self._offset.flags.writeable = True
return self
bounds = self.get_bounds(*args, **kwargs)
assert bounds is not None
return bounds

View File

@ -1,29 +1,32 @@
from typing import TypeVar, Optional, TYPE_CHECKING
from typing import Self, TYPE_CHECKING
from abc import ABCMeta, abstractmethod
import numpy
from numpy.typing import NDArray
from ..error import MasqueError
from .positionable import Bounded
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass
if TYPE_CHECKING:
from ..repetition import Repetition
T = TypeVar('T', bound='Repeatable')
I = TypeVar('I', bound='RepeatableImpl')
class Repeatable(metaclass=ABCMeta):
"""
Abstract class for all repeatable entities
Trait class for all repeatable entities
"""
__slots__ = ()
'''
---- Properties
'''
#
# Properties
#
@property
@abstractmethod
def repetition(self) -> Optional['Repetition']:
def repetition(self) -> 'Repetition | None':
"""
Repetition object, or None (single instance only)
"""
@ -31,14 +34,14 @@ class Repeatable(metaclass=ABCMeta):
# @repetition.setter
# @abstractmethod
# def repetition(self, repetition: Optional['Repetition']):
# def repetition(self, repetition: 'Repetition | None') -> None:
# pass
'''
---- Methods
'''
#
# Methods
#
@abstractmethod
def set_repetition(self: T, repetition: Optional['Repetition']) -> T:
def set_repetition(self, repetition: 'Repetition | None') -> Self:
"""
Set the repetition
@ -51,32 +54,57 @@ class Repeatable(metaclass=ABCMeta):
pass
class RepeatableImpl(Repeatable, metaclass=ABCMeta):
class RepeatableImpl(Repeatable, Bounded, metaclass=ABCMeta):
"""
Simple implementation of `Repeatable`
Simple implementation of `Repeatable` and extension of `Bounded` to include repetition bounds.
"""
__slots__ = ()
__slots__ = _empty_slots
_repetition: Optional['Repetition']
_repetition: 'Repetition | None'
""" Repetition object, or None (single instance only) """
'''
---- Non-abstract properties
'''
@abstractmethod
def get_bounds_single(self, *args, **kwargs) -> NDArray[numpy.float64] | None:
pass
#
# Non-abstract properties
#
@property
def repetition(self) -> Optional['Repetition']:
def repetition(self) -> 'Repetition | None':
return self._repetition
@repetition.setter
def repetition(self, repetition: Optional['Repetition']):
def repetition(self, repetition: 'Repetition | None') -> None:
from ..repetition import Repetition
if repetition is not None and not isinstance(repetition, Repetition):
raise MasqueError(f'{repetition} is not a valid Repetition object!')
self._repetition = repetition
'''
---- Non-abstract methods
'''
def set_repetition(self: I, repetition: Optional['Repetition']) -> I:
#
# Non-abstract methods
#
def set_repetition(self, repetition: 'Repetition | None') -> Self:
self.repetition = repetition
return self
def get_bounds_single_nonempty(self, *args, **kwargs) -> NDArray[numpy.float64]:
"""
Returns `[[x_min, y_min], [x_max, y_max]]` which specify a minimal bounding box for a single instance of the entity (ignoring any repetition).
Asserts that the entity is non-empty (i.e., `get_bounds_single()` does not return None).
This is handy for destructuring like `xy_min, xy_max = entity.get_bounds_single_nonempty()`
"""
bounds = self.get_bounds_single(*args, **kwargs)
assert bounds is not None
return bounds
def get_bounds(self, *args, **kwargs) -> NDArray[numpy.float64] | None:
bounds = self.get_bounds_single(*args, **kwargs)
if bounds is not None and self.repetition is not None:
rep_bounds = self.repetition.get_bounds()
if rep_bounds is None:
return None
bounds += rep_bounds
return bounds
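A tiny numeric sketch (plain numpy, values invented) of how `get_bounds()` above combines the single-instance bounds with the repetition's bounds, assuming the repetition's bounds describe the min/max of its displacement vectors:

import numpy

single = numpy.array([[0.0, 0.0], [2.0, 1.0]])   # bounds of one instance: a 2 x 1 box at the origin
rep = numpy.array([[0.0, 0.0], [30.0, 0.0]])     # bounds of the repetition's displacement vectors

# Element-wise sum, exactly like `bounds += rep_bounds` in get_bounds() above
print(single + rep)    # [[ 0.  0.]  [32.  1.]]  -- bounding box over all repeated copies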

View File

@ -1,31 +1,29 @@
from typing import TypeVar
from typing import Self, cast, Any
from abc import ABCMeta, abstractmethod
import numpy
from numpy import pi
from numpy.typing import ArrayLike, NDArray
from numpy.typing import ArrayLike
#from .positionable import Positionable
from .positionable import Positionable
from ..error import MasqueError
from ..utils import is_scalar, rotation_matrix_2d
from ..utils import rotation_matrix_2d
T = TypeVar('T', bound='Rotatable')
I = TypeVar('I', bound='RotatableImpl')
P = TypeVar('P', bound='Pivotable')
J = TypeVar('J', bound='PivotableImpl')
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass
class Rotatable(metaclass=ABCMeta):
"""
Abstract class for all rotatable entities
Trait class for all rotatable entities
"""
__slots__ = ()
'''
---- Abstract methods
'''
#
# Methods
#
@abstractmethod
def rotate(self: T, val: float) -> T:
def rotate(self, val: float) -> Self:
"""
Rotate the shape around its origin (0, 0), ignoring its offset.
@ -42,33 +40,33 @@ class RotatableImpl(Rotatable, metaclass=ABCMeta):
"""
Simple implementation of `Rotatable`
"""
__slots__ = ()
__slots__ = _empty_slots
_rotation: float
""" rotation for the object, radians counterclockwise """
'''
---- Properties
'''
#
# Properties
#
@property
def rotation(self) -> float:
""" Rotation, radians counterclockwise """
return self._rotation
@rotation.setter
def rotation(self, val: float):
def rotation(self, val: float) -> None:
if not numpy.size(val) == 1:
raise MasqueError('Rotation must be a scalar')
self._rotation = val % (2 * pi)
'''
---- Methods
'''
def rotate(self: I, rotation: float) -> I:
#
# Methods
#
def rotate(self, rotation: float) -> Self:
self.rotation += rotation
return self
def set_rotation(self: I, rotation: float) -> I:
def set_rotation(self, rotation: float) -> Self:
"""
Set the rotation to a value
@ -84,13 +82,13 @@ class RotatableImpl(Rotatable, metaclass=ABCMeta):
class Pivotable(metaclass=ABCMeta):
"""
Abstract class for entities which can be rotated around a point.
Trait class for entities which can be rotated around a point.
This requires that they are `Positionable` but not necessarily `Rotatable` themselves.
"""
__slots__ = ()
@abstractmethod
def rotate_around(self: P, pivot: ArrayLike, rotation: float) -> P:
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self:
"""
Rotate the object around a point.
@ -110,11 +108,14 @@ class PivotableImpl(Pivotable, metaclass=ABCMeta):
"""
__slots__ = ()
def rotate_around(self: J, pivot: ArrayLike, rotation: float) -> J:
pivot = numpy.array(pivot, dtype=float)
self.translate(-pivot)
self.rotate(rotation)
self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset) #type: ignore #TODO: mypy#3004
self.translate(+pivot)
offset: Any # TODO see if we can get around defining `offset` in PivotableImpl
""" `[x_offset, y_offset]` """
def rotate_around(self, pivot: ArrayLike, rotation: float) -> Self:
pivot = numpy.asarray(pivot, dtype=float)
cast(Positionable, self).translate(-pivot)
cast(Rotatable, self).rotate(rotation)
self.offset = numpy.dot(rotation_matrix_2d(rotation), self.offset) # type: ignore # mypy#3004
cast(Positionable, self).translate(+pivot)
return self
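For reference, the offset half of the `rotate_around()` recipe above (translate so the pivot sits at the origin, rotate the offset, translate back) reduces to the following standalone numpy sketch; the additional `rotate(rotation)` call, which spins the object about its own origin, is omitted here:

import numpy

def rotation_matrix_2d(theta: float) -> numpy.ndarray:
    return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
                        [numpy.sin(theta), +numpy.cos(theta)]])

offset = numpy.array([2.0, 0.0])   # entity position
pivot = numpy.array([1.0, 0.0])    # point to rotate around

offset = offset - pivot                                  # translate(-pivot)
offset = rotation_matrix_2d(numpy.pi / 2) @ offset       # rotate the offset itself
offset = offset + pivot                                  # translate(+pivot)
print(offset)    # approximately [1. 1.]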

View File

@ -1,25 +1,24 @@
from typing import TypeVar
from typing import Self
from abc import ABCMeta, abstractmethod
from ..error import MasqueError
from ..utils import is_scalar
T = TypeVar('T', bound='Scalable')
I = TypeVar('I', bound='ScalableImpl')
_empty_slots = () # Workaround to get mypy to ignore intentionally empty slots for superclass
class Scalable(metaclass=ABCMeta):
"""
Abstract class for all scalable entities
Trait class for all scalable entities
"""
__slots__ = ()
'''
---- Abstract methods
'''
#
# Methods
#
@abstractmethod
def scale_by(self: T, c: float) -> T:
def scale_by(self, c: float) -> Self:
"""
Scale the entity by a factor
@ -36,34 +35,34 @@ class ScalableImpl(Scalable, metaclass=ABCMeta):
"""
Simple implementation of Scalable
"""
__slots__ = ()
__slots__ = _empty_slots
_scale: float
""" scale factor for the entity """
'''
---- Properties
'''
#
# Properties
#
@property
def scale(self) -> float:
return self._scale
@scale.setter
def scale(self, val: float):
def scale(self, val: float) -> None:
if not is_scalar(val):
raise MasqueError('Scale must be a scalar')
if not val > 0:
raise MasqueError('Scale must be positive')
self._scale = val
'''
---- Methods
'''
def scale_by(self: I, c: float) -> I:
#
# Methods
#
def scale_by(self, c: float) -> Self:
self.scale *= c
return self
def set_scale(self: I, scale: float) -> I:
def set_scale(self, scale: float) -> Self:
"""
Set the scale to a value

View File

@ -1,165 +0,0 @@
"""
Various helper functions
"""
from typing import Any, Union, Tuple, Sequence, Dict, List
from abc import ABCMeta
import numpy
from numpy.typing import NDArray, ArrayLike
# Type definitions
layer_t = Union[int, Tuple[int, int], str]
annotations_t = Dict[str, List[Union[int, float, str]]]
def is_scalar(var: Any) -> bool:
"""
Alias for 'not hasattr(var, "__len__")'
Args:
var: Checks if `var` has a length.
"""
return not hasattr(var, "__len__")
def get_bit(bit_string: Any, bit_id: int) -> bool:
"""
Interprets bit number `bit_id` from the right (lsb) of `bit_string` as a boolean
Args:
bit_string: Bit string to test
bit_id: Bit number, 0-indexed from the right (lsb)
Returns:
Boolean value of the requested bit
"""
return bit_string & (1 << bit_id) != 0
def set_bit(bit_string: Any, bit_id: int, value: bool) -> Any:
"""
Returns `bit_string`, with bit number `bit_id` set to boolean `value`.
Args:
bit_string: Bit string to alter
bit_id: Bit number, 0-indexed from right (lsb)
value: Boolean value to set bit to
Returns:
Altered `bit_string`
"""
mask = (1 << bit_id)
bit_string &= ~mask
if value:
bit_string |= mask
return bit_string
def rotation_matrix_2d(theta: float) -> NDArray[numpy.float64]:
"""
2D rotation matrix for rotating counterclockwise around the origin.
Args:
theta: Angle to rotate, in radians
Returns:
rotation matrix
"""
return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
[numpy.sin(theta), +numpy.cos(theta)]])
def normalize_mirror(mirrored: Sequence[bool]) -> Tuple[bool, float]:
"""
Converts 0-2 mirror operations `(mirror_across_x_axis, mirror_across_y_axis)`
into 0-1 mirror operations and a rotation
Args:
mirrored: `(mirror_across_x_axis, mirror_across_y_axis)`
Returns:
`mirror_across_x_axis` (bool) and
`angle_to_rotate` in radians
"""
mirrored_x, mirrored_y = mirrored
mirror_x = (mirrored_x != mirrored_y) # XOR
angle = numpy.pi if mirrored_y else 0
return mirror_x, angle
def remove_duplicate_vertices(vertices: ArrayLike, closed_path: bool = True) -> NDArray[numpy.float64]:
"""
Given a list of vertices, remove any consecutive duplicates.
Args:
vertices: `[[x0, y0], [x1, y1], ...]`
closed_path: If True, `vertices` is interpreted as an implicitly-closed path
(i.e. the last vertex will be removed if it is the same as the first)
Returns:
`vertices` with no consecutive duplicates.
"""
vertices = numpy.array(vertices)
duplicates = (vertices == numpy.roll(vertices, 1, axis=0)).all(axis=1)
if not closed_path:
duplicates[0] = False
return vertices[~duplicates]
def remove_colinear_vertices(vertices: ArrayLike, closed_path: bool = True) -> NDArray[numpy.float64]:
"""
Given a list of vertices, remove any superfluous vertices (i.e.
those which lie along the line formed by their neighbors)
Args:
vertices: Nx2 ndarray of vertices
closed_path: If `True`, the vertices are assumed to represent an implicitly
closed path. If `False`, the path is assumed to be open. Default `True`.
Returns:
`vertices` with colinear (superfluous) vertices removed.
"""
vertices = remove_duplicate_vertices(vertices)
# Check for dx0/dy0 == dx1/dy1
dv = numpy.roll(vertices, -1, axis=0) - vertices # [y1-y0, y2-y1, ...]
dxdy = dv * numpy.roll(dv, 1, axis=0)[:, ::-1] # [[dx0*(dy_-1), (dx_-1)*dy0], dx1*dy0, dy1*dx0]]
dxdy_diff = numpy.abs(numpy.diff(dxdy, axis=1))[:, 0]
err_mult = 2 * numpy.abs(dxdy).sum(axis=1) + 1e-40
slopes_equal = (dxdy_diff / err_mult) < 1e-15
if not closed_path:
slopes_equal[[0, -1]] = False
return vertices[~slopes_equal]
class AutoSlots(ABCMeta):
"""
Metaclass for automatically generating __slots__ based on superclass type annotations.
Superclasses must set `__slots__ = ()` to make this work properly.
This is a workaround for the fact that non-empty `__slots__` can't be used
with multiple inheritance. Since we only use multiple inheritance with abstract
classes, they can have empty `__slots__` and their attribute type annotations
can be used to generate a full `__slots__` for the concrete class.
"""
def __new__(cls, name, bases, dctn):
parents = set()
for base in bases:
parents |= set(base.mro())
slots = tuple(dctn.get('__slots__', tuple()))
for parent in parents:
if not hasattr(parent, '__annotations__'):
continue
slots += tuple(getattr(parent, '__annotations__').keys())
dctn['__slots__'] = slots
return super().__new__(cls, name, bases, dctn)

View File

@ -1,15 +1,41 @@
"""
Various helper functions, type definitions, etc.
"""
from .types import layer_t, annotations_t
from .array import is_scalar
from .autoslots import AutoSlots
from .bitwise import get_bit, set_bit
from .vertices import (
remove_duplicate_vertices, remove_colinear_vertices, poly_contains_points
from .types import (
layer_t as layer_t,
annotations_t as annotations_t,
SupportsBool as SupportsBool,
)
from .transform import rotation_matrix_2d, normalize_mirror
from .array import is_scalar as is_scalar
from .autoslots import AutoSlots as AutoSlots
from .deferreddict import DeferredDict as DeferredDict
from .decorators import oneshot as oneshot
#from . import pack2d
from .bitwise import (
get_bit as get_bit,
set_bit as set_bit,
)
from .vertices import (
remove_duplicate_vertices as remove_duplicate_vertices,
remove_colinear_vertices as remove_colinear_vertices,
poly_contains_points as poly_contains_points,
)
from .transform import (
rotation_matrix_2d as rotation_matrix_2d,
normalize_mirror as normalize_mirror,
rotate_offsets_around as rotate_offsets_around,
apply_transforms as apply_transforms,
)
from .comparisons import (
annotation2key as annotation2key,
annotations_lt as annotations_lt,
annotations_eq as annotations_eq,
layer2key as layer2key,
ports_lt as ports_lt,
ports_eq as ports_eq,
rep2key as rep2key,
)
from . import ports2data as ports2data
from . import pack2d as pack2d

View File

@ -12,16 +12,16 @@ class AutoSlots(ABCMeta):
classes, they can have empty `__slots__` and their attribute type annotations
can be used to generate a full `__slots__` for the concrete class.
"""
def __new__(cls, name, bases, dctn):
def __new__(cls, name, bases, dctn): # noqa: ANN001,ANN204
parents = set()
for base in bases:
parents |= set(base.mro())
slots = tuple(dctn.get('__slots__', tuple()))
slots = tuple(dctn.get('__slots__', ()))
for parent in parents:
if not hasattr(parent, '__annotations__'):
continue
slots += tuple(getattr(parent, '__annotations__').keys())
slots += tuple(parent.__annotations__.keys())
dctn['__slots__'] = slots
return super().__new__(cls, name, bases, dctn)
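A minimal usage sketch for `AutoSlots`; only `AutoSlots` itself comes from the library, while the mixin and concrete class names here are invented for illustration:

from masque.utils import AutoSlots

class HasOffset:
    __slots__ = ()                 # traits keep __slots__ empty...
    _offset: tuple[float, float]   # ...and annotate their storage instead

class HasRotation:
    __slots__ = ()
    _rotation: float

class Marker(HasOffset, HasRotation, metaclass=AutoSlots):
    """Concrete class; AutoSlots collects the parents' annotations into __slots__."""

print(Marker.__slots__)   # contains '_offset' and '_rotation'; instances get no __dict__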

106
masque/utils/comparisons.py Normal file
View File

@ -0,0 +1,106 @@
from typing import Any
from .types import annotations_t, layer_t
from ..ports import Port
from ..repetition import Repetition
def annotation2key(aaa: int | float | str) -> tuple[bool, Any]:
return (isinstance(aaa, str), aaa)
def annotations_lt(aa: annotations_t, bb: annotations_t) -> bool:
if aa is None:
return bb is not None
elif bb is None: # noqa: RET505
return False
if len(aa) != len(bb):
return len(aa) < len(bb)
keys_a = tuple(sorted(aa.keys()))
keys_b = tuple(sorted(bb.keys()))
if keys_a != keys_b:
return keys_a < keys_b
for key in keys_a:
va = aa[key]
vb = bb[key]
if len(va) != len(vb):
return len(va) < len(vb)
for aaa, bbb in zip(va, vb, strict=True):
if aaa != bbb:
return annotation2key(aaa) < annotation2key(bbb)
return False
def annotations_eq(aa: annotations_t, bb: annotations_t) -> bool:
if aa is None:
return bb is None
elif bb is None: # noqa: RET505
return False
if len(aa) != len(bb):
return False
keys_a = tuple(sorted(aa.keys()))
keys_b = tuple(sorted(bb.keys()))
if keys_a != keys_b:
return False
for key in keys_a:
va = aa[key]
vb = bb[key]
if len(va) != len(vb):
return False
for aaa, bbb in zip(va, vb, strict=True):
if aaa != bbb:
return False
return True
def layer2key(layer: layer_t) -> tuple[bool, bool, Any]:
is_int = isinstance(layer, int)
is_str = isinstance(layer, str)
layer_tup = (layer,) if (is_str or is_int) else layer
tup = (
is_str,
not is_int,
layer_tup,
)
return tup
def rep2key(repetition: Repetition | None) -> tuple[bool, Repetition | None]:
return (repetition is None, repetition)
def ports_eq(aa: dict[str, Port], bb: dict[str, Port]) -> bool:
if len(aa) != len(bb):
return False
keys = sorted(aa.keys())
if keys != sorted(bb.keys()):
return False
return all(aa[kk] == bb[kk] for kk in keys)
def ports_lt(aa: dict[str, Port], bb: dict[str, Port]) -> bool:
if len(aa) != len(bb):
return len(aa) < len(bb)
aa_keys = tuple(sorted(aa.keys()))
bb_keys = tuple(sorted(bb.keys()))
if aa_keys != bb_keys:
return aa_keys < bb_keys
for key in aa_keys:
pa = aa[key]
pb = bb[key]
if pa != pb:
return pa < pb
return False
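A quick illustrative use of the key helpers above (the layer values are arbitrary); `layer2key` sorts plain integer layers first, then `(layer, datatype)` tuples, then string layer names:

from masque.utils import layer2key

layers = [(2, 0), 'metal1', 7, (1, 5)]
print(sorted(layers, key=layer2key))   # [7, (1, 5), (2, 0), 'metal1']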

View File

@ -0,0 +1,21 @@
from collections.abc import Callable
from functools import wraps
from ..error import OneShotError
def oneshot(func: Callable) -> Callable:
"""
Raises a OneShotError if the decorated function is called more than once
"""
expired = False
@wraps(func)
def wrapper(*args, **kwargs): # noqa: ANN202
nonlocal expired
if expired:
raise OneShotError(func.__name__)
expired = True
return func(*args, **kwargs)
return wrapper
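Minimal usage sketch for the decorator above; the decorated function is invented, and `OneShotError` is imported from `masque.error` as in the module's own import:

from masque.utils import oneshot
from masque.error import OneShotError

@oneshot
def finalize() -> None:
    print('finalizing')

finalize()          # prints 'finalizing'
try:
    finalize()      # second call raises
except OneShotError as err:
    print(f'already called: {err}')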

View File

@ -1,4 +1,5 @@
from typing import Callable, TypeVar, Generic
from typing import TypeVar, Generic
from collections.abc import Callable
from functools import lru_cache

View File

@ -1,37 +1,13 @@
"""
2D bin-packing
"""
from typing import Tuple, List, Set, Sequence, Callable
from collections.abc import Sequence, Mapping, Callable
import numpy
from numpy.typing import NDArray, ArrayLike
from ..error import MasqueError
from ..pattern import Pattern
from ..subpattern import SubPattern
def pack_patterns(patterns: Sequence[Pattern],
regions: numpy.ndarray,
spacing: Tuple[float, float],
presort: bool = True,
allow_rejects: bool = True,
packer: Callable = maxrects_bssf,
) -> Tuple[Pattern, List[Pattern]]:
half_spacing = numpy.array(spacing) / 2
bounds = [pp.get_bounds() for pp in patterns]
sizes = [bb[1] - bb[0] + spacing if bb is not None else spacing for bb in bounds]
offsets = [half_spacing - bb[0] if bb is not None else (0, 0) for bb in bounds]
locations, reject_inds = packer(sizes, regions, presort=presort, allow_rejects=allow_rejects)
pat = Pattern()
pat.subpatterns = [SubPattern(pp, offset=oo + loc)
for pp, oo, loc in zip(patterns, offsets, locations)]
rejects = [patterns[ii] for ii in reject_inds]
return pat, rejects
def maxrects_bssf(
@ -39,18 +15,36 @@ def maxrects_bssf(
containers: ArrayLike,
presort: bool = True,
allow_rejects: bool = True,
) -> Tuple[NDArray[numpy.float64], Set[int]]:
) -> tuple[NDArray[numpy.float64], set[int]]:
"""
sizes should be Nx2
regions should be Mx4 (xmin, ymin, xmax, ymax)
Pack rectangles `rects` into regions `containers` using the "maximal rectangles best short side fit"
algorithm (maxrects_bssf) from "A thousand ways to pack the bin", Jukka Jylanki, 2010.
This algorithm gives the best results, but is asymptotically slower than `guillotine_bssf_sas`.
Args:
rects: Nx2 array of rectangle sizes `[[x_size0, y_size0], ...]`.
containers: Mx4 array of regions into which `rects` will be placed, specified using their
corner coordinates ` [[x_min0, y_min0, x_max0, y_max0], ...]`.
presort: If `True` (default), largest-shortest-side rectangles will be placed
first. Otherwise, they will be placed in the order provided.
allow_rejects: If `False`, `MasqueError` will be raised if any rectangle cannot be placed.
Returns:
`[[x_min0, y_min0], ...]` placement locations for `rects`, with the same ordering.
The second return value is a set of indices of `rects` entries which were rejected; their
corresponding placement locations should be ignored.
Raises:
MasqueError if `allow_rejects` is `False` but some `rects` could not be placed.
"""
regions = numpy.array(containers, copy=False, dtype=float)
rect_sizes = numpy.array(rects, copy=False, dtype=float)
regions = numpy.asarray(containers, dtype=float)
rect_sizes = numpy.asarray(rects, dtype=float)
rect_locs = numpy.zeros_like(rect_sizes)
rejected_inds = set()
if presort:
rotated_sizes = numpy.sort(rect_sizes, axis=0) # shortest side first
rotated_sizes = numpy.sort(rect_sizes, axis=1) # shortest side first
rect_order = numpy.lexsort(rotated_sizes.T)[::-1] # Descending shortest side
rect_sizes = rect_sizes[rect_order]
@ -68,14 +62,14 @@ def maxrects_bssf(
''' Place the rect '''
# Best short-side fit (bssf) to pick a region
bssf_scores = ((regions[:, 2:] - regions[:, :2]) - rect_size).min(axis=1).astype(float)
region_sizes = regions[:, 2:] - regions[:, :2]
bssf_scores = (region_sizes - rect_size).min(axis=1).astype(float)
bssf_scores[bssf_scores < 0] = numpy.inf # doesn't fit!
rr = bssf_scores.argmin()
if numpy.isinf(bssf_scores[rr]):
if allow_rejects:
rejected_inds.add(rect_ind)
continue
else:
raise MasqueError(f'Failed to find a suitable location for rectangle {rect_ind}')
# Read out location
@ -104,62 +98,146 @@ def maxrects_bssf(
r_top[:, 1] = loc[1] + rect_size[1]
regions = numpy.vstack((regions[~intersects], r_lft, r_bot, r_rgt, r_top))
if presort:
unsort_order = rect_order.argsort()
rect_locs = rect_locs[unsort_order]
rejected_inds = set(unsort_order[list(rejected_inds)])
return rect_locs, rejected_inds
def guillotine_bssf_sas(rect_sizes: numpy.ndarray,
regions: numpy.ndarray,
def guillotine_bssf_sas(
rects: ArrayLike,
containers: ArrayLike,
presort: bool = True,
allow_rejects: bool = True,
) -> Tuple[numpy.ndarray, Set[int]]:
) -> tuple[NDArray[numpy.float64], set[int]]:
"""
sizes should be Nx2
regions should be Mx4 (xmin, ymin, xmax, ymax)
#TODO: test me!
# TODO add rectangle-merge?
Pack rectangles `rects` into regions `containers` using the "guillotine best short side fit with
shorter axis split rule" algorithm (guillotine-BSSF-SAS) from "A thousand ways to pack the bin",
Jukka Jylanki, 2010.
This algorithm gives worse results than `maxrects_bssf`, but is asymptotically faster.
# TODO consider adding rectangle-merge?
# TODO guillotine could use some additional testing
Args:
rects: Nx2 array of rectangle sizes `[[x_size0, y_size0], ...]`.
containers: Mx4 array of regions into which `rects` will be placed, specified using their
corner coordinates ` [[x_min0, y_min0, x_max0, y_max0], ...]`.
presort: If `True` (default), largest-shortest-side rectangles will be placed
first. Otherwise, they will be placed in the order provided.
allow_rejects: If `False`, `MasqueError` will be raised if any rectangle cannot be placed.
Returns:
`[[x_min0, y_min0], ...]` placement locations for `rects`, with the same ordering.
The second return value is a set of indices of `rects` entries which were rejected; their
corresponding placement locations should be ignored.
Raises:
MasqueError if `allow_rejects` is `False` but some `rects` could not be placed.
"""
rect_sizes = numpy.array(rect_sizes)
regions = numpy.asarray(containers, dtype=float)
rect_sizes = numpy.asarray(rects, dtype=float)
rect_locs = numpy.zeros_like(rect_sizes)
rejected_inds = set()
if presort:
rotated_sizes = numpy.sort(rect_sizes, axis=0) # shortest side first
rotated_sizes = numpy.sort(rect_sizes, axis=1) # shortest side first
rect_order = numpy.lexsort(rotated_sizes.T)[::-1] # Descending shortest side
rect_sizes = rect_sizes[rect_order]
for rect_ind, rect_size in enumerate(rect_sizes):
''' Place the rect '''
# Best short-side fit (bssf) to pick a region
bssf_scores = ((regions[:, 2:] - regions[:, :2]) - rect_size).min(axis=1).astype(float)
region_sizes = regions[:, 2:] - regions[:, :2]
bssf_scores = (region_sizes - rect_size).min(axis=1).astype(float)
bssf_scores[bssf_scores < 0] = numpy.inf # doesn't fit!
rr = bssf_scores.argmin()
if numpy.isinf(bssf_scores[rr]):
if allow_rejects:
rejected_inds.add(rect_ind)
continue
else:
raise MasqueError(f'Failed to find a suitable location for rectangle {rect_ind}')
# Read out location
loc = regions[rr, :2]
rect_locs[rect_ind] = loc
region_size = regions[rr, 2:] - loc
region_size = region_sizes[rr]
split_horiz = region_size[0] < region_size[1]
new_region0 = regions[rr].copy()
new_region1 = new_region0.copy()
split_vert = loc + rect_size
split_vertex = loc + rect_size
if split_horiz:
new_region0[2] = split_vert[0]
new_region0[1] = split_vert[1]
new_region1[0] = split_vert[0]
new_region0[2] = split_vertex[0]
new_region0[1] = split_vertex[1]
new_region1[0] = split_vertex[0]
else:
new_region0[3] = split_vert[1]
new_region0[0] = split_vert[0]
new_region1[1] = split_vert[1]
new_region0[3] = split_vertex[1]
new_region0[0] = split_vertex[0]
new_region1[1] = split_vertex[1]
regions = numpy.vstack((regions[:rr], regions[rr + 1:],
new_region0, new_region1))
if presort:
unsort_order = rect_order.argsort()
rect_locs = rect_locs[unsort_order]
rejected_inds = set(unsort_order[list(rejected_inds)])
return rect_locs, rejected_inds
def pack_patterns(
library: Mapping[str, Pattern],
patterns: Sequence[str],
containers: ArrayLike,
spacing: tuple[float, float],
presort: bool = True,
allow_rejects: bool = True,
packer: Callable = maxrects_bssf,
) -> tuple[Pattern, list[str]]:
"""
Pick placement locations for `patterns` inside the regions specified by `containers`.
No rotations are performed.
Args:
library: Library from which `Pattern` objects will be drawn.
patterns: Sequence of pattern names which are to be placed.
containers: Mx4 array of regions into which `patterns` will be placed, specified using their
corner coordinates ` [[x_min0, y_min0, x_max0, y_max0], ...]`.
spacing: (x, y) spacing between adjacent patterns. Patterns are effectively expanded outwards
by `spacing / 2` prior to placement, so this also affects pattern position relative to
container edges.
presort: If `True` (default), largest-shortest-side rectangles will be placed
first. Otherwise, they will be placed in the order provided.
allow_rejects: If `False`, `MasqueError` will be raised if any rectangle cannot be placed.
packer: Bin-packing method; see the other functions in this module (namely `maxrects_bssf`
and `guillotine_bssf_sas`).
Returns:
A `Pattern` containing one `Ref` for each entry in `patterns`.
A list of "rejected" pattern names, for which a valid placement location could not be found.
Raises:
MasqueError if `allow_rejects` is `False` but some patterns could not be placed.
"""
half_spacing = numpy.asarray(spacing, dtype=float) / 2
bounds = [library[pp].get_bounds() for pp in patterns]
sizes = [bb[1] - bb[0] + spacing if bb is not None else spacing for bb in bounds]
offsets = [half_spacing - bb[0] if bb is not None else (0, 0) for bb in bounds]
locations, reject_inds = packer(sizes, containers, presort=presort, allow_rejects=allow_rejects)
pat = Pattern()
for pp, oo, loc in zip(patterns, offsets, locations, strict=True):
pat.ref(pp, offset=oo + loc)
rejects = [patterns[ii] for ii in reject_inds]
return pat, rejects
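An illustrative call into the packing functions described above; the sizes and region are invented, and the module path `masque.utils.pack2d` is assumed from the imports in `masque/utils/__init__.py`:

import numpy
from masque.utils.pack2d import maxrects_bssf

rects = numpy.array([[10, 5], [4, 4], [8, 2]], dtype=float)     # (x_size, y_size) per rectangle
containers = numpy.array([[0, 0, 20, 10]], dtype=float)         # one region: (xmin, ymin, xmax, ymax)

locations, rejected = maxrects_bssf(rects, containers)
for ind, (size, loc) in enumerate(zip(rects, locations, strict=True)):
    status = 'rejected' if ind in rejected else f'placed at {loc}'
    print(f'rect {ind} (size {size}): {status}')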

178
masque/utils/ports2data.py Normal file
View File

@ -0,0 +1,178 @@
"""
Functions for writing port data into Pattern geometry/annotations/labels (`ports_to_data`)
and retrieving it (`data_to_ports`).
These use the format 'name:ptype angle_deg' written into labels, which are placed at
the port locations. This particular approach is just a sensible default; feel free to
write equivalent functions for your own format or alternate storage methods.
"""
from collections.abc import Sequence, Mapping
import logging
from itertools import chain
import numpy
from ..pattern import Pattern
from ..utils import layer_t
from ..ports import Port
from ..error import PatternError
from ..library import ILibraryView, LibraryView
logger = logging.getLogger(__name__)
def ports_to_data(pattern: Pattern, layer: layer_t) -> Pattern:
"""
Place a text label at each port location, specifying the port data in the format
'name:ptype angle_deg'
This can be used to debug port locations or to automatically generate ports
when reading in a GDS file.
NOTE that `pattern` is modified by this function
Args:
pattern: The pattern which is to have its ports labeled. MODIFIED in-place.
layer: The layer on which the labels will be placed.
Returns:
`pattern`
"""
for name, port in pattern.ports.items():
if port.rotation is None:
angle_deg = numpy.inf
else:
angle_deg = numpy.rad2deg(port.rotation)
pattern.label(layer=layer, string=f'{name}:{port.ptype} {angle_deg:g}', offset=port.offset)
return pattern
def data_to_ports(
layers: Sequence[layer_t],
library: Mapping[str, Pattern],
pattern: Pattern, # Pattern is good since we don't want to do library[name] to avoid infinite recursion.
# LazyLibrary protects against library[ref.target] causing a circular lookup.
# For others, maybe check for cycles up front? TODO
name: str | None = None, # Note: name optional, but arg order different from read(postprocess=)
max_depth: int = 0,
skip_subcells: bool = True,
# TODO missing ok?
) -> Pattern:
"""
# TODO fixup documentation in ports2data
# TODO move to utils.file?
Examine `pattern` for labels specifying port info, and use that info
to fill out its `ports` attribute.
Labels are assumed to be placed at the port locations, and have the format
'name:ptype angle_deg'
Args:
layers: Search for labels on all the given layers.
pattern: Pattern object to scan for labels.
max_depth: Maximum hierarchy depth to search. Default 0 (do not search subcells);
increase to descend further into the hierarchy.
skip_subcells: If port labels are found at a given hierarchy level,
do not continue searching at deeper levels. This allows subcells
to contain their own port info without interfering with supercells'
port data.
Default True.
Returns:
The updated `pattern`. Port labels are not removed.
"""
if pattern.ports:
logger.warning(f'Pattern {name if name else pattern} already had ports, skipping data_to_ports')
return pattern
if not isinstance(library, ILibraryView):
library = LibraryView(library)
data_to_ports_flat(layers, pattern, name)
if (skip_subcells and pattern.ports) or max_depth == 0:
return pattern
# Load ports for all subpatterns, and use any we find
found_ports = False
for target in pattern.refs:
if target is None:
continue
pp = data_to_ports(
layers=layers,
library=library,
pattern=library[target],
name=target,
max_depth=max_depth - 1,
skip_subcells=skip_subcells,
)
found_ports |= bool(pp.ports)
if not found_ports:
return pattern
for target, refs in pattern.refs.items():
if target is None:
continue
if not refs:
continue
for ref in refs:
aa = library.abstract(target)
if not aa.ports:
break
aa.apply_ref_transform(ref)
pattern.check_ports(other_names=aa.ports.keys())
pattern.ports.update(aa.ports)
return pattern
def data_to_ports_flat(
layers: Sequence[layer_t],
pattern: Pattern,
cell_name: str | None = None,
) -> Pattern:
"""
Examine `pattern` for labels specifying port info, and use that info
to fill out its `ports` attribute.
Labels are assumed to be placed at the port locations, and have the format
'name:ptype angle_deg'
The pattern is assumed to be flat (have no `refs`) and have no pre-existing ports.
Args:
layers: Search for labels on all the given layers.
pattern: Pattern object to scan for labels.
cell_name: optional, used for warning message only
Returns:
The updated `pattern`. Port labels are not removed.
"""
labels = list(chain.from_iterable(pattern.labels[layer] for layer in layers))
if not labels:
return pattern
pstr = cell_name if cell_name is not None else repr(pattern)
if pattern.ports:
raise PatternError(f'Pattern "{pstr}" has pre-existing ports!')
local_ports = {}
for label in labels:
name, property_string = label.string.split(':')
properties = property_string.split(' ')
ptype = properties[0]
angle_deg = float(properties[1]) if len(ptype) else 0
xy = label.offset
angle = numpy.deg2rad(angle_deg)
if name in local_ports:
logger.warning(f'Duplicate port "{name}" in pattern "{pstr}"')
local_ports[name] = Port(offset=xy, rotation=angle, ptype=ptype)
pattern.ports.update(local_ports)
return pattern
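The 'name:ptype angle_deg' label format described above can be exercised standalone; this sketch (plain Python/numpy, no masque objects, helper names invented) mirrors the encoding done by `ports_to_data()` and the parsing done by `data_to_ports_flat()`:

import numpy

def encode(name: str, ptype: str, rotation_rad: float | None) -> str:
    angle_deg = numpy.inf if rotation_rad is None else numpy.rad2deg(rotation_rad)
    return f'{name}:{ptype} {angle_deg:g}'

def decode(string: str) -> tuple[str, str, float]:
    name, property_string = string.split(':')
    ptype, angle_str = property_string.split(' ')
    return name, ptype, numpy.deg2rad(float(angle_str))

label = encode('in0', 'optical', numpy.pi / 2)
print(label)           # in0:optical 90
print(decode(label))   # ('in0', 'optical', 1.5707963267948966)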

View File

@ -1,12 +1,15 @@
"""
Geometric transforms
"""
from typing import Sequence, Tuple
from collections.abc import Sequence
from functools import lru_cache
import numpy
from numpy.typing import NDArray
from numpy.typing import NDArray, ArrayLike
from numpy import pi
@lru_cache
def rotation_matrix_2d(theta: float) -> NDArray[numpy.float64]:
"""
2D rotation matrix for rotating counterclockwise around the origin.
@ -17,11 +20,18 @@ def rotation_matrix_2d(theta: float) -> NDArray[numpy.float64]:
Returns:
rotation matrix
"""
return numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
arr = numpy.array([[numpy.cos(theta), -numpy.sin(theta)],
[numpy.sin(theta), +numpy.cos(theta)]])
# If this was a manhattan rotation, round to remove some inaccuracies in sin & cos
if numpy.isclose(theta % (pi / 2), 0):
arr = numpy.round(arr)
def normalize_mirror(mirrored: Sequence[bool]) -> Tuple[bool, float]:
arr.flags.writeable = False
return arr
def normalize_mirror(mirrored: Sequence[bool]) -> tuple[bool, float]:
"""
Converts 0-2 mirror operations `(mirror_across_x_axis, mirror_across_y_axis)`
into 0-1 mirror operations and a rotation
@ -38,3 +48,71 @@ def normalize_mirror(mirrored: Sequence[bool]) -> Tuple[bool, float]:
mirror_x = (mirrored_x != mirrored_y) # XOR
angle = numpy.pi if mirrored_y else 0
return mirror_x, angle
def rotate_offsets_around(
offsets: NDArray[numpy.float64],
pivot: NDArray[numpy.float64],
angle: float,
) -> NDArray[numpy.float64]:
"""
Rotates offsets around a pivot point.
Args:
offsets: Nx2 array, rows are (x, y) offsets
pivot: (x, y) location to rotate around
angle: rotation angle in radians
Returns:
Nx2 ndarray of (x, y) position after the rotation is applied.
"""
offsets -= pivot
offsets[:] = (rotation_matrix_2d(angle) @ offsets.T).T
offsets += pivot
return offsets
def apply_transforms(
outer: ArrayLike,
inner: ArrayLike,
tensor: bool = False,
) -> NDArray[numpy.float64]:
"""
Apply a set of transforms (`outer`) to a second set (`inner`).
This is used to find the "absolute" transform for nested `Ref`s.
The two transforms should be of shape Ox4 and Ix4.
Rows should be of the form `(x_offset, y_offset, rotation_ccw_rad, mirror_across_x)`.
The output will be of the form (O*I)x4 (if `tensor=False`) or OxIx4 (`tensor=True`).
Args:
outer: Transforms for the container refs. Shape Ox4.
inner: Transforms for the contained refs. Shape Ix4.
tensor: If `True`, an OxIx4 array is returned, with `result[oo, ii, :]` corresponding
to the `oo`th `outer` transform applied to the `ii`th inner transform.
If `False` (default), this is concatenated into `(O*I)x4` to allow simple
chaining into additional `apply_transforms()` calls.
Returns:
OxIx4 or (O*I)x4 array. Final dimension is
`(total_x, total_y, total_rotation_ccw_rad, net_mirrored_x)`.
"""
outer = numpy.atleast_2d(outer).astype(float, copy=False)
inner = numpy.atleast_2d(inner).astype(float, copy=False)
# If mirrored, flip y's
xy_mir = numpy.tile(inner[:, :2], (outer.shape[0], 1, 1)) # dims are outer, inner, xyrm
xy_mir[outer[:, 3].astype(bool), :, 1] *= -1
rot_mats = [rotation_matrix_2d(angle) for angle in outer[:, 2]]
xy = numpy.einsum('ort,oit->oir', rot_mats, xy_mir)
tot = numpy.empty((outer.shape[0], inner.shape[0], 4))
tot[:, :, :2] = outer[:, None, :2] + xy
tot[:, :, 2:] = outer[:, None, 2:] + inner[None, :, 2:] # sum rotations and mirrored
tot[:, :, 2] %= 2 * pi # clamp rot
tot[:, :, 3] %= 2 # clamp mirrored
if tensor:
return tot
return numpy.concatenate(tot)
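A small numeric check of `apply_transforms()` (values invented): one outer ref at (10, 0) rotated 90 degrees CCW, containing one inner ref at (1, 0). The inner offset rotates to (0, 1), so the absolute transform lands at (10, 1) with a 90 degree total rotation and no mirroring:

import numpy
from masque.utils import apply_transforms

outer = [[10, 0, numpy.pi / 2, 0]]   # (x, y, rotation_ccw_rad, mirror_across_x)
inner = [[1, 0, 0, 0]]

print(apply_transforms(outer, inner))   # ~[[10.  1.  1.5708  0.]]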

View File

@ -1,8 +1,13 @@
"""
Type definitions
"""
from typing import Union, Tuple, Sequence, Dict, List
from typing import Protocol
layer_t = Union[int, Tuple[int, int], str]
annotations_t = Dict[str, List[Union[int, float, str]]]
layer_t = int | tuple[int, int] | str
annotations_t = dict[str, list[int | float | str]]
class SupportsBool(Protocol):
def __bool__(self) -> bool:
...

View File

@ -15,9 +15,9 @@ def remove_duplicate_vertices(vertices: ArrayLike, closed_path: bool = True) ->
(i.e. the last vertex will be removed if it is the same as the first)
Returns:
`vertices` with no consecutive duplicates.
`vertices` with no consecutive duplicates. This may be a view into the original array.
"""
vertices = numpy.array(vertices)
vertices = numpy.asarray(vertices)
duplicates = (vertices == numpy.roll(vertices, 1, axis=0)).all(axis=1)
if not closed_path:
duplicates[0] = False
@ -35,7 +35,7 @@ def remove_colinear_vertices(vertices: ArrayLike, closed_path: bool = True) -> N
closed path. If `False`, the path is assumed to be open. Default `True`.
Returns:
`vertices` with colinear (superfluous) vertices removed.
`vertices` with colinear (superfluous) vertices removed. May be a view into the original array.
"""
vertices = remove_duplicate_vertices(vertices)
@ -73,17 +73,17 @@ def poly_contains_points(
Returns:
ndarray of booleans, [point0_is_in_shape, point1_is_in_shape, ...]
"""
points = numpy.array(points, copy=False)
vertices = numpy.array(vertices, copy=False)
points = numpy.asarray(points, dtype=float)
vertices = numpy.asarray(vertices, dtype=float)
if points.size == 0:
return numpy.zeros(0)
return numpy.zeros(0, dtype=numpy.int8)
min_bounds = numpy.min(vertices, axis=0)[None, :]
max_bounds = numpy.max(vertices, axis=0)[None, :]
trivially_outside = ((points < min_bounds).any(axis=1)
| (points > max_bounds).any(axis=1))
| (points > max_bounds).any(axis=1)) # noqa: E128
nontrivial = ~trivially_outside
if trivially_outside.all():
@ -101,10 +101,10 @@ def poly_contains_points(
dv = numpy.roll(verts, -1, axis=0) - verts
is_left = (dv[:, 0] * (ntpts[..., 1] - verts[:, 1]) # >0 if left of dv, <0 if right, 0 if on the line
- dv[:, 1] * (ntpts[..., 0] - verts[:, 0]))
- dv[:, 1] * (ntpts[..., 0] - verts[:, 0])) # noqa: E128
winding_number = ((upward & (is_left > 0)).sum(axis=0)
- (downward & (is_left < 0)).sum(axis=0))
- (downward & (is_left < 0)).sum(axis=0)) # noqa: E128
nontrivial_inside = winding_number != 0 # filter nontrivial points based on winding number
if include_boundary:
@ -113,5 +113,3 @@ def poly_contains_points(
inside = nontrivial.copy()
inside[nontrivial] = nontrivial_inside
return inside
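A quick check of the vertex-cleanup helpers touched above (coordinates invented): `remove_colinear_vertices()` first strips the consecutive duplicate, then drops the vertex lying on the segment between its neighbors:

import numpy
from masque.utils import remove_colinear_vertices

outline = numpy.array([
    [0, 0], [1, 0], [2, 0],   # (1, 0) lies on the segment from (0, 0) to (2, 0)
    [2, 2], [0, 2], [0, 2],   # consecutive duplicate vertex
], dtype=float)

print(remove_colinear_vertices(outline))   # rows: [0, 0], [2, 0], [2, 2], [0, 2]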

View File

@ -39,11 +39,11 @@ classifiers = [
"Topic :: Scientific/Engineering :: Electronic Design Automation (EDA)",
"Topic :: Scientific/Engineering :: Visualization",
]
requires-python = ">=3.8"
requires-python = ">=3.11"
dynamic = ["version"]
dependencies = [
"numpy~=1.21",
"klamath~=1.2",
"numpy>=1.26",
"klamath~=1.4",
]
@ -52,9 +52,41 @@ path = "masque/__init__.py"
[project.optional-dependencies]
oasis = ["fatamorgana~=0.11"]
dxf = ["ezdxf"]
dxf = ["ezdxf~=1.0.2"]
svg = ["svgwrite"]
visualize = ["matplotlib"]
text = ["matplotlib", "freetype-py"]
python-gdsii = ["python-gdsii"]
[tool.ruff]
exclude = [
".git",
"dist",
]
line-length = 145
indent-width = 4
lint.dummy-variable-rgx = "^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"
lint.select = [
"NPY", "E", "F", "W", "B", "ANN", "UP", "SLOT", "SIM", "LOG",
"C4", "ISC", "PIE", "PT", "RET", "TCH", "PTH", "INT",
"ARG", "PL", "R", "TRY",
"G010", "G101", "G201", "G202",
"Q002", "Q003", "Q004",
]
lint.ignore = [
#"ANN001", # No annotation
"ANN002", # *args
"ANN003", # **kwargs
"ANN401", # Any
"ANN101", # self: Self
"SIM108", # single-line if / else assignment
"RET504", # x=y+z; return x
"PIE790", # unnecessary pass
"ISC003", # non-implicit string concatenation
"C408", # dict(x=y) instead of {'x': y}
"PLR09", # Too many xxx
"PLR2004", # magic number
"PLC0414", # import x as x
"TRY003", # Long exception message
]