Commits
34 commits
309ed1b doc fixes (echatav, Feb 7, 2026)
55b8d81 Update Grammar.hs (echatav, Feb 8, 2026)
cdc8f7d Update Distributor.hs (echatav, Feb 8, 2026)
6cbf094 Update Grammar.hs (echatav, Feb 8, 2026)
394c2c8 Update Grammar.hs (echatav, Feb 8, 2026)
8b74db8 Update Grammar.hs (echatav, Feb 8, 2026)
c5fe85b Update Boole.hs (echatav, Feb 8, 2026)
26cd2b0 Caveat notes (echatav, Feb 12, 2026)
585702e Update Grammar.hs (echatav, Feb 12, 2026)
a640e69 Update Grammar.hs (echatav, Feb 12, 2026)
bf98817 Update Grammar.hs (echatav, Feb 12, 2026)
5affb7e imprism (echatav, Feb 16, 2026)
ad5bddd filtration & partiality clarity (echatav, Feb 16, 2026)
5042384 testing for Brzozowski Matching (echatav, Feb 16, 2026)
bd04c68 Update Monocle.hs (echatav, Feb 16, 2026)
1bcabc9 Update Filtrator.hs (echatav, Feb 16, 2026)
886859b Update Monocle.hs (echatav, Feb 16, 2026)
114b1f8 monomorphically (echatav, Feb 17, 2026)
0a30158 Update Wither.hs (echatav, Feb 17, 2026)
5f82d43 Update Grammar.hs (echatav, Feb 17, 2026)
fd1e9a2 Update NestedPrismTH.hs (echatav, Feb 17, 2026)
4979845 simplify replicateP (echatav, Feb 17, 2026)
9e19e1e Update Distributor.hs (echatav, Feb 17, 2026)
d0d8a56 Update Distributor.hs (echatav, Feb 17, 2026)
06aa886 Update Main.hs (echatav, Feb 17, 2026)
03068f4 Initial plan (Copilot, Feb 23, 2026)
f0ed836 Increment minor version to 0.4.0.0 (Copilot, Feb 23, 2026)
b96bfbe Correct version to 0.3.0.1 (Copilot, Feb 23, 2026)
7728fa0 Merge pull request #21 from morphismtech/copilot/sub-pr-20 (echatav, Feb 23, 2026)
3f4680a Initial plan (Copilot, Feb 24, 2026)
e5805a5 Fix spelling error: Doube -> Double in NestedPrismTH.hs comment (Copilot, Feb 24, 2026)
e7cf300 Merge pull request #22 from morphismtech/copilot/sub-pr-20 (echatav, Feb 24, 2026)
9b53de6 Update distributors.cabal (echatav, Feb 26, 2026)
01507df Update Distributor.hs (echatav, Feb 26, 2026)
3 changes: 1 addition & 2 deletions src/Control/Lens/Diopter.hs
@@ -41,8 +41,7 @@ type Diopter s t a b = forall p f.
(Distributor p, Applicative f)
=> p a (f b) -> p s (f t)

{- | If you see `ADiopter` in a signature for a function,
the function is expecting a `Diopter`. -}
{- | `ADiopter` is monomorphically a `Diopter`. -}
type ADiopter s t a b =
Dioptrice a b a (Identity b) -> Dioptrice a b s (Identity t)

83 changes: 58 additions & 25 deletions src/Control/Lens/Grammar.hs
@@ -23,7 +23,7 @@ module Control.Lens.Grammar
, RegBnf (..)
, regbnfG
, regbnfGrammar
-- * Context-sensitive grammar
-- * Unrestricted, context-sensitive grammar
, CtxGrammar
, printG
, parseG
@@ -57,7 +57,7 @@ import Witherable
A regular grammar may be constructed using
`Lexical` and `Alternator` combinators.
Let's see an example using
[semantic versioning](https://semver.org/).
[semantic versioning](https://semver.org/) syntax.

>>> import Numeric.Natural (Natural)
>>> :{
@@ -78,7 +78,8 @@ Unfortunately, we can't use TemplateHaskell to generate it in [GHCi]
which is used to test this documentation.
Normally we would write `makeNestedPrisms` @''SemVer@,
but here is the equivalent explicit Haskell code instead.
Since @SemVer@ is a newtype, @_SemVer@ can be an `Control.Lens.Iso.Iso`.
Since @SemVer@ has only one constructor,
@_SemVer@ can be an `Control.Lens.Iso.Iso`.

>>> :set -XRecordWildCards
>>> import Control.Lens (Iso', iso)
@@ -221,7 +222,7 @@ arithGrammar = ruleRec "arith" sumG
_Num . iso show read >? someP (asIn @Char DecimalNumber)
:}

We can generate a `RegBnf`, printers and parsers from @arithGrammar@.
We can generate grammar strings, printers and parsers from @arithGrammar@.

>>> putStringLn (regbnfG arithGrammar)
{start} = \q{arith}
@@ -230,34 +231,28 @@ We can generate a `RegBnf`, printers and parsers from @arithGrammar@.
{number} = \p{Nd}+
{product} = \q{factor}(\*\q{factor})*
{sum} = \q{product}(\+\q{product})*

>>> [x | (x,"") <- parseG arithGrammar "1+2*3+4"]
[Add (Add (Num 1) (Mul (Num 2) (Num 3))) (Num 4)]
>>> unparseG arithGrammar (Add (Num 1) (Mul (Num 2) (Num 3))) "" :: Maybe String
Just "1+2*3"
>>> do pr <- printG arithGrammar (Num 69); return (pr "") :: Maybe String
Just "69"

If all `rule`s are non-recursive, then a `Grammar`
can be rewritten as a `RegGrammar`.
Since Haskell permits general recursion, and `RegGrammar`s are
embedded in Haskell, you can define context-free grammars with them.
But it's recommended to use `Grammar`s for `rule` abstraction
and generator support for `ruleRec`.

-}
type Grammar token a = forall p.
( Lexical token p
, forall x. BackusNaurForm (p x x)
, Alternator p
) => p a a

{- |
In addition to context-sensitivity via `Monadic` combinators,
`CtxGrammar`s adds general filtration via `Filtrator` to `Grammar`s.

>>> :{
palindromeG :: CtxGrammar Char String
palindromeG = rule "palindrome" $
satisfied (\wrd -> reverse wrd == wrd) >?< manyP (anyToken @Char)
:}

The `satisfied` pattern is used together with the `Choice` &
`Data.Profunctor.Cochoice` applicator `>?<` for general filtration.
For context-sensitivity,
{- | For context-sensitivity,
the `Monadic` interface is used by importing "Data.Profunctor.Monadic"
qualified and using a "bonding" notation which mixes
"idiom" style with qualified do-notation.
@@ -290,16 +285,16 @@ The qualified do-notation changes the signature of
@P.@`Data.Profunctor.Monadic.>>=`,
so that we must apply the constructor pattern @_LenVec@
to the do-block with the `>?` applicator.
Any bound named variable, @var <- action@,
Any scoped bound action, @var <- action@,
gets "bonded" to the constructor pattern.
Any unbound action, except for the last action in the do-block,
does not get bonded to the pattern.
The last action does get bonded to the pattern.
Any unnamed bound action, @_ <- action@,
Any unscoped bound action, @_ <- action@,
also gets bonded to the pattern,
but being unnamed means it isn't added to the context.
If all bound actions are unnamed, then a `CtxGrammar` can
be rewritten as a `Grammar` since it is context-free.
but being unscoped means it isn't added to the context.
If all bound actions are unscoped, and filtration isn't used,
then a `CtxGrammar` can be rewritten as a `Grammar` since it is context-free.
We can't generate a `RegBnf` since the `rule`s
of a `CtxGrammar` aren't static, but dynamic and contextual.
We can generate parsers and printers as expected.
@@ -312,8 +307,28 @@ We can generate parsers and printers as expected.
["2;6,7"]
>>> [pr "" | pr <- printG lenvecGrammar (LenVec 200 [100])] :: [String]
[]

In addition to context-sensitivity via `Monadic` combinators,
`CtxGrammar`s add unrestricted filtration to `Grammar`s.
The `satisfy` combinator is an unrestricted token filter.
And the `satisfied` pattern is used together with the `Choice` &
`Data.Profunctor.Cochoice` applicator `>?<` for unrestricted filtration.

>>> :{
palindromeG :: CtxGrammar Char String
palindromeG = rule "palindrome" $
satisfied (\wrd -> reverse wrd == wrd) >?< manyP (anyToken @Char)
:}

>>> [pal | word <- ["racecar", "word"], (pal, "") <- parseG palindromeG word]
["racecar"]

Since `CtxGrammar`s are embedded in Haskell, permitting computable predicates,
and `Filtrator` has a default definition for `Monadic` `Alternator`s,
the context-sensitivity of `CtxGrammar` implies
unrestricted filtration of grammars by computable predicates,
which can recognize the class of recursively enumerable languages.

-}
type CtxGrammar token a = forall p.
( Lexical token p
Expand All @@ -336,8 +351,9 @@ type Lexical token p =
) :: Constraint

{- | `RegString`s are an embedded domain specific language
of regular expression strings. Since they are strings,
they have a string-like interface.
of regular expression strings.

Since they are strings, they have a string-like interface.

>>> let rex = fromString "ab|c" :: RegString
>>> putStringLn rex
@@ -442,13 +458,23 @@ newtype RegString = RegString {runRegString :: RegEx Char}

{- | `RegBnf`s are an embedded domain specific language
of Backus-Naur forms extended by regular expression strings.

A `RegBnf` consists of a distinguished `RegString` "start" rule,
and a set of named `RegString` `rule`s.

>>> putStringLn (rule "baz" (terminal "foo" >|< terminal "bar") :: RegBnf)
{start} = \q{baz}
{baz} = foo|bar

Like `RegString`s they have a string-like interface.

>>> let bnf = fromString "{start} = foo|bar" :: RegBnf
>>> putStringLn bnf
{start} = foo|bar
>>> bnf
"{start} = foo|bar"
>>> :type toList bnf
toList bnf :: [Char]

`RegBnf`s can be generated from context-free `Grammar`s with `regbnfG`.

@@ -458,6 +484,13 @@ regbnfG regbnfGrammar :: RegBnf
Like `RegString`s, `RegBnf`s can be constructed using
`Lexical`, `Monoid` and `KleeneStarAlgebra` combinators.
But they also support `BackusNaurForm` `rule`s and `ruleRec`s.

>>> putStringLn (rule "baz" (bnf >|< terminal "baz"))
{start} = \q{baz}
{baz} = foo|bar|baz
>>> putStringLn (ruleRec "∞" (\x -> x) :: RegBnf)
{start} = \q{∞}
{∞} = \q{∞}
-}
newtype RegBnf = RegBnf {runRegBnf :: Bnf RegString}
deriving newtype
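The filtration idea in the `CtxGrammar` documentation — run a context-free parser, then keep only the results a computable predicate accepts — can be sketched with base's `Text.ParserCombinators.ReadP`, independently of this library. The helper `satisfiedP` below is a hypothetical stand-in for the `satisfied … >?<` pattern, not this package's API.

```haskell
import Text.ParserCombinators.ReadP

-- Hypothetical analogue of `satisfied pred >?< p`: run a parser,
-- then discard any result that fails the predicate.
satisfiedP :: (a -> Bool) -> ReadP a -> ReadP a
satisfiedP predicate p = do
  x <- p
  if predicate x then pure x else pfail

-- A palindrome recognizer, mirroring the palindromeG doctest above.
palindromeP :: ReadP String
palindromeP = satisfiedP (\wrd -> reverse wrd == wrd) (munch (const True))
```

Running `[x | (x, "") <- readP_to_S palindromeP "racecar"]` yields `["racecar"]`, while `"word"` yields no parses — the same recognize-then-filter behavior the doctest demonstrates.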
4 changes: 2 additions & 2 deletions src/Control/Lens/Grammar/Boole.hs
@@ -74,8 +74,8 @@ allB f = foldl' (\b a -> b >&&< f a) (fromBool True)
anyB :: (Foldable f, BooleanAlgebra b) => (a -> b) -> f a -> b
anyB f = foldl' (\b a -> b >||< f a) (fromBool False)

-- | `TokenTest` forms a closed `Tokenized` `BooleanAlgebra`
-- of `Categorized` `tokenClass`es.
-- | `TokenTest` forms a closed `Tokenized` `BooleanAlgebra`,
-- for use as an argument to `tokenClass`.
newtype TokenTest token = TokenTest (RegExam token (TokenTest token))

-- | `TokenAlgebra` extends `Tokenized` methods to support
2 changes: 1 addition & 1 deletion src/Control/Lens/Grammar/Kleene.hs
@@ -63,7 +63,7 @@ class Monoid k => KleeneStarAlgebra k where
orK :: (Foldable f, KleeneStarAlgebra k) => f k -> k
orK = foldl' (>|<) zeroK

-- | universal
-- | existential
anyK :: (Foldable f, KleeneStarAlgebra k) => (a -> k) -> f a -> k
anyK f = foldl' (\b a -> b >|< f a) zeroK

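To see the `orK`/`anyK` folds in action, here is a dependency-free toy model of a `KleeneStarAlgebra`: languages as membership predicates on `String`s. The class is re-declared locally from what the diff shows (`zeroK`, `>|<`, and the `Monoid` superclass); the star method's name `starK` is an assumption for the sketch, not necessarily this package's.

```haskell
import Data.List (foldl')

-- Toy re-declaration of the class; only zeroK and >|< appear in
-- the diff above, so starK is an assumed method name.
class Monoid k => KleeneStarAlgebra k where
  zeroK :: k
  (>|<) :: k -> k -> k
  starK :: k -> k

orK :: (Foldable f, KleeneStarAlgebra k) => f k -> k
orK = foldl' (>|<) zeroK

-- Languages as membership predicates.
newtype Lang = Lang { member :: String -> Bool }

splits :: String -> [(String, String)]
splits w = [splitAt i w | i <- [0 .. length w]]

instance Semigroup Lang where
  a <> b = Lang $ \w ->
    any (\(u, v) -> member a u && member b v) (splits w)

instance Monoid Lang where
  mempty = Lang null  -- only the empty string

instance KleeneStarAlgebra Lang where
  zeroK = Lang (const False)  -- the empty language
  Lang f >|< Lang g = Lang (\w -> f w || g w)
  starK a = Lang star where
    star w = null w ||
      any (\(u, v) -> not (null u) && member a u && star v) (splits w)

char :: Char -> Lang
char c = Lang (== [c])
```

With this instance, `member (starK (char 'a' >|< char 'b')) "abba"` is `True`, and `orK [char 'x', char 'y']` folds alternation over a list exactly as the definition above describes.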
3 changes: 1 addition & 2 deletions src/Control/Lens/Grate.hs
@@ -48,8 +48,7 @@ type Grate s t a b = forall p f.
(Closed p, Monoidal p, Distributive f, Applicative f)
=> p a (f b) -> p s (f t)

{- | If you see `AGrate` in a signature for a function,
the function is expecting a `Grate`. -}
{- | `AGrate` is monomorphically a `Grate`. -}
type AGrate s t a b =
Grating a b a (Identity b) -> Grating a b s (Identity t)

41 changes: 30 additions & 11 deletions src/Control/Lens/Internal/NestedPrismTH.hs
@@ -33,21 +33,40 @@ import qualified Data.Set as Set
import Data.Set (Set)
import Prelude

-- | Generate a `Control.Lens.Prism.Prism`
-- | Similar to `Control.Lens.Internal.PrismTH.makePrisms`,
-- `makeNestedPrisms` generates a `Control.Lens.Prism.Prism`
-- for each constructor of a data type.
-- `Control.Lens.Iso.Iso`s generated when possible.
-- `Control.Lens.Review.Review`s are created for constructors with existentially
-- quantified constructors and GADTs.
--
-- See `Control.Lens.Internal.PrismTH.makePrisms` for details and examples.
-- `Control.Lens.Iso.Iso`s are generated when possible.
-- `Control.Lens.Review.Review`s are generated for constructors
-- with existentially quantified constructors and GADTs.
-- The difference in `makeNestedPrisms`
-- is that constructors with @n > 2@ arguments
-- will use right-nested pairs, rather than a flat @n@-tuple.
-- This makes them suitable for use on the left-hand-side of
-- `Control.Lens.PartialIso.>~`,
-- `Control.Lens.PartialIso.>?` and `Control.Lens.PartialIso.>?<`;
-- with repeated use of `Data.Profunctor.Distributor.>*<`
-- on the right-hand-side, resulting in right-nested pairs.
-- This makes them suitable for bonding,
-- by use of the applicator `Control.Lens.PartialIso.>?`
-- to `Data.Profunctor.Monoidal.Monoidal` idiom notation
-- with `Data.Profunctor.Monoidal.>*<`,
-- or to `Data.Profunctor.Monadic.Monadic` qualified do-notation.
--
-- /e.g./
--
-- @
-- data FooBarBazBux a
-- = Foo Int
-- | Bar a
-- | Baz Int Char
-- | Bux Double String Bool
-- makeNestedPrisms ''FooBarBazBux
-- @
--
-- will create
--
-- @
-- _Foo :: Prism' (FooBarBazBux a) Int
-- _Bar :: Prism (FooBarBazBux a) (FooBarBazBux b) a b
-- _Baz :: Prism' (FooBarBazBux a) (Int, Char)
-- _Bux :: Prism' (FooBarBazBux a) (Double, (String, Bool))
-- @
makeNestedPrisms :: Name -> DecsQ
makeNestedPrisms typeName =
do info <- D.reifyDatatype typeName
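The right-nested-pair shape can be seen in a dependency-free sketch. Below, `previewBux`/`reviewBux` are hand-written stand-ins for the two directions of the `_Bux` prism that `makeNestedPrisms` would generate; the names and the plain-function encoding are illustrative, not generated code.

```haskell
-- The example data type from the documentation above.
data FooBarBazBux a
  = Foo Int
  | Bar a
  | Baz Int Char
  | Bux Double String Bool

-- _Bux matches against a right-nested pair (Double, (String, Bool)),
-- not a flat triple (Double, String, Bool).
previewBux :: FooBarBazBux a -> Maybe (Double, (String, Bool))
previewBux (Bux d s b) = Just (d, (s, b))
previewBux _           = Nothing

reviewBux :: (Double, (String, Bool)) -> FooBarBazBux a
reviewBux (d, (s, b)) = Bux d s b
```

Right-nesting means each `>*<` on the right-hand side peels off exactly one component of the pair, which is why these prisms slot directly into `>?`-applied idiom notation.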
14 changes: 12 additions & 2 deletions src/Control/Lens/Monocle.hs
@@ -19,6 +19,7 @@ module Control.Lens.Monocle
, monocle
, withMonocle
, cloneMonocle
, imprism
, mapMonocle
, ditraversed
, forevered
@@ -42,8 +43,7 @@ type Monocle s t a b = forall p f.
(Monoidal p, Applicative f)
=> p a (f b) -> p s (f t)

{- | If you see `AMonocle` in a signature for a function,
the function is expecting a `Monocle`. -}
{- | `AMonocle` is monomorphically a `Monocle`. -}
type AMonocle s t a b =
Monocular a b a (Identity b) -> Monocular a b s (Identity t)

@@ -61,6 +61,16 @@ monomorphically typed `Monocle` for different purposes.
cloneMonocle :: AMonocle s t a b -> Monocle s t a b
cloneMonocle mon = unwrapPafb . mapMonocle mon . WrapPafb

{- | Convert a `Monocle` to an improper `Control.Lens.Prism.Prism`.

>>> review (imprism ditraversed) 1 :: Complex Int
1 :+ 1
>>> preview (imprism ditraversed) (1 :+ 2 :: Complex Int)
Just 1
-}
imprism :: Monocle s t a b -> Prism s t a b
imprism mon = clonePrism mon

{- | Build a `Monocle` from a `Traversable` & `Distributive`,
homogeneous, countable product.

5 changes: 2 additions & 3 deletions src/Control/Lens/PartialIso.hs
@@ -117,8 +117,7 @@ some equivalence class of terms.
-}
type PartialIso' s a = PartialIso s s a a

{- | If you see `APartialIso` in a signature for a function,
the function is expecting a `PartialIso`. -}
{- | `APartialIso` is monomorphically a `PartialIso`. -}
type APartialIso s t a b =
PartialExchange a b a (Maybe b) -> PartialExchange a b s (Maybe t)

@@ -222,7 +221,7 @@ infixl 4 >?
(?<) pat = withPrism pat $ \f g -> unright . dimap (either id f) g
infixl 4 ?<

{- | Action of `APartialIso` on `Choice` and `Cochoice` `Profunctor`s. -}
{- | Action of `APartialIso` on `Choice` & `Cochoice` partial profunctors. -}
(>?<)
:: (Choice p, Cochoice p)
=> APartialIso s t a b
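The shape behind `APartialIso` can be sketched without the library: a partial forward conversion paired with a total backward one. The `SimplePartialIso` type and `shownInt` below are illustrative stand-ins for this package's `PartialExchange`, not its actual definitions.

```haskell
import Text.Read (readMaybe)

-- A partial iso as a pair of conversions: an s may fail to be an a,
-- but every a embeds back into s (compare PartialExchange above).
data SimplePartialIso s a =
  SimplePartialIso (s -> Maybe a) (a -> s)

-- Digit strings <-> Int, the kind of partial iso a grammar uses for
-- number literals (compare `_Num . iso show read >? someP ...` above).
shownInt :: SimplePartialIso String Int
shownInt = SimplePartialIso readMaybe show
```

The characteristic law is `f (g a) == Just a` for `SimplePartialIso f g`: printing then parsing round-trips, while parsing a malformed string fails cleanly with `Nothing`.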
4 changes: 1 addition & 3 deletions src/Control/Lens/Wither.hs
@@ -37,7 +37,6 @@ import Witherable

{- | `Wither`s extend `Control.Lens.Traversal.Traversal`s by filtering.


Every one of the following is a `Wither`.

* `Control.Lens.Iso.Iso`
@@ -48,8 +47,7 @@ Every one of the following is a `Wither`.
-}
type Wither s t a b = forall f. Alternative f => (a -> f b) -> s -> f t

{- | If you see `AWither` in a signature for a function,
the function is expecting a `Wither`. -}
{- | `AWither` is monomorphically a `Wither`. -}
type AWither s t a b = (a -> Altar a b b) -> s -> Altar a b t

{- | `Witheroid`s generalize `Wither`s.