This repository was archived by the owner on May 15, 2023. It is now read-only.
4 changes: 2 additions & 2 deletions deepmath/deephol/deephol_loop/prooflog_to_tfexamples_lib.py
@@ -24,7 +24,7 @@ def __init__(self, tactic_name_id_map: Dict[Text, int],
                 options: options_pb2.ConvertorOptions):
    """Initializer.

-    Arguments:
+    Args:
      tactic_name_id_map: mapping from tactic names to ids.
      theorem_database: database containing list of global theorems with splits
      options: options to control forbidden parameters and graph representations
@@ -97,7 +97,7 @@ def _extract_theorem_parameters(
    Note: it might be misleading to call these theorems. If the source is from
    an assumption, the theorem is of the form x |- x. We return x in this case.

-    Arguments:
+    Args:
      tactic_application: tactic application to extract the parameters from.

    Returns:
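Every hunk in this change makes the same one-word edit: the docstring section header `Arguments:` becomes `Args:`, the header expected by the Google Python style guide and by docstring parsers such as Sphinx's Napoleon extension. A minimal sketch of the resulting convention, using a hypothetical function that is not taken from this change:

```python
def scale(values, factor=2):
    """Multiply every element of a list by a constant.

    Args:
        values: list of numbers to scale.
        factor: multiplier applied to each element; 2 by default.

    Returns:
        A new list with each element multiplied by factor.
    """
    return [v * factor for v in values]
```

With `Args:` (rather than `Arguments:`), tools that parse Google-style docstrings recognize the parameter section and render it as a structured field list.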
2 changes: 1 addition & 1 deletion deepmath/deephol/io_util.py
@@ -200,7 +200,7 @@ def options_reader(options_proto, options_proto_path: Text,
                    overwrite: Optional[Text]):
  """Generic options reader, which can also be easily saved as protos.

-  Arguments:
+  Args:
    options_proto: Type of the options proto object.
    options_proto_path: Path to file containing an options_proto.
    overwrite: A string containing options proto object.
2 changes: 1 addition & 1 deletion deepmath/deephol/train/architectures.py
@@ -71,7 +71,7 @@ def _pad_to_multiple(value, size, axis, name=None):
 def wavenet_encoding(net, params, mode):
   """Embed a given input tensor using multiple wavenet blocks.

-  Arguments:
+  Args:
     net: input tensor of shape [batch, text_length, word_embedding_size]
     params: Hyperparameters.
     mode: Estimator mode.
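The hunk context above references `_pad_to_multiple(value, size, axis, name=None)`. Its implementation is not shown in this diff, but the length arithmetic behind padding a dimension up to a multiple, which the `2**num_layers` divisibility requirement in the wavenet docstrings depends on, can be sketched as follows. The helper name here is illustrative only, not the repo's code:

```python
def next_multiple(length, size):
    # Smallest multiple of `size` that is >= `length`. A pad-to-multiple
    # op would append (next_multiple(length, size) - length) zeros.
    remainder = length % size
    return length if remainder == 0 else length + (size - remainder)
```

For example, a sequence of length 10 processed by a block needing divisibility by 8 would be padded to length 16.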
4 changes: 2 additions & 2 deletions deepmath/deephol/train/wavenet.py
@@ -43,7 +43,7 @@ def wavenet_layer(inp,
   returned without reshaping, this allows a multilayer wavenet to be implemented
   by subsequent calls to wavenet_layer and rate=2.

-  Arguments:
+  Args:
     inp: input tensor
     depth: depth of the intermediate nonlinear activations before reduced.
     width: the width of the conv filter, 2 by default.
@@ -96,7 +96,7 @@ def wavenet_block(net,
                   keep_prob=1.0):
   """Stack many increasingly dilated wavenet layers together.

-  Arguments:
+  Args:
     net: input tensor, expected to be 4D to start [batch, text_length, 1, dim]
     num_layers: Number of wavenet layers to apply in the block, note that This
       requires the input text_length to be divisible by 2**num_layers.
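The `wavenet_block` docstring describes stacking width-2 layers whose dilation rates double at each layer, which is why `text_length` must be divisible by `2**num_layers`. A dependency-free sketch of that stacking pattern, where a simple average stands in for the learned convolution kernel and none of the names come from the repo:

```python
def dilated_pair_filter(x, rate):
    # Width-2 dilated filter: mix position t with position t - rate,
    # zero-padding on the left so the output length matches the input.
    padded = [0.0] * rate + x[:-rate]
    return [(a + b) / 2.0 for a, b in zip(x, padded)]

def wavenet_block_sketch(x, num_layers):
    # Apply num_layers filters with dilation rates 1, 2, 4, ..., giving
    # each output a receptive field of 2**num_layers input positions.
    if len(x) % (2 ** num_layers) != 0:
        raise ValueError("text_length must be divisible by 2**num_layers")
    for i in range(num_layers):
        x = dilated_pair_filter(x, rate=2 ** i)
    return x
```

Doubling the rate each layer grows the receptive field exponentially with depth, which is the core design choice of WaveNet-style encoders.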
4 changes: 2 additions & 2 deletions deepmath/guidance/wavenet.py
@@ -43,7 +43,7 @@ def wavenet_layer(inp,
   returned without reshaping, this allows a multilayer wavenet to be implemented
   by subsequent calls to wavenet_layer and rate=2.

-  Arguments:
+  Args:
     inp: input tensor
     depth: depth of the intermediate nonlinear activations before reduced.
     width: the width of the conv filter, 2 by default.
@@ -96,7 +96,7 @@ def wavenet_block(net,
                   keep_prob=1.0):
   """Stack many increasingly dilated wavenet layers together.

-  Arguments:
+  Args:
     net: input tensor, expected to be 4D to start [batch, text_length, 1, dim]
     num_layers: Number of wavenet layers to apply in the block, note that This
       requires the input text_length to be divisible by 2**num_layers.