fg – Graph Container [doc TODO]#
FunctionGraph#
- class pytensor.graph.fg.FunctionGraph(inputs=None, outputs=None, features=None, clone=True, update_mapping=None, **clone_kwds)[source]#
A FunctionGraph represents a subgraph bound by a set of input variables and a set of output variables, i.e. a subgraph that specifies a PyTensor function. The inputs list should contain all the inputs on which the outputs depend. Variables of type Constant are not counted as inputs.
The FunctionGraph supports the replace operation, which allows a variable in the subgraph to be replaced by another, e.g. replacing (x + x).out by (2 * x).out. This is the basis for optimization in PyTensor.
This class is also responsible for verifying that a graph is valid (i.e. that all the dtypes and broadcast patterns are compatible with the way the Variables are used) and for tracking the Variables with a FunctionGraph.clients dict that specifies which Apply nodes use the Variable. The FunctionGraph.clients field, combined with Variable.owner and each Apply.inputs, allows the graph to be traversed in both directions.
It can also be extended with new features using FunctionGraph.attach_feature(). See Feature for event types and documentation. Extra features allow the FunctionGraph to verify new properties of a graph as it is optimized.
The constructor creates a FunctionGraph that operates on the subgraph bound by the inputs and outputs sets.
This class keeps lists of the inputs and outputs and modifies them in-place.
Note
FunctionGraph(inputs, outputs) clones the inputs by default. To avoid this behavior, pass the parameter clone=False. Cloning is the default because cached constants should not end up in a FunctionGraph.
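A minimal usage sketch (assuming the standard pytensor.tensor API; the variable names are illustrative, not part of this class's interface):

```python
import pytensor.tensor as pt
from pytensor.graph.fg import FunctionGraph

# Build a tiny graph computing x + x.
x = pt.scalar("x")
out = x + x

# Wrap it in a FunctionGraph. clone=False keeps the original variables,
# so x and out can be compared by identity with the graph's contents.
fgraph = FunctionGraph(inputs=[x], outputs=[out], clone=False)

print(fgraph.inputs)      # the input variables
print(fgraph.outputs)     # the output variables
print(fgraph.clients[x])  # (Apply node, input index) pairs that use x
print(fgraph.toposort())  # Apply nodes in dependency order
fgraph.dprint()           # debug-print the whole graph
```

With the default clone=True, fgraph.inputs would contain clones of x rather than x itself.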
- add_input(var, check=True)[source]#
Add a new variable as an input to this FunctionGraph.
- Parameters:
var (pytensor.graph.basic.Variable) – The variable to be added as an input.
- add_output(var, reason=None, import_missing=False)[source]#
Add a new variable as an output to this FunctionGraph.
- attach_feature(feature)[source]#
Add a graph.features.Feature to this function graph and trigger its on_attach callback.
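For instance, one of the built-in features listed at the bottom of this page can be attached as follows (a sketch, continuing from the constructor example above):

```python
from pytensor.graph.features import ReplaceValidate

# Attaching the feature triggers its on_attach callback on this FunctionGraph.
fgraph.attach_feature(ReplaceValidate())
```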
- change_node_input(node, i, new_var, reason=None, import_missing=False, check=True)[source]#
Change node.inputs[i] to new_var.
new_var.type.is_super(old_var.type) must be True, where old_var is the current value of node.inputs[i] that we want to replace.
For each feature that has an on_change_input method, this method calls:
feature.on_change_input(function_graph, node, i, old_var, new_var, reason)
- Parameters:
node – The node for which an input is to be changed.
i – The index in node.inputs that we want to change.
new_var – The new variable to take the place of node.inputs[i].
import_missing – Add missing inputs instead of raising an exception.
check – When True, perform a type check between the variable being replaced and its replacement. This option exists primarily for the History Feature, which needs to revert types that have been narrowed and would otherwise fail this check.
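A sketch of a direct input change (illustrative names; import_missing=True is passed because y is a brand-new variable that is not yet an input of the graph):

```python
import pytensor.tensor as pt
from pytensor.graph.fg import FunctionGraph

x = pt.scalar("x")
y = pt.scalar("y")
fgraph = FunctionGraph(inputs=[x], outputs=[x + x], clone=False)

add_node = fgraph.outputs[0].owner
# Turn x + x into x + y by changing the add node's second input.
# y has no owner and is not yet an input, so import_missing=True adds it.
fgraph.change_node_input(add_node, 1, y, reason="example", import_missing=True)

print(fgraph.inputs)  # now includes y
fgraph.dprint()
```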
- clone_get_equiv(check_integrity=True, attach_feature=True, **kwargs)[source]#
Clone the graph and return a dict that maps old nodes to new nodes.
- Parameters:
check_integrity – Whether or not to check the resulting graph’s integrity.
attach_feature – Whether or not to attach self’s features to the cloned graph.
- Returns:
e – The cloned FunctionGraph. Every node in the cloned graph is cloned.
equiv – A dict that maps old nodes to the new nodes.
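A sketch of how the two return values might be used together (assuming the illustrative graph from the constructor example):

```python
import pytensor.tensor as pt
from pytensor.graph.fg import FunctionGraph

x = pt.scalar("x")
fgraph = FunctionGraph(inputs=[x], outputs=[x + x], clone=False)

cloned_fgraph, equiv = fgraph.clone_get_equiv()
# equiv maps each original variable and Apply node to its clone.
print(equiv[x])              # the cloned input variable
print(cloned_fgraph.inputs)  # contains equiv[x], not the original x
```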
- collect_callbacks(name, *args)[source]#
Collect callbacks.
Returns a dictionary d such that d[feature] == getattr(feature, name)(*args) for each feature that has a method with the given name.
- dprint(**kwargs)[source]#
Debug-print this FunctionGraph.
- Parameters:
kwargs – Optional keyword arguments to pass to the debugprint function.
- execute_callbacks(name, *args, **kwargs)[source]#
Execute callbacks.
Calls getattr(feature, name)(*args) for each feature that has a method with the given name.
- get_output_client(i)[source]#
Get the dummy Output Op client corresponding to output i.
Raises a lookup error if it is not found.
- import_node(apply_node, check=True, reason=None, import_missing=False)[source]#
Recursively import everything between an Apply node and the FunctionGraph’s outputs.
- Parameters:
apply_node (Apply) – The node to be imported.
check (bool) – Check that the inputs for the imported nodes are also present in the FunctionGraph.
reason (str) – The name of the optimization or operation in progress.
import_missing (bool) – Add missing inputs instead of raising an exception.
- import_var(var, reason=None, import_missing=False)[source]#
Import a Variable into this FunctionGraph.
This will import the var’s Apply node and inputs.
var (pytensor.graph.basic.Variable) – The variable to be imported.
reason (str) – The name of the optimization or operation in progress.
import_missing (bool) – Add missing inputs instead of raising an exception.
- orderings()[source]#
Return a map of node to node evaluation dependencies.
Each key node is mapped to a list of nodes that must be evaluated before the key node can be evaluated.
This is used primarily by the DestroyHandler Feature to ensure that the clients of any destroyed inputs have already computed their outputs.
Notes
This only calls the Feature.orderings() method of each Feature attached to the FunctionGraph. It does not compute the dependencies by itself.
- remove_client(var, client_to_remove, reason=None, remove_if_empty=False)[source]#
Recursively remove clients of a variable.
This is the main method to remove variables or Apply nodes from a FunctionGraph.
This will remove var from the FunctionGraph if it doesn’t have any clients remaining. If var has an owner and none of the owner’s outputs have any clients left, the owner node will be removed as well.
- Parameters:
var – The variable whose client is being removed.
client_to_remove – A (node, i) pair such that node.inputs[i] will no longer be var in this FunctionGraph.
remove_if_empty – When True, if var’s Apply node is removed, remove the entry for var in self.clients.
- remove_feature(feature)[source]#
Remove a feature from the graph.
Calls feature.on_detach(function_graph) if an on_detach method is defined.
- remove_input(input_idx, reason=None)[source]#
Remove the input at index input_idx.
Any node that depended on that input will also be removed.
- remove_node(node, reason=None)[source]#
Remove an Apply node from the FunctionGraph.
This will remove everything that depends on the outputs of node, as well as any “orphaned” variables and nodes created by node’s removal.
- remove_output(output_idx, reason=None, remove_client=True)[source]#
Remove the output at index output_idx and update the indices in the client entries.
FunctionGraph.clients contains entries like (output(i)(var), 0) under each output variable in FunctionGraph.outputs. The i values correspond to each output’s location within the FunctionGraph.outputs list, so, when an output is removed from the graph, all of these entries need to be updated. This method performs those updates.
- replace(var, new_var, reason=None, verbose=None, import_missing=False)[source]#
Replace a variable in the FunctionGraph.
This is the main interface for manipulating the subgraph in a FunctionGraph. For every node that uses var as an input, make it use new_var instead.
- Parameters:
var – The variable to be replaced.
new_var – The variable to replace var.
reason – The name of the optimization or operation in progress.
verbose – Print reason, var, and new_var.
import_missing – Import missing variables.
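A sketch of the x + x to 2 * x replacement mentioned in the class description (illustrative names):

```python
import pytensor.tensor as pt
from pytensor.graph.fg import FunctionGraph

x = pt.scalar("x")
fgraph = FunctionGraph(inputs=[x], outputs=[x + x], clone=False)

old_out = fgraph.outputs[0]  # the x + x variable
new_out = 2 * x              # built from a variable already in the graph
# Every client of old_out (here only the dummy output client) now uses new_out.
fgraph.replace(old_out, new_out, reason="example rewrite")

fgraph.dprint()              # the graph now computes 2 * x
```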
- replace_all(pairs, **kwargs)[source]#
Replace variables in the FunctionGraph according to (var, new_var) pairs in a list.
- setup_var(var)[source]#
Set up a variable so that it belongs to this FunctionGraph.
- Parameters:
var (pytensor.graph.basic.Variable) – The variable to set up.
- toposort()[source]#
Return a toposorted list of the nodes.
Return an ordering of the graph’s Apply nodes such that:
- the owners of a node’s inputs appear before that node, and
- the additional orderings provided by FunctionGraph.orderings() are satisfied.
FunctionGraph Features#
- class pytensor.graph.features.Feature[source]#
Base class for FunctionGraph extensions.
A Feature is an object with several callbacks that are triggered by various operations on FunctionGraphs. It can be used to enforce graph properties at all stages of graph optimization.
See also
pytensor.graph.features for common extensions.
- clone()[source]#
Create a clone that can be attached to a new FunctionGraph.
This default implementation returns self, which carries the assumption that the Feature is essentially stateless. If a subclass has state of its own that is in any way tied to a given FunctionGraph, this method should be overridden with an implementation that actually creates a fresh copy.
- on_attach(fgraph)[source]#
Called by FunctionGraph.attach_feature, the method that attaches the feature to the FunctionGraph. Since this is called after the FunctionGraph is initially populated, this is where you should run checks on the initial contents of the FunctionGraph.
The on_attach method may raise the AlreadyThere exception to cancel the attach operation if it detects that another Feature instance implementing the same functionality is already attached to the FunctionGraph.
The feature has great freedom in what it can do with the fgraph: it may, for example, add methods to it dynamically.
- on_change_input(fgraph, node, i, var, new_var, reason=None)[source]#
Called whenever node.inputs[i] is changed from var to new_var. By the time the callback is made, the change has already taken place.
If you raise an exception in this function, the state of the graph might be broken for all intents and purposes.
- on_detach(fgraph)[source]#
Called by FunctionGraph.remove_feature. Should remove any dynamically added functionality that it installed into the fgraph.
- on_import(fgraph, node, reason)[source]#
Called whenever a node is imported into fgraph, which is just before the node is actually connected to the graph.
Note: this is not called when the graph is created. If you want to detect the first nodes to be added to the graph, you should do so by implementing on_attach.
- on_prune(fgraph, node, reason)[source]#
Called whenever a node is pruned (removed) from the fgraph, after it is disconnected from the graph.
- orderings(fgraph)[source]#
Called by FunctionGraph.toposort. It should return a dictionary of {node: predecessors} where predecessors is a list of nodes that should be computed before the key node.
If you raise an exception in this function, the state of the graph might be broken for all intents and purposes.
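A minimal custom Feature sketch; the callback signatures follow the documentation above, while the ChangeLogger class itself is purely illustrative:

```python
import pytensor.tensor as pt
from pytensor.graph.features import Feature
from pytensor.graph.fg import FunctionGraph


class ChangeLogger(Feature):
    """Record every input change made to the FunctionGraph it is attached to."""

    def on_attach(self, fgraph):
        # Called once by FunctionGraph.attach_feature; set up per-graph state here.
        self.log = []

    def on_change_input(self, fgraph, node, i, var, new_var, reason=None):
        # Called after node.inputs[i] has already been changed to new_var.
        self.log.append((node, i, var, new_var, reason))


x = pt.scalar("x")
fgraph = FunctionGraph(inputs=[x], outputs=[x + x], clone=False)

logger = ChangeLogger()
fgraph.attach_feature(logger)  # triggers logger.on_attach(fgraph)

fgraph.replace(fgraph.outputs[0], 2 * x, reason="example rewrite")
print(logger.log)              # one entry per on_change_input callback
```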
FunctionGraph Feature List#
- ReplaceValidate
- DestroyHandler