Commit 86771d0

Author: Sanjoy Das (authored and committed)
Introduce a ConditionallySpeculatable op interface
This patch takes the first step towards a more principled modeling of undefined behavior in MLIR, as discussed in the following Discourse threads:

1. https://discourse.llvm.org/t/semantics-modeling-undefined-behavior-and-side-effects/4812
2. https://discourse.llvm.org/t/rfc-mark-tensor-dim-and-memref-dim-as-side-effecting/65729

In particular, this patch does the following:

1. Introduces a ConditionallySpeculatable OpInterface that dynamically determines whether an Operation can be speculated.
2. Redefines `NoSideEffect` to allow undefined behavior, making it necessary but not sufficient for speculation, and renames it to `NoMemoryEffect`.
3. Makes LICM respect the above semantics.
4. Changes all ops tagged with `NoSideEffect` today to additionally implement ConditionallySpeculatable and mark themselves as always speculatable. This combined trait is named `Pure`, which keeps this change NFC.

For out-of-tree dialects:

1. Replace `NoSideEffect` with `Pure` if the operation has no memory effects, undefined behavior, or infinite loops.
2. Replace `NoSideEffect` with `NoMemoryEffect` otherwise.

The next steps in this process (proposed as upcoming patches) are:

1. Update operations like `tensor.dim`, `memref.dim`, `scf.for`, and `affine.for` to implement a correct hook for `ConditionallySpeculatable`. I'm also happy to update ops in other dialects if the respective dialect owners would like to and can give me some pointers.
2. Update other passes that speculate operations to consult `ConditionallySpeculatable` in addition to `NoMemoryEffect`. I could not find any other than LICM on a quick skim, but I could have missed some.
3. Add some documentation / FAQs detailing the differences between side effects, undefined behavior, and speculatability.

Reviewed By: rriddle, mehdi_amini

Differential Revision: https://reviews.llvm.org/D135505
1 parent 2eaf6f9 · commit 86771d0
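To make the new trait vocabulary concrete, here is a minimal sketch of how a downstream dialect might annotate its operations after this change. The dialect, the op names (`mydialect.add`, `mydialect.extract`, `mydialect.dim`), and the `MyDialect_Op` base class are hypothetical and exist only for illustration; the sketch assumes the `Pure` and `NoMemoryEffect` traits, the `ConditionallySpeculatable` interface, and its `getSpeculatability()` hook as introduced or renamed by this patch.

```tablegen
include "mlir/IR/OpBase.td"
include "mlir/Interfaces/SideEffectInterfaces.td"

// Hypothetical dialect and op base class, for illustration only.
def MyDialect : Dialect {
  let name = "mydialect";
  let cppNamespace = "::mydialect";
}
class MyDialect_Op<string mnemonic, list<Trait> traits = []> :
    Op<MyDialect, mnemonic, traits>;

// No memory effects, no undefined behavior, no infinite loops:
// always speculatable, so the combined `Pure` trait applies.
def MyDialect_AddOp : MyDialect_Op<"add", [Pure]> {
  let summary = "hypothetical element-wise add";
}

// No memory effects, but possible undefined behavior (e.g. an
// out-of-bounds access): `NoMemoryEffect` alone, which is necessary
// but not sufficient for speculation.
def MyDialect_ExtractOp : MyDialect_Op<"extract", [NoMemoryEffect]> {
  let summary = "hypothetical extract with UB for bad indices";
}

// Speculatability depends on runtime information: list the
// `ConditionallySpeculatable` interface and implement its hook in C++.
def MyDialect_DimOp : MyDialect_Op<"dim",
    [ConditionallySpeculatable, NoMemoryEffect]> {
  let summary = "hypothetical dim-like op";
  let extraClassDeclaration = [{
    // Interface method for ConditionallySpeculatable; the C++ definition
    // would return Speculation::Speculatable only when the queried
    // dimension is known to be in bounds.
    ::mlir::Speculation::Speculatability getSpeculatability();
  }];
}
```

Under these markings, LICM (and any other pass later updated to consult the interface) could hoist `mydialect.add` out of a loop unconditionally, would ask `getSpeculatability()` before hoisting `mydialect.dim`, and would leave `mydialect.extract` in place, since `NoMemoryEffect` by itself no longer licenses speculation.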

84 files changed (+778, -508 lines)


mlir/docs/OpDefinitions.md

Lines changed: 1 addition & 1 deletion
@@ -106,7 +106,7 @@ An operation is defined by specializing the `Op` class with concrete contents
for all the fields it requires. For example, `tf.AvgPool` is defined as

```tablegen
-def TF_AvgPoolOp : TF_Op<"AvgPool", [NoSideEffect]> {
+def TF_AvgPoolOp : TF_Op<"AvgPool", [NoMemoryEffect]> {
  let summary = "Performs average pooling on the input.";

  let description = [{

mlir/docs/Tutorials/QuickstartRewrites.md

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@ operations are generated from. To define an operation one needs to specify:

```tablegen
def TFL_LeakyReluOp: TFL_Op<TFL_Dialect, "leaky_relu",
-    [NoSideEffect, SameValueType]>,
+    [NoMemoryEffect, SameValueType]>,
    Results<(outs Tensor)> {
  let arguments = (ins
    F32Tensor:$x,

mlir/docs/Tutorials/Toy/Ch-3.md

Lines changed: 2 additions & 2 deletions
@@ -144,10 +144,10 @@ eliminated. That is not ideal! What happened is that our pattern replaced the
last transform with the function input and left behind the now dead transpose
input. The Canonicalizer knows to clean up dead operations; however, MLIR
conservatively assumes that operations may have side-effects. We can fix this by
-adding a new trait, `NoSideEffect`, to our `TransposeOp`:
+adding a new trait, `NoMemoryEffect`, to our `TransposeOp`:

```tablegen
-def TransposeOp : Toy_Op<"transpose", [NoSideEffect]> {...}
+def TransposeOp : Toy_Op<"transpose", [NoMemoryEffect]> {...}
```

Let's retry now `toyc-ch3 test/transpose_transpose.toy -emit=mlir -opt`:

mlir/docs/Tutorials/Toy/Ch-4.md

Lines changed: 1 addition & 1 deletion
@@ -222,7 +222,7 @@ casts between two different shapes.
```tablegen
def CastOp : Toy_Op<"cast", [
    DeclareOpInterfaceMethods<CastOpInterface>,
-    NoSideEffect,
+    NoMemoryEffect,
    SameOperandsAndResultShape]
  > {
  let summary = "shape cast operation";

mlir/examples/standalone/include/Standalone/StandaloneOps.td

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ include "Standalone/StandaloneDialect.td"
include "mlir/Interfaces/InferTypeOpInterface.td"
include "mlir/Interfaces/SideEffectInterfaces.td"

-def Standalone_FooOp : Standalone_Op<"foo", [NoSideEffect,
+def Standalone_FooOp : Standalone_Op<"foo", [Pure,
                                             SameOperandsAndResultType]> {
  let summary = "Illustrates how to define an operation.";
  let description = [{

mlir/examples/toy/Ch2/include/toy/Ops.td

Lines changed: 3 additions & 3 deletions
@@ -43,9 +43,9 @@ class Toy_Op<string mnemonic, list<Trait> traits = []> :

// We define a toy operation by inheriting from our base 'Toy_Op' class above.
// Here we provide the mnemonic and a list of traits for the operation. The
-// constant operation is marked as 'NoSideEffect' as it is a pure operation
+// constant operation is marked as 'Pure' as it is a pure operation
// and may be removed if dead.
-def ConstantOp : Toy_Op<"constant", [NoSideEffect]> {
+def ConstantOp : Toy_Op<"constant", [Pure]> {
  // Provide a summary and description for this operation. This can be used to
  // auto-generate documentation of the operations within our dialect.
  let summary = "constant";
@@ -265,7 +265,7 @@ def ReshapeOp : Toy_Op<"reshape"> {
// ReturnOp
//===----------------------------------------------------------------------===//

-def ReturnOp : Toy_Op<"return", [NoSideEffect, HasParent<"FuncOp">,
+def ReturnOp : Toy_Op<"return", [Pure, HasParent<"FuncOp">,
                                 Terminator]> {
  let summary = "return operation";
  let description = [{

mlir/examples/toy/Ch3/include/toy/Ops.td

Lines changed: 7 additions & 7 deletions
@@ -42,9 +42,9 @@ class Toy_Op<string mnemonic, list<Trait> traits = []> :

// We define a toy operation by inheriting from our base 'Toy_Op' class above.
// Here we provide the mnemonic and a list of traits for the operation. The
-// constant operation is marked as 'NoSideEffect' as it is a pure operation
+// constant operation is marked as 'Pure' as it is a pure operation
// and may be removed if dead.
-def ConstantOp : Toy_Op<"constant", [NoSideEffect]> {
+def ConstantOp : Toy_Op<"constant", [Pure]> {
  // Provide a summary and description for this operation. This can be used to
  // auto-generate documentation of the operations within our dialect.
  let summary = "constant";
@@ -88,7 +88,7 @@ def ConstantOp : Toy_Op<"constant", [NoSideEffect]> {
// AddOp
//===----------------------------------------------------------------------===//

-def AddOp : Toy_Op<"add", [NoSideEffect]> {
+def AddOp : Toy_Op<"add", [Pure]> {
  let summary = "element-wise addition operation";
  let description = [{
    The "add" operation performs element-wise addition between two tensors.
@@ -199,7 +199,7 @@ def GenericCallOp : Toy_Op<"generic_call"> {
// MulOp
//===----------------------------------------------------------------------===//

-def MulOp : Toy_Op<"mul", [NoSideEffect]> {
+def MulOp : Toy_Op<"mul", [Pure]> {
  let summary = "element-wise multiplication operation";
  let description = [{
    The "mul" operation performs element-wise multiplication between two
@@ -239,7 +239,7 @@ def PrintOp : Toy_Op<"print"> {
// ReshapeOp
//===----------------------------------------------------------------------===//

-def ReshapeOp : Toy_Op<"reshape", [NoSideEffect]> {
+def ReshapeOp : Toy_Op<"reshape", [Pure]> {
  let summary = "tensor reshape operation";
  let description = [{
    Reshape operation is transforming its input tensor into a new tensor with
@@ -267,7 +267,7 @@ def ReshapeOp : Toy_Op<"reshape", [NoSideEffect]> {
// ReturnOp
//===----------------------------------------------------------------------===//

-def ReturnOp : Toy_Op<"return", [NoSideEffect, HasParent<"FuncOp">,
+def ReturnOp : Toy_Op<"return", [Pure, HasParent<"FuncOp">,
                                 Terminator]> {
  let summary = "return operation";
  let description = [{
@@ -309,7 +309,7 @@ def ReturnOp : Toy_Op<"return", [NoSideEffect, HasParent<"FuncOp">,
// TransposeOp
//===----------------------------------------------------------------------===//

-def TransposeOp : Toy_Op<"transpose", [NoSideEffect]> {
+def TransposeOp : Toy_Op<"transpose", [Pure]> {
  let summary = "transpose operation";

  let arguments = (ins F64Tensor:$input);

mlir/examples/toy/Ch4/include/toy/Ops.td

Lines changed: 8 additions & 8 deletions
@@ -45,9 +45,9 @@ class Toy_Op<string mnemonic, list<Trait> traits = []> :

// We define a toy operation by inheriting from our base 'Toy_Op' class above.
// Here we provide the mnemonic and a list of traits for the operation. The
-// constant operation is marked as 'NoSideEffect' as it is a pure operation
+// constant operation is marked as 'Pure' as it is a pure operation
// and may be removed if dead.
-def ConstantOp : Toy_Op<"constant", [NoSideEffect]> {
+def ConstantOp : Toy_Op<"constant", [Pure]> {
  // Provide a summary and description for this operation. This can be used to
  // auto-generate documentation of the operations within our dialect.
  let summary = "constant";
@@ -92,7 +92,7 @@ def ConstantOp : Toy_Op<"constant", [NoSideEffect]> {
//===----------------------------------------------------------------------===//

def AddOp : Toy_Op<"add",
-    [NoSideEffect, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
+    [Pure, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
  let summary = "element-wise addition operation";
  let description = [{
    The "add" operation performs element-wise addition between two tensors.
@@ -118,7 +118,7 @@ def AddOp : Toy_Op<"add",
def CastOp : Toy_Op<"cast", [
    DeclareOpInterfaceMethods<CastOpInterface>,
    DeclareOpInterfaceMethods<ShapeInferenceOpInterface>,
-    NoSideEffect,
+    Pure,
    SameOperandsAndResultShape
  ]> {
  let summary = "shape cast operation";
@@ -231,7 +231,7 @@ def GenericCallOp : Toy_Op<"generic_call",
//===----------------------------------------------------------------------===//

def MulOp : Toy_Op<"mul",
-    [NoSideEffect, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
+    [Pure, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
  let summary = "element-wise multiplication operation";
  let description = [{
    The "mul" operation performs element-wise multiplication between two
@@ -271,7 +271,7 @@ def PrintOp : Toy_Op<"print"> {
// ReshapeOp
//===----------------------------------------------------------------------===//

-def ReshapeOp : Toy_Op<"reshape", [NoSideEffect]> {
+def ReshapeOp : Toy_Op<"reshape", [Pure]> {
  let summary = "tensor reshape operation";
  let description = [{
    Reshape operation is transforming its input tensor into a new tensor with
@@ -299,7 +299,7 @@ def ReshapeOp : Toy_Op<"reshape", [NoSideEffect]> {
// ReturnOp
//===----------------------------------------------------------------------===//

-def ReturnOp : Toy_Op<"return", [NoSideEffect, HasParent<"FuncOp">,
+def ReturnOp : Toy_Op<"return", [Pure, HasParent<"FuncOp">,
                                 Terminator]> {
  let summary = "return operation";
  let description = [{
@@ -342,7 +342,7 @@ def ReturnOp : Toy_Op<"return", [NoSideEffect, HasParent<"FuncOp">,
//===----------------------------------------------------------------------===//

def TransposeOp : Toy_Op<"transpose",
-    [NoSideEffect, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
+    [Pure, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
  let summary = "transpose operation";

  let arguments = (ins F64Tensor:$input);

mlir/examples/toy/Ch5/include/toy/Ops.td

Lines changed: 8 additions & 8 deletions
@@ -45,9 +45,9 @@ class Toy_Op<string mnemonic, list<Trait> traits = []> :

// We define a toy operation by inheriting from our base 'Toy_Op' class above.
// Here we provide the mnemonic and a list of traits for the operation. The
-// constant operation is marked as 'NoSideEffect' as it is a pure operation
+// constant operation is marked as 'Pure' as it is a pure operation
// and may be removed if dead.
-def ConstantOp : Toy_Op<"constant", [NoSideEffect]> {
+def ConstantOp : Toy_Op<"constant", [Pure]> {
  // Provide a summary and description for this operation. This can be used to
  // auto-generate documentation of the operations within our dialect.
  let summary = "constant";
@@ -92,7 +92,7 @@ def ConstantOp : Toy_Op<"constant", [NoSideEffect]> {
//===----------------------------------------------------------------------===//

def AddOp : Toy_Op<"add",
-    [NoSideEffect, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
+    [Pure, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
  let summary = "element-wise addition operation";
  let description = [{
    The "add" operation performs element-wise addition between two tensors.
@@ -118,7 +118,7 @@ def AddOp : Toy_Op<"add",
def CastOp : Toy_Op<"cast", [
    DeclareOpInterfaceMethods<CastOpInterface>,
    DeclareOpInterfaceMethods<ShapeInferenceOpInterface>,
-    NoSideEffect,
+    Pure,
    SameOperandsAndResultShape
  ]> {
  let summary = "shape cast operation";
@@ -231,7 +231,7 @@ def GenericCallOp : Toy_Op<"generic_call",
//===----------------------------------------------------------------------===//

def MulOp : Toy_Op<"mul",
-    [NoSideEffect, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
+    [Pure, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
  let summary = "element-wise multiplication operation";
  let description = [{
    The "mul" operation performs element-wise multiplication between two
@@ -272,7 +272,7 @@ def PrintOp : Toy_Op<"print"> {
// ReshapeOp
//===----------------------------------------------------------------------===//

-def ReshapeOp : Toy_Op<"reshape", [NoSideEffect]> {
+def ReshapeOp : Toy_Op<"reshape", [Pure]> {
  let summary = "tensor reshape operation";
  let description = [{
    Reshape operation is transforming its input tensor into a new tensor with
@@ -300,7 +300,7 @@ def ReshapeOp : Toy_Op<"reshape", [NoSideEffect]> {
// ReturnOp
//===----------------------------------------------------------------------===//

-def ReturnOp : Toy_Op<"return", [NoSideEffect, HasParent<"FuncOp">,
+def ReturnOp : Toy_Op<"return", [Pure, HasParent<"FuncOp">,
                                 Terminator]> {
  let summary = "return operation";
  let description = [{
@@ -343,7 +343,7 @@ def ReturnOp : Toy_Op<"return", [NoSideEffect, HasParent<"FuncOp">,
//===----------------------------------------------------------------------===//

def TransposeOp : Toy_Op<"transpose",
-    [NoSideEffect, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
+    [Pure, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
  let summary = "transpose operation";

  let arguments = (ins F64Tensor:$input);

mlir/examples/toy/Ch6/include/toy/Ops.td

Lines changed: 8 additions & 8 deletions
@@ -45,9 +45,9 @@ class Toy_Op<string mnemonic, list<Trait> traits = []> :

// We define a toy operation by inheriting from our base 'Toy_Op' class above.
// Here we provide the mnemonic and a list of traits for the operation. The
-// constant operation is marked as 'NoSideEffect' as it is a pure operation
+// constant operation is marked as 'Pure' as it is a pure operation
// and may be removed if dead.
-def ConstantOp : Toy_Op<"constant", [NoSideEffect]> {
+def ConstantOp : Toy_Op<"constant", [Pure]> {
  // Provide a summary and description for this operation. This can be used to
  // auto-generate documentation of the operations within our dialect.
  let summary = "constant";
@@ -92,7 +92,7 @@ def ConstantOp : Toy_Op<"constant", [NoSideEffect]> {
//===----------------------------------------------------------------------===//

def AddOp : Toy_Op<"add",
-    [NoSideEffect, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
+    [Pure, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
  let summary = "element-wise addition operation";
  let description = [{
    The "add" operation performs element-wise addition between two tensors.
@@ -118,7 +118,7 @@ def AddOp : Toy_Op<"add",
def CastOp : Toy_Op<"cast", [
    DeclareOpInterfaceMethods<CastOpInterface>,
    DeclareOpInterfaceMethods<ShapeInferenceOpInterface>,
-    NoSideEffect,
+    Pure,
    SameOperandsAndResultShape
  ]> {
  let summary = "shape cast operation";
@@ -231,7 +231,7 @@ def GenericCallOp : Toy_Op<"generic_call",
//===----------------------------------------------------------------------===//

def MulOp : Toy_Op<"mul",
-    [NoSideEffect, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
+    [Pure, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
  let summary = "element-wise multiplication operation";
  let description = [{
    The "mul" operation performs element-wise multiplication between two
@@ -272,7 +272,7 @@ def PrintOp : Toy_Op<"print"> {
// ReshapeOp
//===----------------------------------------------------------------------===//

-def ReshapeOp : Toy_Op<"reshape", [NoSideEffect]> {
+def ReshapeOp : Toy_Op<"reshape", [Pure]> {
  let summary = "tensor reshape operation";
  let description = [{
    Reshape operation is transforming its input tensor into a new tensor with
@@ -300,7 +300,7 @@ def ReshapeOp : Toy_Op<"reshape", [NoSideEffect]> {
// ReturnOp
//===----------------------------------------------------------------------===//

-def ReturnOp : Toy_Op<"return", [NoSideEffect, HasParent<"FuncOp">,
+def ReturnOp : Toy_Op<"return", [Pure, HasParent<"FuncOp">,
                                 Terminator]> {
  let summary = "return operation";
  let description = [{
@@ -343,7 +343,7 @@ def ReturnOp : Toy_Op<"return", [NoSideEffect, HasParent<"FuncOp">,
//===----------------------------------------------------------------------===//

def TransposeOp : Toy_Op<"transpose",
-    [NoSideEffect, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
+    [Pure, DeclareOpInterfaceMethods<ShapeInferenceOpInterface>]> {
  let summary = "transpose operation";

  let arguments = (ins F64Tensor:$input);
