diff --git a/docs/snowpark/snowpark-backend-limitations.md b/docs/snowpark/snowpark-backend-limitations.md new file mode 100644 index 000000000..7bf08f04c --- /dev/null +++ b/docs/snowpark/snowpark-backend-limitations.md @@ -0,0 +1,129 @@
# Limitations of the Snowpark backend

The goal of the Snowpark backend is to generate expressions that can take advantage of the **Snowpark** infrastructure. Because of this, there are several language practices that are valid in `Morphir-IR/Elm` but are not supported by this backend. When possible, a warning is generated in the code or in the `GenerationReport.md` file.

Some of these limitations include:

## Recursive functions manipulating dataframe expressions

This backend tries to generate as many dataframe expressions as possible. That is why many functions are generated as Scala functions that produce dataframe expressions (or [Column](https://docs.snowflake.com/developer-guide/snowpark/reference/scala/com/snowflake/snowpark/Column.html) objects). For example:

```elm
double : Int -> Int
double x =
    if x == 0 then
        0
    else
        x * x
```

is converted to:

```scala
  def double(
    x: com.snowflake.snowpark.Column
  )(
    implicit sfSession: com.snowflake.snowpark.Session
  ): com.snowflake.snowpark.Column =
    com.snowflake.snowpark.functions.when(
      (x) === (com.snowflake.snowpark.functions.lit(0)),
      com.snowflake.snowpark.functions.lit(0)
    ).otherwise((x) * (x))
```

As shown above, this function returns a `Column` instance. This object represents the actual expression tree that is processed by the **Snowpark** library. This transformation makes it impossible to convert functions that make recursive calls. For example:

```elm
factorial : Int -> Int
factorial n =
    if n == 0 then
        1
    else
        n * (factorial (n - 1))
```

The generated Scala code looks like this:

```scala
  def factorial(
    n: com.snowflake.snowpark.Column
  )(
    implicit sfSession: com.snowflake.snowpark.Session
  ): com.snowflake.snowpark.Column =
    com.snowflake.snowpark.functions.when(
      (n) === (com.snowflake.snowpark.functions.lit(0)),
      com.snowflake.snowpark.functions.lit(1)
    ).otherwise((n) * (mymodel.Basic.factorial((n) - (com.snowflake.snowpark.functions.lit(1)))))
```

Since this code is composed only of nested function calls, there is nothing preventing the execution of the recursive call, so running it fails with a stack overflow:

```bash
java.lang.StackOverflowError
  com.snowflake.snowpark.Column.$eq$eq$eq(Column.scala:269)
  mymodel.Basic$.factorial(Basic.scala:751)
  mymodel.Basic$.factorial(Basic.scala:753)
  mymodel.Basic$.factorial(Basic.scala:753)
  mymodel.Basic$.factorial(Basic.scala:753)
  mymodel.Basic$.factorial(Basic.scala:753)
  mymodel.Basic$.factorial(Basic.scala:753)
  mymodel.Basic$.factorial(Basic.scala:753)
  mymodel.Basic$.factorial(Basic.scala:753)
  ...
```

## Code that does not manipulate lists of table-like records

To take advantage of this backend, the code being processed has to manipulate lists of table-like records (e.g. records whose fields have only basic types; a sketch is shown below). These structures are identified as [DataFrames](https://docs.snowflake.com/en/developer-guide/snowpark/scala/working-with-dataframes).

## Code that does not follow the backend conventions

The backend assumes some conventions to determine how to interpret the code that is being processed. These conventions are described in [snowpark-backend.md](snowpark-backend.md).
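As an illustration of the table-like records mentioned above, the following sketch contrasts a record that this backend can identify as a table row with one that it cannot. The type names here are illustrative (`Employee` mirrors the examples in [snowpark-backend.md](snowpark-backend.md)); they are not taken from a real model:

```elm
-- Table-like: every field has a basic type, so a
-- `List Employee` can be identified as a DataFrame.
type alias Employee =
    { firstName : String
    , lastName : String
    , salary : Float
    }


-- Not table-like: `reports` is itself a list, so a list of
-- `Manager` records is not identified as a DataFrame.
type alias Manager =
    { name : String
    , reports : List Employee
    }
```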
## Unsupported elements

There may be situations where this backend cannot convert an element from the **Morphir-IR**. Depending on the scenario, the backend generates default expressions or types to indicate that something was not converted.

For example, given that there is no support for the `List.range` function, we can try to convert the following snippet:

```elm
myFunc2: Int -> Int
myFunc2 x =
    let
        y = x + 1
        z = y + 1
        r = List.range 10 20
    in
    x + y + z
```

The resulting code is the following:

```scala
  def myFunc2(
    x: com.snowflake.snowpark.Column
  )(
    implicit sfSession: com.snowflake.snowpark.Session
  ): com.snowflake.snowpark.Column = {
    val y = (x) + (com.snowflake.snowpark.functions.lit(1))

    val z = (y) + (com.snowflake.snowpark.functions.lit(1))

    val r = "Call not generated"

    ((x) + (y)) + (z)
  }
```

Notice that the placeholder expression `"Call not generated"` was emitted. Also, the `GenerationReport.md` file is going to include an error message for this function:

```markdown
### MyModel:Basic:myFunc2

- Call to function not generated: Morphir.SDK:List:range
```
\ No newline at end of file
diff --git a/docs/snowpark/snowpark-backend.md b/docs/snowpark/snowpark-backend.md index 5a9e4f3bb..3ecac96b2 100644 --- a/docs/snowpark/snowpark-backend.md +++ b/docs/snowpark/snowpark-backend.md @@ -4,26 +4,24 @@ id: snowpark-backend

# Snowpark Backend

-**TODO**
-
-**Snowpark** backend uses Scala as its JVM language.
+The Morphir **Snowpark** backend generates Scala code that uses the [Snowpark](https://docs.snowflake.com/en/developer-guide/snowpark/scala/index) API.

## Generation conventions and strategies

-The **Snowpark** backend supports two basic code generation strategies:
+The **Snowpark** backend supports two main code generation strategies:

-- Generating code that manipulates DataFrame expressions
+- Generating code that manipulates [DataFrame](https://docs.snowflake.com/en/developer-guide/snowpark/scala/working-with-dataframes) expressions
- Generating "plain" Scala code

-The backend uses a series of conventions for deciding which strategy is used to convert the code of a function. The conventions apply to types and function definitions.
+The backend uses a series of conventions for deciding which strategy is used to convert a function. These conventions apply to the way types and functions are defined.

### Type definition conventions

-Type definitions in the input **Morphir IR** are classified according to the following conventions:
+Type definitions in the input **Morphir IR** are classified using the following conventions:

#### Records that represent tables

-Records are classified as "representing a table definition" according to the types of its members. A DataFrame compatible type is one of the following:
+Records are classified as "representing a table definition" according to the types of their members. A [DataFrame](https://docs.snowflake.com/en/developer-guide/snowpark/scala/working-with-dataframes) compatible type is one of the following:

- A basic datatype
  - Int
@@ -173,7 +171,7 @@ In this case references to `List Employee` are converted to DataFrames:

### Custom types

-The **Snowpark** backend uses two conventions to deal with [custom types](https://guide.elm-lang.org/types/custom_types.html) used as a field of a *DataFrame record*. These conventions depend of the presence of parameters for type constructors.
+Two conventions are used to process [custom types](https://guide.elm-lang.org/types/custom_types.html) used as a field of a *DataFrame record*. These conventions depend on the presence of parameters for type constructors.

#### 1. Convention for custom types without data

@@ -200,7 +198,7 @@ northDirections dirs =
    |> List.filter (\e -> e.direction == North)
```

-In this case the backend assumes that the code stored in a `Directions` table has a column of type `VARCHAR` or `CHAR` with text with the name of field. For example:
+In this case it is assumed that the data stored in the `Directions` table has a column of type `VARCHAR` or `CHAR` containing the name of the constructor as text. For example:

| ID | DIRECTION |
|-----|------------|
| 23 | 'East' |
| 43 | 'South' |

-As a convenience the backend generates a Scala object with the definition of the possible values:
+As a convenience, a Scala object is generated with the definition of the possible values:

```Scala
object CardinalDirection{
def North: com.snowflake.snowpark.Column =
@@ -229,7 +227,7 @@ def West: com.snowflake.snowpark.Column =
```
Notice the use of [lit](https://docs.snowflake.com/developer-guide/snowpark/reference/scala/com/snowflake/snowpark/functions$.html#lit(literal:Any):com.snowflake.snowpark.Column) to indicate that we expect a literal string value for each constructor.

-This class is used where the value of the possible constructors is used. For example for the definition of `northDirections` above the comparison with `North` is generated as follows:
+This object is used wherever the value of one of the constructors is needed. For example, for the definition of `northDirections` above, the comparison with `North` is generated as follows:

```Scala
 def northDirections(
@@ -245,12 +243,12 @@ This class is used where the value of the possible constructors is used. For exa

#### 2. Convention for custom types with data

-In the case that a custom type has constructors with parameters this backend assumes that values of this type are stored in a [OBJECT column](https://docs.snowflake.com/en/sql-reference/data-types-semistructured#object) .
+When a custom type has constructors with parameters, this backend assumes that values of this type are stored in an [OBJECT column](https://docs.snowflake.com/en/sql-reference/data-types-semistructured#object).

The encoding of the column is defined as follows:

- Values are encoded as a `JSON` object
-- A special property of this object called `__tag` is used to determine which variant is used in the current value
+- A special property of this object called `"__tag"` is used to determine which variant is used in the current value
- All the parameters in order are stored in properties called `field0`, `field1`, `field2` ... `fieldN`

Given the following custom type definition:
@@ -268,7 +266,7 @@ type alias TasksEstimations =
    { taskId : Int
    , estimation : TaskEstimation
    }

-The data for `TaskEstimations estimation` is expected to be stored in a table using an `OBJECT` column:
+The data for `TaskEstimations` is expected to be stored in a table using an `OBJECT` column:

| TASKID | ESTIMATION |
|--------|-------------------------------------------------------------------|
@@ -320,12 +318,21 @@ This code is generated as:
        ((tasksColumns.estimation("field0")) * (com.snowflake.snowpark.functions.lit(60))) + (tasksColumns.estimation("field1"))
      ).as("seconds"))
  }
-
 ```
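To make the encoding concrete, here is a sketch of a value and its stored representation. The constructor `TimeInMinutesAndSeconds Int Int` is hypothetical; its exact definition is not shown here, but it is consistent with the generated code above, which multiplies `field0` by 60 and adds `field1`:

```elm
-- Hypothetical definition, shown only to illustrate the encoding;
-- the real model may differ:
type TaskEstimation
    = TimeInMinutesAndSeconds Int Int


estimationExample : TaskEstimation
estimationExample =
    TimeInMinutesAndSeconds 2 30

-- Following the convention above, this value is stored in the
-- OBJECT column as:
-- { "__tag": "TimeInMinutesAndSeconds", "field0": 2, "field1": 30 }
```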
#### 3. Convention for `Maybe` types

The [Maybe a](https://package.elm-lang.org/packages/elm/core/latest/Maybe) type is assumed to be a nullable database value. This means that the data is expected to be stored as follows:

| Elm value | Value stored in the Database |
|--------------|-------------------------------|
| `Just 10` | `10` |
| `Nothing` | `NULL` |


### Function definition conventions

-These conventions are based on the input and return types of a function. There are strategies: using DataFrame expressions or using Scala expressions. The following sections have more details.
+These conventions are based on the input and return types of a function. Two strategies are used: manipulating DataFrame expressions or generating "plain" Scala code. The following sections have more details.

#### Code generation using DataFrame expressions manipulation

@@ -519,7 +526,33 @@ In this case code for `avgSalaries` is going to perform a Scala division operati
  }

-Code generation for this strategy is meant to be used for code that manipulates the result of performing DataFrame operations . At this moment its coverage is very limited.
+Code generation for this strategy is meant to be used for code that manipulates the result of performing DataFrame operations. At this moment its coverage is very limited.

### Creation of empty DataFrames

Creating an empty list of table-like records is interpreted as creating an empty DataFrame. For example:

```elm
createDataForTest : List Employee -> DataFromCompany
createDataForTest emps =
    { employees = emps , departments = [] }
```
In this case the code is generated as follows:
```scala
  def createDataForTest(
    emps: com.snowflake.snowpark.DataFrame
  )(
    implicit sfSession: com.snowflake.snowpark.Session
  ): mymodel.Basic.DataFromCompany = {
    val empsColumns: mymodel.Basic.Employee = new mymodel.Basic.EmployeeWrapper(emps)

    mymodel.Basic.DataFromCompany(
      employees = emps,
      departments = mymodel.Basic.Department.createEmptyDataFrame(sfSession)
    )
  }
```
Notice that this is the main reason for having an `implicit` parameter with the [Session object](https://docs.snowflake.com/en/developer-guide/snowpark/reference/scala/com/snowflake/snowpark/Session.html) in the generated functions.
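In practice this means callers only need to declare the Snowpark session once as an `implicit` value. A minimal sketch, modeled on the integration test's `Program.scala` further down in this change (the table name and generated module are illustrative):

```scala
import com.snowflake.snowpark._

object Example extends App {
  // Declared implicit once; every generated function that needs a
  // Session (e.g. to create empty DataFrames) picks it up from here.
  implicit val session: Session =
    Session.builder.configFile("session.properties").create

  val emps = session.table("EMPLOYEES")

  // The explicit argument list supplies the DataFrame; the implicit
  // argument list is filled in by the compiler with `session`.
  val data = mymodel.Basic.createDataForTest(emps)

  session.close()
}
```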
diff --git a/src/Morphir/Snowpark/PatternMatchMapping.elm b/src/Morphir/Snowpark/PatternMatchMapping.elm index 7cf009f0e..a855258d8 100644 --- a/src/Morphir/Snowpark/PatternMatchMapping.elm +++ b/src/Morphir/Snowpark/PatternMatchMapping.elm @@ -158,7 +158,7 @@ mapMaybeValue maybeValue ctx mapValue = in (mappedDefaultMaybe, defaultMaybeIssues) -createCaseCodeForTuplePattern : (List TuplePatternResult, Value () (Type ())) -> +createCaseCodeForTuplePattern : (List NestedPatternResult, Value () (Type ())) -> List Scala.Value -> Constants.MapValueType -> ValueMappingContext -> @@ -168,12 +168,12 @@ createCaseCodeForTuplePattern (tuplePats, value) positionValues mapValue ctx = tuplesForConversion = List.map2 (\x y -> (x, y)) tuplePats positionValues mappingsContextWithReplacements = tuplesForConversion - |> List.foldl (\((TuplePatternResult funcs), val) currCtx -> funcs.contextManipulation val currCtx) ctx + |> List.foldl (\((NestedPatternResult funcs), val) currCtx -> funcs.contextManipulation val currCtx) ctx condition = tuplesForConversion - |> List.foldr (\((TuplePatternResult funcs), val) currentCondition -> + |> List.foldr (\((NestedPatternResult funcs), val) currentCondition -> Scala.BinOp (funcs.conditionGenerator val) "&&" currentCondition ) - (Scala.Literal (Scala.BooleanLit True)) + (applySnowparkFunc "lit" [Scala.Literal (Scala.BooleanLit True)]) |> simplifyBooleanExpression (mappedSuccessValue, successValueIssues) = mapValue value mappingsContextWithReplacements in @@ -199,30 +199,30 @@ generateBindingVariableExpr name expr = Scala.Apply expr [Scala.ArgValue Nothing (Scala.Literal (Scala.StringLit name))] -addBindingReplacementsToContext : ValueMappingContext -> List TuplePatternResult -> Scala.Value -> ValueMappingContext +addBindingReplacementsToContext : ValueMappingContext -> List NestedPatternResult -> Scala.Value -> ValueMappingContext addBindingReplacementsToContext ctxt nestedPatternsInfo referenceExpr = let newContext = nestedPatternsInfo |> List.indexedMap (\i nestedPatternInfo -> (nestedPatternInfo, generateBindingVariableExpr (getCustomTypeParameterFieldAccess i) referenceExpr)) - |> List.foldl (\(TuplePatternResult nestedPatternInfo, expr) newCtxt -> nestedPatternInfo.contextManipulation expr newCtxt) ctxt + |> List.foldl (\(NestedPatternResult nestedPatternInfo, expr) newCtxt -> nestedPatternInfo.contextManipulation expr newCtxt) ctxt in newContext -generateNestedPatternsCondition : List TuplePatternResult -> Scala.Value -> Scala.Value -> Scala.Value +generateNestedPatternsCondition : List NestedPatternResult -> Scala.Value -> Scala.Value -> Scala.Value generateNestedPatternsCondition nestedPatternsInfo referenceExpr tagCondition = nestedPatternsInfo |> List.indexedMap (\i nestedPatternInfo -> (nestedPatternInfo, generateBindingVariableExpr (getCustomTypeParameterFieldAccess i) referenceExpr)) - |> List.foldl (\(TuplePatternResult nestedPatternInfo, expr) currentExpr -> + |> List.foldl (\(NestedPatternResult nestedPatternInfo, expr) currentExpr -> (Scala.BinOp currentExpr "&&" (nestedPatternInfo.conditionGenerator expr))) tagCondition |> simplifyBooleanExpression type PatternMatchScenario ta = LiteralsWithDefault (List (Literal, TypedValue)) (TypedValue) | UnionTypesWithoutParams (List (Scala.Value, TypedValue)) (Maybe (TypedValue)) - | UnionTypesWithParams (List (Name.Name, List TuplePatternResult, TypedValue)) (Maybe (TypedValue)) + | UnionTypesWithParams (List (Name.Name, List NestedPatternResult, TypedValue)) (Maybe (TypedValue)) -- Right now we just support `Just` 
with variables | MaybeCase (Maybe Name.Name, TypedValue) (TypedValue) - | TupleCases (List (List TuplePatternResult, TypedValue)) (Maybe (TypedValue)) + | TupleCases (List (List NestedPatternResult, TypedValue)) (Maybe (TypedValue)) | Unsupported checkForLiteralCase : ( Pattern (Type ()), TypedValue ) -> Maybe (Literal, TypedValue) @@ -249,11 +249,11 @@ checkForUnionOfWithNoParams (pattern, caseValue) = _ -> Nothing -checkConstructorForUnionOfWithParams : ( Pattern (Type ()), TypedValue ) -> Maybe (Name.Name, List TuplePatternResult, TypedValue) -checkConstructorForUnionOfWithParams (pattern, caseValue) = +checkConstructorForUnionOfWithParams : ValueMappingContext -> ( Pattern (Type ()), TypedValue ) -> Maybe (Name.Name, List NestedPatternResult, TypedValue) +checkConstructorForUnionOfWithParams ctx (pattern, caseValue) = case pattern of (ConstructorPattern _ name patternArgs) -> - collectMaybeList checkTuplePatternItemPattern patternArgs + collectMaybeList (checkNestedItemPattern ctx) patternArgs |> Maybe.map (\patInfo -> (FQName.getLocalName name, patInfo, caseValue)) _ -> Nothing @@ -280,11 +280,11 @@ checkUnionWithParams expr cases ctx = if isUnionTypeRefWithParams (Value.valueAttribute expr) ctx.typesContextInfo then case List.reverse cases of ((WildcardPattern _, wildCardResult)::restReversed) -> - (collectMaybeList checkConstructorForUnionOfWithParams restReversed) + (collectMaybeList (checkConstructorForUnionOfWithParams ctx) restReversed) |> Maybe.map List.reverse |> Maybe.map (\parts -> (UnionTypesWithParams parts (Just wildCardResult))) ((ConstructorPattern _ _ _, _)::_) as constructorCases -> - (collectMaybeList checkConstructorForUnionOfWithParams constructorCases) + (collectMaybeList (checkConstructorForUnionOfWithParams ctx) constructorCases) |> Maybe.map List.reverse |> Maybe.map (\parts -> (UnionTypesWithParams parts Nothing)) _ -> @@ -312,49 +312,58 @@ checkMaybePattern expr cases ctx = _ -> Nothing _ -> Nothing -checkForTuplePatternCase : ( Pattern (Type ()), TypedValue ) -> Maybe (List TuplePatternResult, TypedValue) -checkForTuplePatternCase (pattern, caseValue) = +checkForTuplePatternCase : ValueMappingContext -> ( Pattern (Type ()), TypedValue ) -> Maybe (List NestedPatternResult, TypedValue) +checkForTuplePatternCase ctx (pattern, caseValue) = case pattern of (TuplePattern _ options) -> (options - |> collectMaybeList checkTuplePatternItemPattern) + |> collectMaybeList (checkNestedItemPattern ctx)) |> Maybe.map (\lst -> (lst, caseValue)) _ -> Nothing -type TuplePatternResult = - TuplePatternResult { conditionGenerator: Scala.Value -> Scala.Value +type NestedPatternResult = + NestedPatternResult { conditionGenerator: Scala.Value -> Scala.Value , contextManipulation: Scala.Value -> ValueMappingContext -> ValueMappingContext } -checkTuplePatternItemPattern : Pattern (Type ()) -> Maybe TuplePatternResult -checkTuplePatternItemPattern pattern = +checkNestedItemPattern : ValueMappingContext -> Pattern (Type ()) -> Maybe NestedPatternResult +checkNestedItemPattern ctx pattern = case pattern of WildcardPattern _ -> - Just <| TuplePatternResult { conditionGenerator = (\_ -> Scala.Literal (Scala.BooleanLit True)) - , contextManipulation = (\_ ctx -> ctx) } + Just <| NestedPatternResult { conditionGenerator = (\_ -> applySnowparkFunc "lit" [ Scala.Literal (Scala.BooleanLit True) ]) + , contextManipulation = (\_ innerCtx -> innerCtx) } AsPattern _ (WildcardPattern _) name -> - Just <| TuplePatternResult { conditionGenerator = (\_ -> Scala.Literal (Scala.BooleanLit True)) + Just 
<| NestedPatternResult { conditionGenerator = (\_ -> applySnowparkFunc "lit" [Scala.Literal (Scala.BooleanLit True)]) , contextManipulation = addReplacementForIdentifier name } Value.ConstructorPattern _ ( [ [ "morphir" ], [ "s", "d", "k" ] ], [ [ "maybe" ] ], [ "just" ] ) [ innerPattern ] -> - (checkTuplePatternItemPattern innerPattern) - |> Maybe.map (\(TuplePatternResult patObject) -> - (TuplePatternResult { conditionGenerator = \refr -> Scala.BinOp (Scala.Select refr "is_not_null") "&&" (patObject.conditionGenerator refr) + (checkNestedItemPattern ctx innerPattern) + |> Maybe.map (\(NestedPatternResult patObject) -> + (NestedPatternResult { conditionGenerator = \refr -> Scala.BinOp (Scala.Select refr "is_not_null") "&&" (patObject.conditionGenerator refr) , contextManipulation = patObject.contextManipulation })) + Value.ConstructorPattern ((Type.Reference _ typeName _) as tpe) fullConstructorName [ ] -> + if isUnionTypeRefWithoutParams tpe ctx.typesContextInfo then + let + unionCaseReference = scalaReferenceToUnionTypeCase typeName fullConstructorName + in + Just <| NestedPatternResult { conditionGenerator = (\e -> Scala.BinOp e "===" unionCaseReference) + , contextManipulation = (\_ innerCtx -> innerCtx) } + else + Nothing Value.LiteralPattern _ literal -> - Just <| TuplePatternResult { conditionGenerator = (\e -> Scala.BinOp e "===" (mapLiteral () literal)) - , contextManipulation = (\_ ctx -> ctx) } + Just <| NestedPatternResult { conditionGenerator = (\e -> Scala.BinOp e "===" (mapLiteral () literal)) + , contextManipulation = (\_ innerCtx -> innerCtx) } _ -> Nothing -checkTuplePattern : List ( Pattern (Type ()), TypedValue ) -> (Maybe (PatternMatchScenario ta)) -checkTuplePattern cases = +checkTuplePattern : ValueMappingContext -> List ( Pattern (Type ()), TypedValue ) -> (Maybe (PatternMatchScenario ta)) +checkTuplePattern ctx cases = case List.reverse cases of ((WildcardPattern _, wildCardResult)::restReversed) -> - (collectMaybeList checkForTuplePatternCase restReversed) + (collectMaybeList (checkForTuplePatternCase ctx) restReversed) |> Maybe.map List.reverse |> Maybe.map (\casesToProcess -> (TupleCases casesToProcess (Just wildCardResult))) ((TuplePattern _ _, _)::_) as constructorCases -> - (collectMaybeList checkForTuplePatternCase constructorCases) + (collectMaybeList (checkForTuplePatternCase ctx) constructorCases) |> Maybe.map List.reverse |> Maybe.map (\casesToProcess -> (TupleCases casesToProcess Nothing)) _ -> @@ -371,4 +380,4 @@ classifyScenario value cases ctx = , (\_ -> checkUnionWithNoParamsWithDefault value cases ctx) , (\_ -> checkUnionWithParams value cases ctx) , (\_ -> checkMaybePattern value cases ctx) - , (\_ -> checkTuplePattern cases) ]) + , (\_ -> checkTuplePattern ctx cases) ]) diff --git a/src/Morphir/Snowpark/ReferenceUtils.elm b/src/Morphir/Snowpark/ReferenceUtils.elm index 8bca0d803..9d0fd4d3b 100644 --- a/src/Morphir/Snowpark/ReferenceUtils.elm +++ b/src/Morphir/Snowpark/ReferenceUtils.elm @@ -168,13 +168,13 @@ simplifyBooleanExpression exp = simplifyBooleanExpression right in case (simplLeft, simplRight) of - (Scala.Literal (Scala.BooleanLit True), result) -> + (Scala.Apply (Scala.Ref _ "lit") [ Scala.ArgValue Nothing (Scala.Literal (Scala.BooleanLit True)) ], result) -> result - (result, Scala.Literal (Scala.BooleanLit True)) -> + (result, Scala.Apply (Scala.Ref _ "lit") [ Scala.ArgValue Nothing (Scala.Literal (Scala.BooleanLit True)) ]) -> result - (Scala.Literal (Scala.BooleanLit False), _) -> + (Scala.Apply (Scala.Ref _ "lit") [ Scala.ArgValue 
Nothing (Scala.Literal (Scala.BooleanLit False)) ], _) -> simplLeft - (_, Scala.Literal (Scala.BooleanLit False)) -> + (_, Scala.Apply (Scala.Ref _ "lit") [ Scala.ArgValue Nothing (Scala.Literal (Scala.BooleanLit False)) ]) -> simplRight _ -> exp @@ -186,13 +186,13 @@ simplifyBooleanExpression exp = simplifyBooleanExpression right in case (simplLeft, simplRight) of - (Scala.Literal (Scala.BooleanLit True), _) -> + (Scala.Apply (Scala.Ref _ "lit") [ Scala.ArgValue Nothing (Scala.Literal (Scala.BooleanLit True)) ], _) -> simplLeft - (_, Scala.Literal (Scala.BooleanLit True)) -> + (_, Scala.Apply (Scala.Ref _ "lit") [ Scala.ArgValue Nothing (Scala.Literal (Scala.BooleanLit True)) ]) -> simplRight - (Scala.Literal (Scala.BooleanLit False), result) -> + (Scala.Apply (Scala.Ref _ "lit") [ Scala.ArgValue Nothing (Scala.Literal (Scala.BooleanLit False)) ], result) -> result - (result, Scala.Literal (Scala.BooleanLit False)) -> + (result, Scala.Apply (Scala.Ref _ "lit") [ Scala.ArgValue Nothing (Scala.Literal (Scala.BooleanLit False)) ]) -> result _ -> exp diff --git a/src/Morphir/Snowpark/TypeRefMapping.elm b/src/Morphir/Snowpark/TypeRefMapping.elm index b469cea16..8e0c8797f 100644 --- a/src/Morphir/Snowpark/TypeRefMapping.elm +++ b/src/Morphir/Snowpark/TypeRefMapping.elm @@ -73,6 +73,7 @@ checkForColumnCase typeReference ctx = if isBasicType typeReference || isAliasedBasicType typeReference ctx || isDataFrameFriendlyType typeReference ctx || + isTypeVariable typeReference || isMaybeWithGenericType typeReference then Just <| typeRefForSnowparkType "Column" else @@ -187,6 +188,14 @@ isMaybeWithGenericType tpe = _ -> False +isTypeVariable : Type () -> Bool +isTypeVariable tpe = + case tpe of + Type.Variable _ _ -> + True + _ -> + False + checkForListOfSimpleTypes : Type () -> MappingContextInfo () -> Maybe Scala.Type checkForListOfSimpleTypes typeReference ctx = if isListOfSimpleType typeReference ctx then diff --git a/src/Morphir/Snowpark/UserDefinedFunctionMapping.elm b/src/Morphir/Snowpark/UserDefinedFunctionMapping.elm index 75c03034f..274ee6b9f 100644 --- a/src/Morphir/Snowpark/UserDefinedFunctionMapping.elm +++ b/src/Morphir/Snowpark/UserDefinedFunctionMapping.elm @@ -55,8 +55,7 @@ tryToConvertUserFunctionCall (func, args) mapValue ctx = |> List.unzip in (Constants.applySnowparkFunc "array_construct" argsToUse, List.concat issues) - else - if isLocalFunctionName constructorName ctx && List.length args > 0 then + else if isLocalFunctionName constructorName ctx && List.length args > 0 then let (mappedArgs, issuesPerArg) = args @@ -71,7 +70,7 @@ tryToConvertUserFunctionCall (func, args) mapValue ctx = tag = [ Constants.applySnowparkFunc "lit" [ Scala.Literal (Scala.StringLit "__tag") ], Constants.applySnowparkFunc "lit" [ Scala.Literal (Scala.StringLit tagName)] ] in (Constants.applySnowparkFunc "object_construct" (tag ++ argsToUse), List.concat issuesPerArg) - else + else errorValueAndIssue ("Constructor call not converted: `" ++ (FQName.toString constructorName) ++ "`") ValueIR.Variable _ funcName -> if List.member funcName ctx.parameters || diff --git a/tests-integration/snowpark/model/elm.json b/tests-integration/snowpark/model/elm.json new file mode 100644 index 000000000..ce2a08dc7 --- /dev/null +++ b/tests-integration/snowpark/model/elm.json @@ -0,0 +1,24 @@ +{ + "type": "application", + "source-directories": [ + "src" + ], + "elm-version": "0.19.1", + "dependencies": { + "direct": { + "elm/browser": "1.0.2", + "elm/core": "1.0.5", + "elm/html": "1.0.0" + }, + "indirect": { + "elm/json": 
"1.1.3", + "elm/time": "1.0.0", + "elm/url": "1.0.0", + "elm/virtual-dom": "1.0.3" + } + }, + "test-dependencies": { + "direct": {}, + "indirect": {} + } +} diff --git a/tests-integration/snowpark/model/morphir.json b/tests-integration/snowpark/model/morphir.json new file mode 100644 index 000000000..a4648f366 --- /dev/null +++ b/tests-integration/snowpark/model/morphir.json @@ -0,0 +1,12 @@ +{ + "name": "CompanyAssets", + "sourceDirectory": "src", + "decorations": { + "snowparkgendecorations": { + "displayName" : "Snowpark generation customization", + "entryPoint": "SnowparkGenCustomization:Decorations:GenerationCustomization", + "ir": "outsp/src/main/scala/decorations/morphir-ir.json", + "storageLocation": "spdecorations.json" + } + } +} diff --git a/tests-integration/snowpark/model/src/CompanyAssets/DataDefinition/Assets.elm b/tests-integration/snowpark/model/src/CompanyAssets/DataDefinition/Assets.elm new file mode 100644 index 000000000..1a905bd7e --- /dev/null +++ b/tests-integration/snowpark/model/src/CompanyAssets/DataDefinition/Assets.elm @@ -0,0 +1,12 @@ +module CompanyAssets.DataDefinition.Assets exposing (..) +import CompanyAssets.DataDefinition.Types exposing (Price, AssetCategory, AssetSubCategory(..)) + +type alias Asset = + { assetId : Int + , name : String + , category : AssetCategory + , subcategory : Maybe AssetSubCategory + , price : Price + , purchaseYear : Int + , vendorId : Int + } diff --git a/tests-integration/snowpark/model/src/CompanyAssets/DataDefinition/Types.elm b/tests-integration/snowpark/model/src/CompanyAssets/DataDefinition/Types.elm new file mode 100644 index 000000000..4c8cc0aeb --- /dev/null +++ b/tests-integration/snowpark/model/src/CompanyAssets/DataDefinition/Types.elm @@ -0,0 +1,24 @@ +module CompanyAssets.DataDefinition.Types exposing (Price, AssetCategory(..), AssetSubCategory(..), Year) + +type alias Price = Float + +type alias Year = Int + + +type AssetCategory = + Vehicle + | Furniture + | Building + | OfficeEquipment + + +type AssetSubCategory = + Car + | Boat + | Office + | Warehouse + | Truck + | Computer + | Printer + | Phone + | Rental \ No newline at end of file diff --git a/tests-integration/snowpark/model/src/CompanyAssets/DataDefinition/Vendors.elm b/tests-integration/snowpark/model/src/CompanyAssets/DataDefinition/Vendors.elm new file mode 100644 index 000000000..de8f94c1f --- /dev/null +++ b/tests-integration/snowpark/model/src/CompanyAssets/DataDefinition/Vendors.elm @@ -0,0 +1,7 @@ +module CompanyAssets.DataDefinition.Vendors exposing (..) 
+ + +type alias Vendor = + { vendorId : Int + , description : String + } \ No newline at end of file diff --git a/tests-integration/snowpark/model/src/CompanyAssets/Rules/DepreciationRules.elm b/tests-integration/snowpark/model/src/CompanyAssets/Rules/DepreciationRules.elm new file mode 100644 index 000000000..8a9029de3 --- /dev/null +++ b/tests-integration/snowpark/model/src/CompanyAssets/Rules/DepreciationRules.elm @@ -0,0 +1,72 @@ +module CompanyAssets.Rules.DepreciationRules exposing (usefulLifeExceeded) + +import CompanyAssets.DataDefinition.Assets exposing (Asset) +import CompanyAssets.DataDefinition.Types exposing (AssetCategory(..), AssetSubCategory(..)) +import CompanyAssets.DataDefinition.Types exposing (Price, Year) + + +usefulLifeExceeded : Year -> List Asset -> List { category : String, price : Price } +usefulLifeExceeded currentYear assets = + assets + |> List.map + (\asset -> + { category = + if check_rental_property currentYear asset then + "rental property" + else if check_vessels currentYear asset then + "vessel" + else if check_office_equipment currentYear asset then + "office" + else if check_for_computer_equipment currentYear asset then + "computer" + else if check_for_cars currentYear asset then + "cars" + else + "" + , price = asset.price + } + ) + |> List.filter (\p -> p.category /= "") + +check_for_cars : Year -> Asset -> Bool +check_for_cars currentYear asset = + case (asset.category, asset.subcategory) of + (Vehicle, Just Car) -> (currentYear - asset.purchaseYear) >= 5 + (Vehicle, Just Truck) -> (currentYear - asset.purchaseYear) >= 5 + (_,_) -> False + +check_for_computer_equipment : Year -> Asset -> Bool +check_for_computer_equipment currentYear asset = + case (asset.category, asset.subcategory) of + (OfficeEquipment, Just Computer) -> (currentYear - asset.purchaseYear) >= 6 + (OfficeEquipment, Just Printer) -> (currentYear - asset.purchaseYear) >= 7 + _ -> False + + +check_office_equipment : Year -> Asset -> Bool +check_office_equipment currentYear asset = + case (asset.category, asset.subcategory) of + (OfficeEquipment, Just Phone) -> (currentYear - asset.purchaseYear) >= 6 + (Furniture, _) -> (currentYear - asset.purchaseYear) >= 7 + _ -> False + + +compare_maybe_value : Maybe a -> a -> Bool +compare_maybe_value maybeValue toCompare = + maybeValue + |> Maybe.map (\t -> t == toCompare) + |> Maybe.withDefault False + +check_vessels : Year -> Asset -> Bool +check_vessels currentYear asset = + compare_maybe_value asset.subcategory Boat + && currentYear - asset.purchaseYear >= 10 + + +check_rental_property : Year -> Asset -> Bool +check_rental_property currentYear asset = + asset.category == Building + && compare_maybe_value asset.subcategory Rental + && currentYear - asset.purchaseYear >= 27 + + \ No newline at end of file diff --git a/tests-integration/snowpark/scala/build.sc b/tests-integration/snowpark/scala/build.sc new file mode 100644 index 000000000..5768171c8 --- /dev/null +++ b/tests-integration/snowpark/scala/build.sc @@ -0,0 +1,21 @@ +// build.sc +import mill._ +import scalalib._ + + +object snowparkExample extends ScalaModule{ + def scalaVersion = "2.12.9" + + val paths = Seq( + millSourcePath / os.up / "src" / "main" / "scala" + ) + def sources = T.sources { + paths.map(p => PathRef(p)) + } + + def ivyDeps = Agg( + ivy"com.snowflake:snowpark:1.8.0", + ivy"org.ini4j:ini4j:0.5.4", + ivy"org.scala-lang.modules::scala-collection-compat:2.3.1" + ) +} diff --git a/tests-integration/snowpark/scala/mill b/tests-integration/snowpark/scala/mill new file mode 
100755 index 000000000..93ef60438 --- /dev/null +++ b/tests-integration/snowpark/scala/mill @@ -0,0 +1,49 @@ +#!/usr/bin/env sh + +# This is a wrapper script, that automatically download mill from GitHub release pages +# You can give the required mill version with MILL_VERSION env variable +# If no version is given, it falls back to the value of DEFAULT_MILL_VERSION +DEFAULT_MILL_VERSION=0.10.4 + +set -e + +if [ -z "$MILL_VERSION" ] ; then + if [ -f ".mill-version" ] ; then + MILL_VERSION="$(head -n 1 .mill-version 2> /dev/null)" + elif [ -f "mill" ] && [ "$0" != "mill" ] ; then + MILL_VERSION=$(grep -F "DEFAULT_MILL_VERSION=" "mill" | head -n 1 | cut -d= -f2) + else + MILL_VERSION=$DEFAULT_MILL_VERSION + fi +fi + +if [ "x${XDG_CACHE_HOME}" != "x" ] ; then + MILL_DOWNLOAD_PATH="${XDG_CACHE_HOME}/mill/download" +else + MILL_DOWNLOAD_PATH="${HOME}/.cache/mill/download" +fi +MILL_EXEC_PATH="${MILL_DOWNLOAD_PATH}/${MILL_VERSION}" + +version_remainder="$MILL_VERSION" +MILL_MAJOR_VERSION="${version_remainder%%.*}"; version_remainder="${version_remainder#*.}" +MILL_MINOR_VERSION="${version_remainder%%.*}"; version_remainder="${version_remainder#*.}" + +if [ ! -s "$MILL_EXEC_PATH" ] ; then + mkdir -p "$MILL_DOWNLOAD_PATH" + if [ "$MILL_MAJOR_VERSION" -gt 0 ] || [ "$MILL_MINOR_VERSION" -ge 5 ] ; then + ASSEMBLY="-assembly" + fi + DOWNLOAD_FILE=$MILL_EXEC_PATH-tmp-download + MILL_VERSION_TAG=$(echo $MILL_VERSION | sed -E 's/([^-]+)(-M[0-9]+)?(-.*)?/\1\2/') + MILL_DOWNLOAD_URL="https://github.com/lihaoyi/mill/releases/download/${MILL_VERSION_TAG}/$MILL_VERSION${ASSEMBLY}" + curl --fail -L -o "$DOWNLOAD_FILE" "$MILL_DOWNLOAD_URL" + chmod +x "$DOWNLOAD_FILE" + mv "$DOWNLOAD_FILE" "$MILL_EXEC_PATH" + unset DOWNLOAD_FILE + unset MILL_DOWNLOAD_URL +fi + +unset MILL_DOWNLOAD_PATH +unset MILL_VERSION + +exec $MILL_EXEC_PATH "$@" \ No newline at end of file diff --git a/tests-integration/snowpark/scala/mill.bat b/tests-integration/snowpark/scala/mill.bat new file mode 100644 index 000000000..16fe05668 --- /dev/null +++ b/tests-integration/snowpark/scala/mill.bat @@ -0,0 +1,95 @@ +@echo off + +rem This is a wrapper script, that automatically download mill from GitHub release pages +rem You can give the required mill version with --mill-version parameter +rem If no version is given, it falls back to the value of DEFAULT_MILL_VERSION +rem +rem Project page: https://github.com/lefou/millw +rem +rem If you want to improve this script, please also contribute your changes back! +rem +rem Licensed under the Apache License, Version 2.0 + +rem setlocal seems to be unavailable on Windows 95/98/ME +rem but I don't think we need to support them in 2019 +setlocal enabledelayedexpansion + +set "DEFAULT_MILL_VERSION=0.7.3" + +rem %~1% removes surrounding quotes +if [%~1%]==[--mill-version] ( + rem shift command doesn't work within parentheses + if not [%~2%]==[] ( + set MILL_VERSION=%~2% + set "STRIP_VERSION_PARAMS=true" + ) else ( + echo You specified --mill-version without a version. 
+ echo Please provide a version that matches one provided on + echo https://github.com/lihaoyi/mill/releases + exit /b 1 + ) +) + +if [!MILL_VERSION!]==[] ( + if exist .mill-version ( + set /p MILL_VERSION=<.mill-version + ) +) + +if [!MILL_VERSION!]==[] ( + set MILL_VERSION=%DEFAULT_MILL_VERSION% +) + +set MILL_DOWNLOAD_PATH=%USERPROFILE%\.mill\download + +rem without bat file extension, cmd doesn't seem to be able to run it +set MILL=%MILL_DOWNLOAD_PATH%\!MILL_VERSION!.bat + +if not exist "%MILL%" ( + set VERSION_PREFIX=%MILL_VERSION:~0,4% + set DOWNLOAD_SUFFIX=-assembly + if [!VERSION_PREFIX!]==[0.0.] set DOWNLOAD_SUFFIX= + if [!VERSION_PREFIX!]==[0.1.] set DOWNLOAD_SUFFIX= + if [!VERSION_PREFIX!]==[0.2.] set DOWNLOAD_SUFFIX= + if [!VERSION_PREFIX!]==[0.3.] set DOWNLOAD_SUFFIX= + if [!VERSION_PREFIX!]==[0.4.] set DOWNLOAD_SUFFIX= + set VERSION_PREFIX= + + rem there seems to be no way to generate a unique temporary file path (on native Windows) + set DOWNLOAD_FILE=%MILL%.tmp + + echo Downloading mill %MILL_VERSION% from https://github.com/lihaoyi/mill/releases ... + + rem curl is bundled with recent Windows 10 + rem but I don't think we can expect all the users to have it in 2019 + rem bitadmin seems to be available on Windows 7 + rem without /dynamic, github returns 403 + rem bitadmin is sometimes needlessly slow but it looks better with /priority foreground + if not exist "%MILL_DOWNLOAD_PATH%" mkdir "%MILL_DOWNLOAD_PATH%" + bitsadmin /transfer millDownloadJob /dynamic /priority foreground "https://github.com/lihaoyi/mill/releases/download/%MILL_VERSION%/%MILL_VERSION%!DOWNLOAD_SUFFIX!" "!DOWNLOAD_FILE!" + if not exist "!DOWNLOAD_FILE!" ( + echo Could not download mill %MILL_VERSION% + exit 1 + ) + + move /y "!DOWNLOAD_FILE!" "%MILL%" + + set DOWNLOAD_FILE= + set DOWNLOAD_SUFFIX= +) + +set MILL_DOWNLOAD_PATH= +set MILL_VERSION= + +set MILL_PARAMS=%* + +if defined STRIP_VERSION_PARAMS ( + for /f "tokens=1-2*" %%a in ("%*") do ( + rem strip %%a - It's the "--mill-version" option. + rem strip %%b - it's the version number that comes after the option. + rem keep %%c - It's the remaining options. 
+ set MILL_PARAMS=%%c + ) +) + +"%MILL%" -i %MILL_PARAMS% \ No newline at end of file diff --git a/tests-integration/snowpark/scala/src/main/scala/entrypoint/Program.scala b/tests-integration/snowpark/scala/src/main/scala/entrypoint/Program.scala new file mode 100644 index 000000000..b36037eeb --- /dev/null +++ b/tests-integration/snowpark/scala/src/main/scala/entrypoint/Program.scala @@ -0,0 +1,17 @@ +package entryPoint; + +import com.snowflake.snowpark._ +import com.snowflake.snowpark.functions._ + +object Program extends App { + println("Snowpark test program"); + implicit val session = Session.builder.configFile("session.properties").create + + val assets = session.table("EXAMPLE_ASSETS"); + + companyassets.rules.DepreciationRules.usefulLifeExceeded(lit(2023))(assets).show + + session.close() +} + + diff --git a/tests/Morphir/Snowpark/CommonTestUtils.elm b/tests/Morphir/Snowpark/CommonTestUtils.elm index 84139a8bf..85e7b766e 100644 --- a/tests/Morphir/Snowpark/CommonTestUtils.elm +++ b/tests/Morphir/Snowpark/CommonTestUtils.elm @@ -66,7 +66,6 @@ mIntLiteralOf : Int -> Value.TypedValue mIntLiteralOf value = Value.Literal intTypeInstance (Literal.WholeNumberLiteral value) - mListTypeOf : Type.Type () -> Type.Type () mListTypeOf tpe = Type.Reference () (listFunctionName [ "list" ]) [ tpe ] @@ -103,12 +102,19 @@ listFilterMapFunction collectionFrom collectionTo = (mFuncTypeOf (mListTypeOf collectionFrom) (mListTypeOf collectionTo))) (listFunctionName [ "filter", "map" ]) -equalFunction : Type.Type () -> Type.Type () -> Value.TypedValue -equalFunction left right = +equalFunction : Type.Type () -> Value.TypedValue +equalFunction tpe = Value.Reference - (mFuncTypeOf left right) + (mFuncTypeOf tpe (mFuncTypeOf tpe boolTypeInstance)) (basicsFunctionName [ "equal" ]) +addFunction : Type.Type () -> Value.TypedValue +addFunction tpe = + Value.Reference + (mFuncTypeOf tpe (mFuncTypeOf tpe tpe)) + (basicsFunctionName [ "add" ]) + + listMapFunction : Type.Type () -> Type.Type () -> Value.TypedValue listMapFunction collectionFrom collectionTo = Value.Reference @@ -214,6 +220,7 @@ sDot expr memberName = sLit : String -> Scala.Value sLit stringLit = Scala.Literal (Scala.StringLit stringLit) + sIntLit : Int -> Scala.Value sIntLit intLiteral = Scala.Literal (Scala.IntegerLit intLiteral) @@ -228,4 +235,17 @@ sFalse = sSpEqual : Scala.Value -> Scala.Value -> Scala.Value sSpEqual left right = - Scala.BinOp left "===" right \ No newline at end of file + Scala.BinOp left "===" right + +sBlock : Scala.Value -> List (Scala.Name, Scala.Value) -> Scala.Value +sBlock body bindings = + let + valDecls = + bindings + |> List.map (\(name, value) -> + Scala.ValueDecl { modifiers = [] + , pattern = Scala.NamedMatch name + , valueType = Nothing + , value = value }) + in + Scala.Block valDecls body \ No newline at end of file diff --git a/tests/Morphir/Snowpark/FunctionMappingsTests.elm b/tests/Morphir/Snowpark/FunctionMappingsTests.elm index 8c0f778da..b6297776f 100644 --- a/tests/Morphir/Snowpark/FunctionMappingsTests.elm +++ b/tests/Morphir/Snowpark/FunctionMappingsTests.elm @@ -296,7 +296,7 @@ functionMappingsTests = \_ -> let lambdaBody = - curryCall ( equalFunction stringTypeInstance stringTypeInstance + curryCall ( equalFunction stringTypeInstance , [ mStringLiteralOf "Smith" , Value.Field stringTypeInstance (Value.Variable empType ["k"]) [ "lastname" ] ]) filterCall = diff --git a/tests/Morphir/Snowpark/MapValueLiteralTests.elm b/tests/Morphir/Snowpark/MapValueLiteralTests.elm index c9df52660..7b370ced1 100644 --- 
a/tests/Morphir/Snowpark/MapValueLiteralTests.elm +++ b/tests/Morphir/Snowpark/MapValueLiteralTests.elm @@ -1,4 +1,6 @@ -module Morphir.Snowpark.MapValueLiteralTests exposing (mapValueLiteralTests) +module Morphir.Snowpark.MapValueLiteralTests exposing ( mapValueLiteralTests + , mapIfValueExpressionsTests + , mapLetValueExpressionsTests ) import Expect import Test exposing (Test, describe, test) import Morphir.IR.Literal as Literal @@ -6,8 +8,13 @@ import Morphir.Snowpark.MapExpressionsToDataFrameOperations exposing (mapValue) import Morphir.Scala.AST as Scala import Morphir.IR.Value as Value import Morphir.IR.Type as Type -import Morphir.Snowpark.MappingContext as MappingContext import Morphir.Snowpark.MappingContext exposing (emptyValueMappingContext) +import Morphir.Snowpark.Constants exposing (applySnowparkFunc) +import Morphir.Snowpark.CommonTestUtils exposing ( sIntLit, mIntLiteralOf, sCall, sExpCall + , mIdOf, sVar, intTypeInstance, sBlock + , mLetOf, mFuncTypeOf, addFunction) +import Morphir.Snowpark.ReferenceUtils exposing (curryCall) +import Morphir.Snowpark.Constants as Constants functionNamespace : List String functionNamespace = ["com", "snowflake", "snowpark", "functions"] @@ -105,4 +112,105 @@ mapValueLiteralTests = assertCharacterLiteral, assertFloatLiteral, assertIntegerLiteral - ] \ No newline at end of file + ] + + +mapIfValueExpressionsTests: Test +mapIfValueExpressionsTests = + let + emptyContext = emptyValueMappingContext + assertIfExprGeneration = + test ("Generation for if expressions") <| + \_ -> + let + ifExample = + Value.IfThenElse intTypeInstance (mIdOf [ "flag" ] booleanReference) (mIntLiteralOf 10) (mIntLiteralOf 20) + (mapped, _ ) = + mapValue ifExample emptyContext + expectedIf = + sCall ( applySnowparkFunc "when" [ sVar "flag", applySnowparkFunc "lit" [ sIntLit 10 ] ] + , "otherwise" ) + [ applySnowparkFunc "lit" [ sIntLit 20 ] ] + + in + Expect.equal expectedIf mapped + in + describe "IF Value mappings" + [ assertIfExprGeneration + ] + +mapLetValueExpressionsTests: Test +mapLetValueExpressionsTests = + let + emptyContext = emptyValueMappingContext + assertLetExprGenerationWithOneBinding = + test ("Generation for let expressions with one binding") <| + \_ -> + let + + letExample = + mLetOf ["tmp"] (mIntLiteralOf 10) (curryCall (addFunction intTypeInstance, [mIdOf [ "tmp" ] intTypeInstance, mIntLiteralOf 10]) ) + (mapped, _ ) = + mapValue letExample emptyContext + expectedVal = + sBlock (Scala.BinOp (sVar "tmp") "+" (applySnowparkFunc "lit" [ sIntLit 10 ])) + [ ( "tmp", applySnowparkFunc "lit" [ sIntLit 10 ] ) ] + in + Expect.equal expectedVal mapped + assertLetExprGenerationWithSeveralBindings = + test ("Generation for let expressions with several bindings") <| + \_ -> + let + letExample = + mLetOf ["tmp1"] (mIntLiteralOf 10) <| + mLetOf ["tmp2"] (mIntLiteralOf 20) <| + mLetOf ["tmp3"] (mIntLiteralOf 30) (curryCall (addFunction intTypeInstance, [mIdOf [ "tmp1" ] intTypeInstance, mIntLiteralOf 10]) ) + (mapped, _ ) = + mapValue letExample emptyContext + expectedVal = + sBlock (Scala.BinOp (sVar "tmp1") "+" (applySnowparkFunc "lit" [ sIntLit 10 ])) + [ ( "tmp1", applySnowparkFunc "lit" [ sIntLit 10 ] ) + , ( "tmp2", applySnowparkFunc "lit" [ sIntLit 20 ] ) + , ( "tmp3", applySnowparkFunc "lit" [ sIntLit 30 ] ) ] + in + Expect.equal expectedVal mapped + + assertLetExprGenerationWithFunctionDecl = + test ("Generation for let expressions with function decls") <| + \_ -> + let + userDefinedFuncType = + mFuncTypeOf intTypeInstance (mFuncTypeOf intTypeInstance 
intTypeInstance) + letLambda = + Value.Lambda userDefinedFuncType + (Value.AsPattern intTypeInstance (Value.WildcardPattern intTypeInstance) ["x"]) + (curryCall (addFunction intTypeInstance, [mIdOf ["x"] intTypeInstance, mIdOf ["x"] intTypeInstance])) + + letExample = + Value.LetDefinition + userDefinedFuncType + ["double"] + { inputTypes = [ ] + , outputType = userDefinedFuncType + , body = letLambda } + (curryCall (mIdOf ["double"] userDefinedFuncType, [mIdOf [ "tmp1" ] intTypeInstance]) ) + (mapped, _ ) = + mapValue letExample emptyContext + lambdaArgDecl = + { modifiers = [], tpe = Constants.typeRefForSnowparkType "Column", name = "x", defaultValue = Nothing } + expectedVal = + Scala.Block [ + Scala.FunctionDecl { modifiers = [] + , name = "double" + , typeArgs = [] + , args = [ [ lambdaArgDecl ] ] + , returnType = Nothing + , body = Just <| Scala.BinOp (sVar "x") "+" (sVar "x") } ] + (sExpCall (sVar "double") [ sVar "tmp1"]) + in + Expect.equal expectedVal mapped + in + describe "Let Value mappings" + [ assertLetExprGenerationWithOneBinding + , assertLetExprGenerationWithSeveralBindings + , assertLetExprGenerationWithFunctionDecl ] \ No newline at end of file