
[#152269178] Migrate ComputeVisibleServices to runtime v2 #2

Merged
merged 7 commits into master on Sep 20, 2019

Conversation

francescopersico
Contributor

No description provided.

updatedAt: new Date().getTime()
};
const dfClient = df.getClient(context);
await dfClient.startNew("UpsertServiceOrchestrator", undefined, event);
Contributor

I've found that relying on durable functions to serialize/deserialize parameters is too error-prone; it's safer to explicitly encode/decode parameters with io-ts. See for example pagopa/io-functions-services#17.
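The explicit encode/decode idea can be sketched in plain TypeScript. The names here (`UpsertEvent`, `encodeUpsertEvent`, `decodeUpsertEvent`) are hand-rolled stand-ins for an io-ts codec's `encode`/`decode`, not the actual PR code:

```typescript
// Sketch of explicit encode/decode at the orchestration boundary, instead of
// letting the durable-functions runtime serialize parameters implicitly.
interface UpsertEvent {
  serviceId: string;
  updatedAt: string; // UTC ISO string, JSON-safe
}

// Encode: produce a plain JSON-safe object before handing it to the runtime.
function encodeUpsertEvent(e: UpsertEvent): Record<string, unknown> {
  return { serviceId: e.serviceId, updatedAt: e.updatedAt };
}

// Decode: validate the untyped input received by the orchestrator/activity.
function decodeUpsertEvent(input: unknown): UpsertEvent | Error {
  if (typeof input !== "object" || input === null) {
    return new Error("not an object");
  }
  const o = input as Record<string, unknown>;
  if (typeof o.serviceId !== "string" || typeof o.updatedAt !== "string") {
    return new Error("missing or invalid fields");
  }
  return { serviceId: o.serviceId, updatedAt: o.updatedAt };
}
```

With a real io-ts codec, `encode` and `decode` come from the same type definition, so the two sides cannot drift apart.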

visibleService: VisibleService
});

const Input = t.union([AddVisibleServiceInput, RemoveVisibleServiceInput]);
Contributor

you want a taggedUnion here to be able to encode the input from the orchestrator

Contributor Author

I was using union because taggedUnion is now deprecated (union can now detect and optimize tagged unions). The problem is that this feature is only available from v1.9.0 (we are using v1.8.6, just before v1.9.0), so I am going to revert the code to use taggedUnion as you suggested.
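The point about tagged unions can be sketched without io-ts: a literal discriminator field is what makes an untyped payload routable to the right branch after a JSON round trip. All names below are illustrative, not the PR's actual types:

```typescript
// Illustrative stand-in for io-ts taggedUnion: a discriminated union keyed
// on a literal `kind` field.
type AddVisibleService = { kind: "ADD"; serviceId: string };
type RemoveVisibleService = { kind: "REMOVE"; serviceId: string };
type Input = AddVisibleService | RemoveVisibleService;

function decodeInput(input: unknown): Input | Error {
  if (typeof input !== "object" || input === null) {
    return new Error("not an object");
  }
  const o = input as { kind?: unknown; serviceId?: unknown };
  if (typeof o.serviceId !== "string") {
    return new Error("missing serviceId");
  }
  // The tag makes dispatch unambiguous even after serialization.
  if (o.kind === "ADD" || o.kind === "REMOVE") {
    return { kind: o.kind, serviceId: o.serviceId };
  }
  return new Error("unknown kind");
}
```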

currentVisibleService !== undefined &&
currentVisibleService.version >= visibleService.version
) {
// A newer version is already stored in the blob, so skip the remove/update
Contributor

when could this scenario happen? it looks like a conflict (another process updated this data concurrently), perhaps we need to error in this case?
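For reference, a sketch of the version guard the snippet above implements (the `StoredService` shape is illustrative): an update is applied only when it carries a strictly newer version, so a late-arriving older event is treated as stale and skipped rather than as an error.

```typescript
// Illustrative shape; the PR's actual type is VisibleService.
interface StoredService {
  serviceId: string;
  version: number;
}

function shouldApplyUpdate(
  current: StoredService | undefined,
  incoming: StoredService
): boolean {
  // If a newer (or equal) version is already stored, skip this update:
  // it is a stale event from an earlier state, not necessarily a conflict.
  return current === undefined || current.version < incoming.version;
}
```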

}
);

// Map None to empty object
Contributor

Suggested change
// Map None to empty object
// Default to an empty object when the blob does not exist yet

): Promise<Either<Error, true>> {
// Retrieve the current visibleServices blob using the leaseId
const errorOrMaybeVisibleServices = await getBlobAsObject(
t.dictionary(ServiceId, VisibleService),
Contributor

for clarity, give a name and declare this type outside this function

if (isLeft(errorOrVisibleServices)) {
return left(
Error(
`UpdateVisibleServicesActivity|Cannot decode blob|ERROR=${errorOrVisibleServices.value}`
Contributor

if these are validation errors, you can use readableReport to format them in a way that is suitable for logging

VISIBLE_SERVICE_CONTAINER,
VISIBLE_SERVICE_BLOB_ID,
{
leaseDuration: LEASE_DURATION
Contributor

what happens if this function takes more than LEASE_DURATION to release the lease?

are we protected from the scenarios described in "Protecting a resource with a lock" section from this article?

Contributor Author

Yes, we are protected: every write must pass the leaseId, and if the lease has expired the write is rejected.

Contributor Author

@cloudify it is exactly like the "Making the lock safe with fencing" paragraph but the token is the leaseId.
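A minimal sketch of that fencing idea, with the leaseId as the token. All names here are illustrative; in the real system Azure Blob Storage enforces the lease check server-side:

```typescript
// A blob that only accepts writes presenting the currently valid lease.
class LeasedBlob {
  private leaseId: string | undefined;
  private leaseExpiresAt = 0;
  public content = "";

  // Acquire a lease for `durationMs`; fails if one is already held.
  acquireLease(now: number, durationMs: number): string {
    if (this.leaseId !== undefined && now < this.leaseExpiresAt) {
      throw new Error("lease already held");
    }
    this.leaseId = `lease-${now}`;
    this.leaseExpiresAt = now + durationMs;
    return this.leaseId;
  }

  // The write is rejected if the lease is wrong or has expired, so a slow
  // writer holding a stale lease cannot corrupt the blob.
  write(now: number, leaseId: string, content: string): boolean {
    if (leaseId !== this.leaseId || now >= this.leaseExpiresAt) {
      return false;
    }
    this.content = content;
    return true;
  }
}
```

This mirrors the "fencing token" pattern: the check happens on the storage side at write time, not on the client side at lock-acquisition time.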

export const UpsertServiceEvent = t.intersection([
t.interface({
newService: RetrievedService,
updatedAt: t.number
Contributor

you can use UTCISODateFromString to serialize/deserialize Dates
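The role UTCISODateFromString plays can be sketched in plain TypeScript: encode a Date as a UTC ISO string so it survives the JSON round trip, and validate on the way back in. These helper names are illustrative, not the library's API:

```typescript
// Encode a Date as a UTC ISO string, e.g. "2019-09-20T14:32:00.000Z".
function encodeDate(d: Date): string {
  return d.toISOString();
}

// Decode and validate an untyped value back into a Date.
function decodeDate(s: unknown): Date | Error {
  if (typeof s !== "string") {
    return new Error("not a string");
  }
  const ms = Date.parse(s);
  return Number.isNaN(ms) ? new Error("invalid date") : new Date(ms);
}
```

Unlike a raw epoch integer, the ISO form is self-describing and unambiguous about its timezone in logs and stored blobs.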

@digitalcitizenship

digitalcitizenship commented Sep 20, 2019

Warnings
⚠️ Please include a description of your PR changes.

Affected stories

  • 🌟 #152269178: Migrate to Azure Functions runtime v2

New dependencies added: azure-storage and durable-functions.

azure-storage

Author: Microsoft Corporation

Description: Microsoft Azure Storage Client Library for Node.js

Homepage: http://github.com/Azure/azure-storage-node

Created: over 5 years ago
Last updated: 5 months ago
License: Apache-2.0
Maintainers: 1
Releases: 46
Direct dependencies: browserify-mime, extend, json-edm-parser, md5.js, readable-stream, request, underscore, uuid, validator, xml2js and xmlbuilder
Keywords: node, azure and storage
This README is too long to show.

durable-functions

Author: kashimiz

Description: Durable Functions library for Node.js Azure Functions

Homepage: https://github.com/Azure/azure-functions-durable-js#readme

Created: over 1 year ago
Last updated: 21 days ago
License: MIT
Maintainers: 1
Releases: 12
Direct dependencies: @azure/functions, @types/lodash, @types/uuid, @types/validator, axios, commander, debug, lodash, rimraf, uuid and validator
Keywords: azure-functions
README

Durable Functions for Node.js

The durable-functions npm package allows you to write Durable Functions for Node.js. Durable Functions is an extension of Azure Functions that lets you write stateful functions and workflows in a serverless environment. The extension manages state, checkpoints, and restarts for you. Durable Functions' advantages include:

  • Define workflows in code. No JSON schemas or designers are needed.
  • Call other functions synchronously and asynchronously. Output from called functions can be saved to local variables.
  • Automatically checkpoint progress whenever the function schedules async work. Local state is never lost if the process recycles or the VM reboots.

You can find more information at the following links:

A durable function, or orchestration, is a solution made up of different types of Azure Functions:

  • Activity: the functions and tasks being orchestrated by your workflow.
  • Orchestrator: a function that describes the way and order actions are executed in code.
  • Client: the entry point for creating an instance of a durable orchestration.

Durable Functions' function types and features are documented in-depth here.

Getting Started

You can follow the Visual Studio Code quickstart to get started with a function chaining example, or follow the general checklist below:

  1. Install prerequisites:

  2. Create an Azure Functions app. Visual Studio Code's Azure Functions plugin is recommended.

  3. Install the Durable Functions extension

Run this command from the root folder of your Azure Functions app:

func extensions install -p Microsoft.Azure.WebJobs.Extensions.DurableTask -v 1.7.0

durable-functions requires Microsoft.Azure.WebJobs.Extensions.DurableTask 1.7.0 or greater.

  4. Install the durable-functions npm package at the root of your function app:
npm install durable-functions
  5. Write an activity function (see sample):
module.exports = async function(context) {
    // your code here
};
  6. Write an orchestrator function (see sample):
const df = require('durable-functions');
module.exports = df.orchestrator(function*(context){
    // your code here
});

Note: Orchestrator functions must follow certain code constraints.

  7. Write your client function (see sample):
module.exports = async function (context, req) {
    const client = df.getClient(context);
    const instanceId = await client.startNew(req.params.functionName, undefined, req.body);

    context.log(`Started orchestration with ID = '${instanceId}'.`);

    return client.createCheckStatusResponse(context.bindingData.req, instanceId);
};

Note: Client functions are started by a trigger binding available in the Azure Functions 2.x major version. Read more about trigger bindings and 2.x-supported bindings.

Samples

The Durable Functions samples demonstrate several common use cases. They are located in the samples directory. Descriptive documentation is also available:

const df = require("durable-functions");

module.exports = df.orchestrator(function*(context){
    context.log("Starting chain sample");
    const output = [];
    output.push(yield context.df.callActivity("E1_SayHello", "Tokyo"));
    output.push(yield context.df.callActivity("E1_SayHello", "Seattle"));
    output.push(yield context.df.callActivity("E1_SayHello", "London"));

    return output;
});

How it works

Durable Functions

One of the key attributes of Durable Functions is reliable execution. Orchestrator functions and activity functions may be running on different VMs within a data center, and those VMs and the underlying networking infrastructure are not 100% reliable.

In spite of this, Durable Functions ensures reliable execution of orchestrations. It does so by using storage queues to drive function invocation and by periodically checkpointing execution history into storage tables (using a cloud design pattern known as Event Sourcing). That history can then be replayed to automatically rebuild the in-memory state of an orchestrator function.

Read more about Durable Functions' reliable execution.

Durable Functions JS

The durable-functions shim lets you express a workflow in code as a generator function wrapped by a call to the orchestrator method. orchestrator treats yield-ed calls to your function context's df object, like context.df.callActivity, as points where you want to schedule an asynchronous unit of work and wait for it to complete.

These calls return a Task or TaskSet object signifying the outstanding work. The orchestrator method appends the action(s) of the Task or TaskSet to a list that it passes back to the Functions runtime, along with whether the function has completed and any output or errors.

The Azure Functions extension schedules the desired actions. When the actions complete, the extension triggers the orchestrator function to replay up to the next incomplete asynchronous unit of work or its end, whichever comes first.

Generated by 🚫 dangerJS

@codecov

codecov bot commented Sep 20, 2019

Codecov Report

Merging #2 into master will increase coverage by 0.65%.
The diff coverage is 92.3%.

@@            Coverage Diff            @@
##           master      #2      +/-   ##
=========================================
+ Coverage   85.84%   86.5%   +0.65%     
=========================================
  Files           6       7       +1     
  Lines         106     126      +20     
  Branches        7       8       +1     
=========================================
+ Hits           91     109      +18     
- Misses         15      17       +2

@francescopersico
Contributor Author

@cloudify code updated.
I have tested worst-case scenarios such as:

  • Lease expires before the write
  • Many concurrent service update requests

and the blob is always correctly updated to the latest version.

@cloudify
Contributor

@francescopersico excellent, thank you!


const createdService = errorOrCreatedService.value;

const errorOrUpsertServiceEvent = UpsertServiceEvent.decode({
Contributor

we should actually encode the typed object to an untyped one, this should be something like:

const upsertServiceOrchestratorInput = UpsertServiceEvent.encode({
  newService: createdService,
  updatedAt: new Date()
});

const dfClient = df.getClient(context);
    await dfClient.startNew(
      "UpsertServiceOrchestrator",
      undefined,
      upsertServiceOrchestratorInput
    );

Contributor Author

Done also for the Activity input.


const upsertServiceEvent = UpsertServiceEvent.encode({
newService: createdService,
updatedAt: new Date().getTime()
Contributor

we usually encode dates as UTC ISO strings, you should use UTCISODateFromString in the definition of UpsertServiceEvent (instead of t.integer)

Contributor Author

The io-functions-app repository uses t.integer for events: https://github.com/teamdigitale/io-functions-app/blob/master/utils/UpdatedProfileEvent.ts#L14.
I am going to change this PR to use UTCISODateFromString anyway.

Contributor

thanks for noticing, can you also fix the functions app? thanks

}
);

const event = UpsertServiceEvent.encode({
Contributor

for consistency

Suggested change
const event = UpsertServiceEvent.encode({
const upsertServiceEvent = UpsertServiceEvent.encode({

retrievedServiceToApiService(maybeUpdatedService.value)
const updatedService = maybeUpdatedService.value;

const errorOrUpsertServiceEvent = UpsertServiceEvent.encode({
Contributor

Suggested change
const errorOrUpsertServiceEvent = UpsertServiceEvent.encode({
const upsertServiceEvent = UpsertServiceEvent.encode({

@cloudify cloudify merged commit 1f4f117 into master Sep 20, 2019
@cloudify cloudify deleted the 152269178-migrate-computevisibleservices-to-runtime2 branch September 20, 2019 14:32