Podcast : Thunder and Lightning with Peter Coffee

This week I have for you a conversation with Peter Coffee who is the Vice President for Strategic Research at Salesforce.

Many of you will recognise Peter as the host of pre-keynote conversations at many Salesforce events. I think I am right in saying that it is his voice which announces every keynote session at Dreamforce, which he has to pre-record in the run up to the conference.

Peter Coffee and the IoT Hierarchy Of Needs

Peter Coffee and the IoT Hierarchy Of Needs at Dreamforce 2015

If you’re interested in “the future”, then Peter is your man. He leads the emerging trends track at Dreamforce, which from what I have seen gives a 2-3 year view into the future of Salesforce and the technology industry.

Before he was hired by Marc Benioff in 2007, Peter was a technology journalist. He’s also a graduate of the Massachusetts Institute of Technology, and back in the day he even wrote a book entitled How to Program Java.

If you want to improve your understanding of how technology and business work together, and where they are leading us, Peter is your man.

I sat down with Peter in the press room at Dreamforce on the very last day with the hope that he could help straighten out my thoughts about all the announcements and technologies which were on show that week.  Clearly my head was still spinning as you’ll hear we made reference to the I’m a Mac adverts which ran from 2006 to about 2009, so a good 6 years ago. Not the 2 or 3 that came out of my mouth. Things move fast today it seems.

You can find Peter on Twitter @petercoffee

Please feel free to leave feedback on the blog at TechnologyFlows.com or tweet me directly, I am @matmorris

Subscribe Podcast Feed

Podcast : Learn Lightning With Don Robins

This week I’m speaking with Don Robins, an award-winning instructor for Salesforce University who has been a pioneer in providing education and advice for Salesforce admins and developers wanting to learn how to exploit Salesforce’s new Lightning technologies.

Don and I got together in a San Francisco bar, directly across the road from the Dreamforce conference which had just closed.  This conversation came almost exactly a year after our first conversation about the release of Salesforce Lightning components.  We started by talking about how quickly the time had passed, and how people (including me) can catch up and start understanding how to exploit the power of Lightning.

Over the past 18 months, Don has produced some excellent materials which introduce developers and admins to building with Lightning.  Top of the list is Don’s Salesforce University online course which you can access for FREE.  Don also produced a series of 6 blog posts which introduce the concepts of Lightning and are a great primer for the training course.

If your job is as a Salesforce developer or admin, then you should continue to the online training, which is a high-quality Salesforce University production with video and slides.

All of these great resources can be seen by going to the link http://bit.ly/lightningDon.

Take a look at the blog at http://www.technologyflows.com, or tweet me @matmorris.

Thanks for listening!


Podcast : Salesforce Lightning With Skip Sauls & Doug Chasman

This time we hear from two people right at the heart of Salesforce’s Lightning technology evolution.

Skip Sauls is a Director of Product Management at Salesforce and he is working alongside Doug Chasman who is Principal Architect on the Platform team.

The work of their team is making it possible for developers and admins to customise Salesforce’s user interface like never before.

Doug was one of the two original architects of Visualforce, and I started by asking him if Lightning components will replace Visualforce.

SkipDougInterview

This interview was recorded about two weeks prior to the announcement of the new Salesforce Lightning Experience, so we were able to discuss that on tape. Skip and Doug were extremely open about the technologies which they have waiting in the wings, and Lightning Out sounds like it has the potential to be a transformational technology.

Dreamforce 2015 is taking place next week from the 15th September; if you’re going, be sure to seek out Skip and Doug. If you’re staying at home, watch the online broadcasts and keep an eye on the Salesforce YouTube channel for the sessions to be published in the weeks following the conference.

Take a look at the blog at http://www.technologyflows.com, or tweet me @matmorris.

Thanks for listening!


Podcast : London Salesforce Admins Summer Of Trailhead


This is the first episode of the new Technology Flows podcast, recorded at the London Salesforce Admins “Summer Of Trailhead” event at the end of July 2015.

Thanks to everyone who took part.  More details about the group can be found at http://www.meetup.com/London-Salesforce-Admins/ and on the Salesforce Success Community site.

trailhead-emblemTry Salesforce Trailhead for yourself at https://developer.salesforce.com/trailhead

Thanks for listening!


Lightning Strikes The U.K.

Attend a user group near you!

Monday the 9th March 2015 signals the official start of Salesforce Lightning Developer Week.  Have you registered to attend an event at one of the UK groups?  It may not be too late to sign up!

What can I expect?

These events are going to allow you to get hands-on with many of the new Salesforce Lightning product family members: Lightning Components and Lightning App Builder, for the creation of the next-generation Salesforce user interface; Lightning Process Builder, for the next generation of workflow; and Lightning Connect, for (what I think is) a paradigm shift in line-of-business data integrations.

If HM Revenue & Customs have allowed the boxes through the border, you can also expect some nice swag items.  Maybe stickers, definitely t-shirts, and possibly some sponsor giveaway prizes.  Make sure that you arrive on time to maximise your chances!

The biggest benefits though will be the people you meet and the conversations that you have.  Whether that is with one of the presenters or with people “just like you”, we are all going to need to work together to digest and apply all this new stuff in our projects.

Who should attend?

If you have anything to do with the configuration and customisation of Salesforce, Lightning Developer Week is for you.  You may go by the label of being a Salesforce “developer”, you may refer to yourself as a Salesforce “administrator”, or maybe your job title contains words like “solution architect” or “business analyst”.  The content (I have seen it) is going to be relevant to you, and you’re going to find out about, and try out first-hand, the tools and techniques which we’re all going to come to rely on to get our jobs done in the next couple of years.

It’s not too late to sign up!

There is still time! Whilst the London event is sold out, there are a total of four events running in the U.K. this week.  If you really can’t make it, then make sure that you join the meet up groups and attend an event soon.  You’ll be glad that you did!

Bristol Salesforce Developer Group – Bristol

LDVBristol

Wednesday, March 11, 2015

My home town, so I am doubly disappointed to not be there with a team of talented Salesforce experts that I know well.  You are assured of a warm welcome in Bristol, so what are you waiting for?

Birmingham Salesforce Developer Group – Birmingham

Thursday, March 12, 2015

I’ve not visited yet, but have heard good things about this group.  Check it out if you’re based around the Midlands.

LDVBirmingham

North UK Salesforce Developer Group – Leeds

Wednesday, March 11, 2015

Salesforce MVP & Maven, Paul Battisson will be supported by Salesforce MVPs Phil Walton & Josh Hoskins.  This is a great group with lots of talented people happy to share their knowledge.

LDVLeeds

London Salesforce Developers & London Salesforce Admin

Wednesday, March 11, 2015 Devs & Admins!

Currently sold out.  This is a joint meeting held by the developer and admin user groups.  The total number of RSVPs was capped at 150, and there was still a waiting list!  Just goes to show how strong Salesforce technologies are in the capital city.

LDVLondonDev

LDVLondonAdm

Along with some famous names, I will be here too!  I am presenting the section about Lightning Connect.  Wish me luck & I hope that you have a great Lightning Week!  It’s going to go lightning fast!

Flow–Process Log : Part 2

In last week’s “part 1” post, I introduced why I needed to log faults and information from the Salesforce Visual Workflows which I have been writing.  I recommend reading that post before digging into the detail here.

With a custom object in which to store any fault or status messages I was able to use a simple flow component to create a new record from inside any flows which I wrote.  I enhanced this pattern by having several flavours of flow to log different information.

LogFault is my flow which I wire up to fault connectors in every flow that I write.   When I just want to send application messages, I use my LogError and LogInfo flows.

LogFault, LogError & LogInfo Flows

All three logging flows share the same structure; here is the LogFault flow.

LogFaultFlow1

There are three variables.

FaultMessage

Input only, passes the message from the calling flow.

FlowName

Input only, passes a combination of the calling flow’s Id and name (as explained in part 1).

ProcessLogId

Output only, returns the Id of the created process log record to the calling flow.

The FormatDefinitionURL formula takes the flow’s Id and appends it to the Salesforce URL to make a link which can be clicked to quickly access the flow definition when I am investigating any problems.

Here is how the values are mapped to the record create component as part of the call.

LogFaultFlow2

The value for Type__c, the type of message, is the only thing which changes between the different versions.  The LogError flow sets the type to be “Error” and LogInfo sets it to “Info”.
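As an illustration only, here is the shape of the record each logging flow creates, sketched in Python rather than Salesforce code. The Message and Summary field API names are my assumption; Type__c and the "first 255 characters" summary rule come from the posts themselves.

```python
# Sketch (not Salesforce code) of the Process Log record each logging
# flow creates.  Message__c and Summary__c are assumed field API names.

def build_log_record(flow_origin: str, message: str, log_type: str) -> dict:
    return {
        "Origin__c": flow_origin,      # "<15-char Id> <unique name>" from the caller
        "Message__c": message,         # full fault or information text
        "Summary__c": message[:255],   # first 255 characters, for list views etc.
        "Type__c": log_type,           # "Error" for LogError, "Info" for LogInfo
    }
```

The only thing that changes per flavour of flow is the value passed for `log_type`, which mirrors how only Type__c differs between the flows.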

That is all that is needed to create the flow.  Don’t forget to set the start element, shown with the green arrow.

LogFaultFlow3

Because there is only one element in the flow, the editor will give a warning when the flow is saved.  It’s okay; it will still run.

Using The Log Flows

Now that the LogFault flow (and its siblings, LogError and LogInfo) have been created, they can be used in new and existing flows.  I actually went back to all the flows in our production org and updated them to add not only fault handling, but also information messages.  I was very glad that I did, as it helped me solve a number of logic problems and field permission errors.

Taking the idea of adding logging to an existing flow, let’s start with the following very simple flow which checks if a contact exists, and if not creates a new one, before returning the Id of the new or existing contact.

LogFaultFlow4

This flow has two data access components; if a fault is raised by them, nothing is reported and nobody is any the wiser, save for maybe the original author of the flow, who might get a system-generated email.  We can do better than this!

The log flows are available in the palette and can be added to the flow.

LogFaultFlow5

The flow is transformed!   It still performs the same basic function, but we have two very important additional features.  The fault connections are wired up for the two data access components, so now if there is a problem we will have a record of it.  There is also a record of the beginning, middle and end of the flow’s progress.  This type of information is invaluable when troubleshooting a flow.

Here is the information which the flow logs when it is run under different conditions.

LogFaultFlow6

Looking at a single record, all the captured fields can be seen.

LogFaultFlow7

My favourite feature is the URL field, as it saves me so much time to be able to jump to the flow definition when reviewing the logs.  And because the process log is a Salesforce custom object, I am able to use it in reports and dashboards, and I’ve even hooked it up to a push topic on the Streaming API.  But that’s another story for another post.

I hope that these two posts have provided ideas around instrumenting your business processes, and I am sure that there is a lot more which can be done to build on this foundation.

Process Builder – Update Contact Address From Account

ZJ10

@ZacharyJeans is a Twitter machine (nearly 100K posts!).  So it’s good to see him roll up his sleeves and have a crack at Salesforce’s Trailhead learning challenges.

It can be hard work to get started, but social media can provide a lot of moral and practical support!

The actual problem that Zachary had to solve for his Trailhead challenge was the opposite of what I explained here.  In this example we have a contact which will refresh its address information from the parent account every time it is updated.  The Trailhead challenge called for updating all child Contacts whenever the account was updated.  Good job I didn’t do his homework for him!

Here is the walkthrough which I wrote at the time.  I found it very interesting to see where someone new to Process Builder started with the tool.  Hopefully these pointers helped; I think they did, as he tweeted later showing some of his new Trailhead badges!

Create New Process

Create the process and set it to fire on create & update of a Contact record.  Remember though that in a nice clean Salesforce environment, when the new Contact is displayed it defaults the address fields from the owning Account.  So this is more a scenario for updates.

ZJ1

Add Criteria

Depending on the scenario, you can set the filter conditions as needed.  Here I am working with just the street and postal code.  Thanks to the new feature in Process Builder, we can “OR” these conditions.

In a real situation my preference here would be to have a formula checkbox field outside of the process which I would use to model the criteria.  I’d call the field something like “Update Contact Address Required”.  But for now, let’s use the standard fields on the object.

ZJ2

Define The Update Action

With the criteria set, the branch to the action will be followed whenever they evaluate to true.

We need an action to “Update Records”, where the record will be the Contact record we are working with.  This can be slightly confusing when new to Process Builder, the trick is to click the name of the object and not get distracted by the list of the other fields:

ZJ4

Set Object Variables

Then set the Contact fields to come from the related Account fields.

Set each field, I have just updated the street and postal code.

ZJ5

Save, Activate, TEST TEST TEST!

You are going to feel so great when this runs and works first time!

Here’s my Account.

ZJ6

Here’s my new Contact with the street and postal code copied in by the process!

zj9

If it doesn’t work first time, then retrace your steps.  If it does run, don’t just go and have a coffee to celebrate, TEST SOME MORE!

Why?  Because there is a gotcha in here which will cause an error… can you find the cause?

ZJ8

After all, this is a computer program!

Good luck & enjoy the process…

 

…update.  The gotcha is that updating a contact which has no Account will cause an error.

The solution is to add an extra condition into the process filter to make sure that the update is only fired when an Account is listed for the current Contact.
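That extra guard condition can be sketched in Python for illustration only; this is not Salesforce code, and the function name is hypothetical, but the field names mirror the standard Contact and Account objects.

```python
# Illustration of the process logic with the extra filter condition:
# only copy address fields when the Contact has a parent Account.
from typing import Optional

def update_contact_address(contact: dict, account: Optional[dict]) -> dict:
    if account is None:
        # The gotcha: without this check, the update fires for a Contact
        # with no Account and the process errors out.
        return contact
    contact["MailingStreet"] = account.get("BillingStreet")
    contact["MailingPostalCode"] = account.get("BillingPostalCode")
    return contact
```

The early return plays the role of the added filter condition: when no Account is listed, the update action is simply never reached.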

Flow– Process Log : Part 1

I have been experimenting with this error handling pattern for my own flows for the last couple of months, and I have just co-presented it to a meeting of the London Salesforce Admins user group.  I shared the presentation with Salesforce MVP Mike Gill, who has been very active in applying Flow to solve lots of different problems.

My need for a structured approach to managing the errors thrown by a running flow came about from my work to engineer a lot of processing within my own Salesforce instance.  I didn’t have the happy experience of the flow running correctly first time; that’s never going to happen.  So there was a learning curve in using Flow, and I found myself struggling to uncover the reasons why parts of the overall process were failing.  There were also situations where the flow completed but I didn’t get the outcome I was expecting, such as order records not being created, or no invoice line items to accompany the invoice header.  I had no visibility into the progress of the flow as it ran, other than checking for record updates and creates and interpreting whether my logic was correct.

The muddle of my early flow development only started to fall into place when I began to record both flow fault messages and log messages reporting the state of the system at key milestones.

The solution is very simple; I don’t expect anyone to be impressed by the technology.  But if, like me, you find yourself wanting to track faults and successes from your flows, maybe this is a software pattern which can help you.

Logging Pattern – Custom Object

I need somewhere to store the messages sent from my Flows.  Where better to store and classify that data than in a custom object?  Here is what it looks like:

ProcessLog__c

The fields within the object allow me to perform useful functions with the Process Log when it’s displayed.

Type

Having different types of log entry will allow me to filter by Fault/Error/Information.

Message & Summary

The role of these fields is simple: capture the message text sent from the flow.  This could be the fault message text when something goes wrong, or it could be an informational message indicating that a milestone has been reached.

The Summary field is a shorter version of the Message long text field and will contain the first 255 characters of the message.  Having a simple text summary allows the start of the message (often the whole message) to be cleanly displayed in list views, and used in other places within Salesforce where long text fields are not supported, such as formulas and Streaming API topics.

Flow Origin, Flow Name & Flow Definition

The name of the flow is displayed by the Flow Name formula field, which substrings the value from the Flow Origin field.  There is a reason for this: flows have a couple of different ways to identify them.  In common with everything else in Salesforce, the flow has a unique Id of 15 characters; it also has a human-readable unique name.  The flow will put both values into the Flow Origin field in the format:

30024000000TQNv AA_Case_Contact_Selector

The Flow Name formula field can then extract the unique name using the formula:

MID(Origin__c,17, 238)

The Flow Definition field, a URL, was a late addition to the pattern but has proven to be very useful.  Using the Id portion of the Flow Origin value in combination with the URL of the Salesforce org provides a very quick and easy way to open the flow definition and start looking for the cause of the fault.

https://emea.salesforce.com/01I24000000Etqe
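The substring logic can be sketched in Python for illustration only (the function name and org URL default are my own; remember that MID() in the formula language is 1-indexed, so position 17 is index 16 here):

```python
# Sketch (not Salesforce code) of what the Flow Name formula and the
# Flow Definition URL do with the Origin__c value, which is stored as
# "<15-char Id><space><unique name>".

def parse_origin(origin: str, org_url: str = "https://emea.salesforce.com"):
    flow_id = origin[:15]      # characters 1-15: the flow's unique Id
    flow_name = origin[16:]    # equivalent to MID(Origin__c, 17, 238)
    return flow_name, f"{org_url}/{flow_id}"

# parse_origin("30024000000TQNv AA_Case_Contact_Selector") returns the
# human-readable name plus a clickable link to the flow definition.
```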

Displaying The Process Log

With the custom object set up and a tab added for it, I now have everything that I need to understand how my flow is behaving, and when it is misbehaving.

ProcessLog

Catch ALL The Faults

I have reviewed a lot of sample flows, and read a lot of blog posts written by other Salesforce users who have been making use of this technology.  Across all the examples, I am very surprised that the topic of fault handling is so often absent.  In any system errors are inevitable, and we need to plan for them happening.

Salesforce Flow provides what are called “faults” for many of the components which can be placed onto the design surface.  Any component which accesses the database (creates, updates, fetches) provides a fault connector.  Here’s the documentation from Salesforce about Flow fault connectors, so I don’t need to repeat it here.

My approach is to make sure that I use the fault connector for every component in my flow which can return a fault.  In order to do that in a way that is reusable I created a flow specifically for logging fault messages.  I even gave it a simple name “LogFault”.

In every flow that I write, I add a sub-flow call to my LogFault flow and make sure that all components which can raise a fault are wired up to it.

LogFault1

The contents of the LogFault flow could not be simpler: it’s a Record Create call to the Process Log custom object.

LogFault2

In the next post I will describe the anatomy of the LogFault flow, and describe how to make additional flows to LogError and LogInfo.

Flow ‘bugging

In celebration of #FlowFeb I describe the common problems I encountered when adopting Flow.  If you are starting out with Flow, I hope that you will find a few guiding principles below.

Next time I will go into more detail about how monitoring the progress of Flows helped me improve quality, and my productivity.

The Scenario

I’ve been writing Flows in earnest for about 6 months now, since the pilot/beta for Flow triggers came about and made it possible to run background processing.  In the beginning I was VERY lazy and didn’t bother to put any error handling into my Flows preferring to hope for the best when they ran.  In my defence let me say that the Flows were running in developer orgs or a very quiet corner of the Desynit live org, just to see what was possible.

Don’t Hope For The Best

My first piece of advice is to point out that hoping for the best and implementing business processes are not a healthy combination.  First of all, if a business process is supposed to happen and it doesn’t, somebody is going to be unhappy and start looking for you.  Second, if there is no alert or evidence that something went wrong it’s doubly bad (and they will realise eventually and come looking for you!).

My most immediate problem though was that my Flows weren’t running correctly first time, or for quite a few times during development and testing.  This was leaving me frustrated as I battled to develop my Flows.

Debug Log Monitoring

If I was going to make use of Flows seriously to solve real world problems for paying customers then I was going to have to up my efficiency and productivity, and that meant getting stuff working quicker.  I quickly found that it was possible to monitor Flows in near-real time by having the Developer Console open when I test ran the Flow.  Just as with debugging classic workflow and Apex, this was an easy way to capture the log and then read/search through it to find out what’s gone wrong.

I was able to use debug logs and the Developer Console to suss out the cause of Flow misbehaviour (“misbehaviour” is what I am choosing to call the results of my early slapdash Flow programming), and these logs helped me along the road of exploring the capability of Flow to model more complex business processes within our own live Salesforce org.

Handle All The Exceptions

Capturing logs worked well for development and testing, and taught me a lesson (as if it ever needed to be taught) that I needed to make sure I caught every exception which was raised in the Flow.  I ran into so many unexpected “unhandled exceptions” when first deploying my Flows.  The most common cause was that the Flow user did not have permission to update a field referenced in the Flow.

Ensure Correct Object Permissions

I banged my head against the wall many times, even with what I thought were the simplest of business processes, but Flows were just grinding to a halt because of field permissions.  That’s a whole other blog post, but take my advice and have a Permission Set for each of your Flows and make sure that the objects and fields in the Flow are set to read or write access as needed by the Flow.

Monitor Production

My next lesson came from trying to support my colleagues using the first version of an invoice processing Flow which I had written.  There were problems with field permissions (I had not taken my own advice to have a Permission Set for the Flow), and also data scenarios which the Flow was not yet set up to handle.  In some cases I had exception handling which was reporting the fault to the user on their screen; in other cases the fault was not handled, and the Flow died in front of the user with a generic message.  For all my energy and enthusiasm about Flow and what it was going to do for us, they were not very impressed.

Get The Truth From The Source

I had my first Flow in production but it wasn’t working smoothly.  The feedback that I was getting from the users varied in quality and detail, and I could not always be on hand to look over their shoulder and get the information needed to correct the problem.

What I needed was the ability to capture not just the fault message from the Flow, but any other information which would give me as much detail as possible, so that I could understand how far the Flow was getting before it failed.  I needed to instrument my Flows…

Search #FlowFeb on Twitter for a selection of posts from other Flow fans.

Flow – Create Task With Reminder

In this post I describe how I created a Salesforce Visual Workflow to create a new Task object which can optionally specify the date and time of the reminder value.

The Scenario

Let me start by putting this in context.  It’s a common customisation to add a couple of extra fields to the Opportunity page to capture the next step and next step date.  This makes it easy for the sales person to manage the progression of the deal, and for anyone to see at a glance where the opportunity is right now.

Of course, Salesforce already has a feature set to help manage to-do lists and meetings, which it stores as Tasks and Events.  Tasks can be used as reminders for things which need to be done in the future, and they come with all the functionality needed to pop-up reminders and sync with things like Outlook and mobile device calendars.

It would be very useful then, if a new task was created every time the next step information was updated so that we can combine the clarity and convenience of the capture fields with the power of a task to pop up a reminder and track that it gets done.

  • When the field “Next Step Date” is updated and saved, a new task is auto-created with the same date as the relevant next step date.
  • The subject of the task is set to the same as the next step field.
  • The reminder is set to 10:00am on the next step date.
  • The status is set to “In Progress”.

This change request came to me as part of some work that I was doing on the opportunity process.  Given that it had been hanging around for a while I thought that I would take it on and knock it out in double quick time.

Not so easy as it turns out.  Looking at the change request I could see that it had been logged some time ago but not been delivered.  Why could this be?

My first stop was to look at classic Salesforce Workflow, which is quick and easy to set up and provides the ability to create a task when a field value is updated.  But the task which is created doesn’t have the option to set the reminder date.  This is not a new problem, as this post on the Idea Exchange shows: https://success.salesforce.com/ideaView?id=08730000000BphDAAS

If classic workflow techniques aren’t going to work, what about the new Process Builder?  Surely all the excitement surrounding the recent arrival of the feature means that it can do everything and make you a sandwich when it’s done?!  Not quite.  There is no built-in support for “create a task” in Process Builder.  I found that the closest I could get was to use a Global Action to create a new Task record.  Guess what though: no ability to specify that a reminder is needed for the task.  Process Builder would let me create a new record in the Activity object and set the fields needed, but the reuse for this was zero.  Each Process, even each branch in the Process, would need the fields selected and defined each time.  I did not find that option very appealing.

That leaves two programming options for delivering this feature: use Visual Workflow (a.k.a. Flow) or Apex code.  Until recently only the latter would have been suitable, because the processing needed to detect the field change when the record was updated.  Now that the feature known as Trigger Ready Flows has been added by Salesforce, it is possible to create Flows which can be launched from classic workflow and Process Builder; the only condition is that the Flow doesn’t include user interface components.  This was my solution, so let’s take a look at how it was done.

The Solution

After a long description of why I needed to build this Flow, the steps needed to make it happen are very brief.  In Flow terms, it’s just one step.

TaskWithReminder2

Those of you familiar with Flow may well be asking yourselves why I would bother writing a blog post for something so trivial.  Well, there is still a twist in our tale, and it has to do with the format of the reminder date.

I added input variables for the reminder to my Flow as follows:

  • Reminder Date (Date)
  • ReminderTimeHours (Number, scale 0)
  • ReminderTimeMinutes (Number, scale 0)

I added the other input variables needed for the creation of the task:

  • ActivityDate (Date)
  • IsReminderSet (Boolean)
  • OwnerId (Text)
  • Priority (Text)
  • Status (Text)
  • Subject (Text)
  • WhatId (Text)
  • WhoId (Text)

Creating the task record will provide a value for the Id of the new record, and we will need a variable to store that.  I made this an output variable in order to make it available to the caller of the Flow if they need it.

  • TaskId (Text)

With the variables all created it is a simple task to map the inputs and output to the parameters of the record create in the Flow.

TaskWithReminder3

The only field on the list that I haven’t described is CalculatedReminderDateTime, which is used to set when the reminder is due.  We have the values we need to set it, but they are not yet in the right format.

The reminder date and time is stored as a single field using the DateTime data type.  I wanted my solution to make it as easy as possible to specify the reminder details when calling the Flow.  Not many fields in day-to-day Salesforce make use of the DateTime data type; simple date fields are more common.  I didn’t want to have to convert the formats outside of the Flow, so I decided that it would be best to pass the date and time components to my Flow as separate fields and then deal with the DateTime data type there.

The requirement stated that the reminder should be set at 10am on the date that the task is due.  This defines the values which will be passed to the Flow in this example as:

    • Reminder Date = 11-01-2015
    • ReminderTimeHours = 10
    • ReminderTimeMinutes = 00

Within the Flow we need to format the DateTime for these values as “2015-01-11 10:00”

Visual Workflow provides the ability to define formula fields which we can use to easily construct the DateTime value.

TaskWithReminder4

The formula is shown below; it just puts the parts together in the right order and returns a DateTime value:

DATETIMEVALUE(TEXT(YEAR({!ReminderDate})) & "-" & TEXT(MONTH({!ReminderDate})) & "-" & TEXT(DAY({!ReminderDate})) & " " & TEXT({!ReminderTimeHours}) & ":" & TEXT({!ReminderTimeMinutes}) & ":00")

The formula field itself is specified as a DateTime, and this field can then be mapped to the ReminderDateTime of the new Task.
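For illustration only, the same assembly can be sketched in Python (the function name is hypothetical): build the date-and-time text from the separate inputs, then parse it into a DateTime, just as the formula hands its text to DATETIMEVALUE.

```python
# Python sketch (not the Flow formula itself) of assembling the reminder
# DateTime from a separate date plus hours and minutes inputs.
from datetime import date, datetime

def reminder_datetime(reminder_date: date, hours: int, minutes: int) -> datetime:
    # Mirrors the formula: year-month-day, a space, then hours:minutes:00.
    text = (f"{reminder_date.year}-{reminder_date.month}-{reminder_date.day} "
            f"{hours}:{minutes}:00")
    return datetime.strptime(text, "%Y-%m-%d %H:%M:%S")
```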

That’s it! Don’t forget to mark the single flow element as the start of the flow.  When you save the Flow it will warn you that there is only one step; this does not cause a problem for the operation of the flow.

Using The Flow

The new Flow can be called from a Flow Trigger or from the new Process Builder, which is what I used.  I actually needed two processes, one to process when the Opportunity is created and one for when the Opportunity is updated.  In both scenarios the Flow is called and the values from the Opportunity are used to create the task.

TaskWithReminder5