Setting up an export profile from RE to LO.

Hi friends. We need to set up an export profile from RE to LO and could use help finding a starting spot. We are finding a lot of data-matching issues between the two systems, and the one-way constituent import from LO to RE isn't enough.


I presume we need to send up the following fields:

  • RE Constituent ID
  • LO Alias
  • First Name
  • Last Name
  • Email Address
  • Email opt in status


What else should we be sending into LO to make our data better?


Would anyone be willing to export their data file map and share it with me?


I am also curious to know when you run this export: before or after your daily set of imports?


THANKS IN ADVANCE! :)


Ashley

We have a nightly scheduled sync from RE to LO that kicks off at 2am. It picks up all constituent records changed the day before, and since we can't tell what has changed (only that something changed), we export all standing info. The field mapping is attached.
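Robert's change window above (pick up everything changed the previous calendar day) can be sketched roughly like this. This is a hypothetical illustration, not Robert's actual job; the record shape and the `last_changed` field name are assumptions:

```python
from datetime import date, timedelta

def changed_yesterday(records, today=None):
    """Return constituent records whose last-changed date falls on the
    previous calendar day, the window a 2am nightly job would sweep up."""
    today = today or date.today()
    yesterday = today - timedelta(days=1)
    return [r for r in records if r["last_changed"] == yesterday]
```

Since the job can only see *that* a record changed, not *what* changed, everything matching the window is exported in full.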


Hope this is helpful.


Robert


Ashley, this is a great question - the answer really depends on your primary use of LO and how you use it. For us, the primary driver for exporting from RE to LO is that LO is our primary email communication tool with our constituent base (i.e. when we want to send out a few hundred thousand emails, we use LO).


I will list our export fields below; however, here are a few things to note:

1. Our export is nothing more than SQL code (a bit complex) based on the Omatic data format standard, which works with the LO connector. This (like all of our daily imports/exports) runs automatically without user intervention.

2. We only include RE constituent records based on:

 a. Updated/changed/added since the last time the export ran

 b. Records which have a primary email address

 c. We specifically EXCLUDE any record whose last update was by any of our standard import accounts. We also do not import from LO to RE if the LO record was last updated by our export account; this ensures we are not thrashing our integration process.

 d. We exclude our general 'RE Facebook' donation record.

3. Our exports run at 2000hrs each evening and our imports run at 0100 each morning.  

4. By default, we limit the export to 20K/day M-F and 5K/day Sat-Sun. Another process runs at 1600hrs each day to alert us if any of these thresholds are exceeded. This is a safety check (it has been valuable in a couple of instances) in case one of our data team does a large import without using our standard RE import account. As an example, we really do not want to accidentally export 500K+ records from RE to LO when the only thing changed was a minor attribute not affecting any LO processes.

5. We have a separate constituent export that runs nightly and processes 20K constituent records at a time; it simply runs through our database 20K at a time from start to finish, then starts over. This ensures accuracy between the two systems as, sometimes, 'stuff' happens and an individual record update may fail.
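The selection rules and the daily-cap safety check above could be combined into something like the sketch below. The account names, field names, and cap-handling (raising an alert instead of exporting) are all hypothetical; Dennis's actual implementation is SQL, not Python:

```python
from datetime import datetime

# Hypothetical integration account names; records last touched by these
# are excluded so the export/import loop does not thrash itself.
IMPORT_ACCOUNTS = {"re_import_svc", "lo_export_svc"}
WEEKDAY_CAP, WEEKEND_CAP = 20_000, 5_000

def select_for_export(records, last_run, today):
    """Apply the selection rules: changed since the last run, has a
    primary email, and not last touched by an integration account.
    Refuse to export (alert instead) if the daily cap is exceeded."""
    picked = [
        r for r in records
        if r["last_changed"] > last_run
        and r.get("primary_email")
        and r["last_changed_by"] not in IMPORT_ACCOUNTS
    ]
    cap = WEEKDAY_CAP if today.weekday() < 5 else WEEKEND_CAP
    if len(picked) > cap:
        # Safety check: better to alert a human than to silently push
        # a huge batch caused by a non-standard bulk import.
        raise RuntimeError(f"export of {len(picked)} exceeds daily cap of {cap}")
    return picked
```

The cap check mirrors item 4: a large unexpected batch is treated as a red flag rather than exported.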


Here are the fields we export:

LO ID

First Name

Last Name

Middle Name

Maiden Name

Nickname

Gender

Title

Birthdate

Deceased

Accept Email

Address Street1

Address Street2

City

State

Zip

Country

Accepts Postal Mail

Email

Phone

Ethnicity

Marital Status

First Gift Amount

First Gift Date

Last Gift Amount

Last Gift Date


Hope it helps - Dennis, dennis@ladnierfamily.com




@Robert Badley, thank you so very much!

@Dennis Ladnier thank you!!!

Dennis, you have a very impressive schedule for importing and exporting LO data to RE. 


I am wondering if I could ask how you handle duplicate records identified in either RE or LO? Specifically, have you come up with any process for ensuring that duplicates identified and merged in one system are also merged in the other, or is it a manual merge in both? This is a huge task that we never seem to be able to get a handle on. This is the only feature of RELO that I really liked.


Robert

ugh - duplicate records - the bane of 'data integrity and quality'. Obviously we try to minimize dups up front; however, to be honest, our efforts are probably similar to what everyone else does.


As far as merging, we do the following:

1. Standard Omatic Duplicate/Merge 

2. One-off merges when dups are identified

3. Daily scan of overall 'data inconsistencies/errors' with corrections, which then feed into the standard Omatic or one-off merging.


We currently maintain a 'many to one' relationship from LO to RE, using the LO ID # as the RE alias, hence our RE records can match to more than one LO record - again, probably similar to what most others do.


Every few months I run a Python script which scrapes the LO web page, emulating the manual keystrokes within LO, to merge records which were already merged in RE. Suffice to say, we are FAR FAR behind with LO merging. I do so miss the merging in both RE and LO which RELO provided!!
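The planning half of that catch-up step (deciding *which* LO records still need merging after an RE-side merge) might look like the sketch below. Everything here is hypothetical: it assumes an RE merge log of (winner, loser) constituent ID pairs and a map of RE ID to LO alias IDs, and it makes no real RE or LO API calls:

```python
def plan_lo_merges(merge_log, alias_map):
    """From an RE merge log of (winner_id, loser_id) pairs and a map of
    RE ID -> list of LO IDs, list the LO-side merges still to perform:
    every LO record aliased to the losing RE record should be merged
    into the winner's first LO record."""
    plan = []
    for winner, loser in merge_log:
        target = alias_map[winner][0]          # surviving LO record
        for lo_id in alias_map.get(loser, []):  # LO aliases of the merged-away record
            plan.append((target, lo_id))
    return plan
```

Each (target, duplicate) pair in the plan would then be fed to whatever actually performs the merge in LO, which in Dennis's case is the screen-scraping script.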


Dennis




Dennis, can I ask if you have any plans to move to RE:NXT and, if so, how you will manage your nightly syncs?


Robert

Dennis, 


Hello! We have an export that we use to make updates to address and many of the other fields you mention in your export. What we have noticed about the address lines, however, is that if we have a one-line update that replaces a two-line address in Luminate, the second line is not overwritten with a blank but persists, so we end up with an incorrect address in Luminate. (e.g., Line 1: Apartment 2, Line 2: 123 Lark Lane replaced with Line 1: 456 Second Avenue would appear in Luminate as Line 1: 456 Second Avenue, Line 2: 123 Lark Lane.) We submitted this issue to Omatic, and they said they are not able to overwrite a field that contains data in Luminate with a blank or null. Has your organization encountered this issue? If so, how have you addressed it? I would so appreciate your insight on this problem!



Deborah, that's interesting - let me do a specific test to see if we are also seeing this issue.  Thanks, Dennis dladnier@ladnierfamily.com

Deborah, I will spare you the details of control vs. test record testing, etc. The net is that I confirmed the error you outlined; in fact, we also see this issue but had never noticed. THANK YOU for sharing and making us aware.


In our case we may not have noticed because we do not use the LO postal address 'street 2' for anything, and when performing data integrity checks between LO and RE we check 'street 1' and not 'street 2'. With that said, there would be obvious issues when LO (with now-outdated 'street 2' data) updates RE, which would cause integrity concerns.


I also tested this by sending 'street 2' as a space (" ") instead of a blank (e.g. case when isnull(<your street 2 RE field>, '') = '' then ' ' else <your street 2 RE field> end) - the result was that LO 'street 2' was not changed.


I then tested the standard LO import and, as it should, it blanked LO 'street 2'. This indicates a standard CSV LO import can and does blank 'street 2', hence we have a fallback plan that can be used to clean up the incorrect LO 'street 2' data.
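Generating the cleanup file for that fallback could be as simple as the sketch below: for every record whose RE address has no second line, emit a row with an explicitly empty street 2 so the standard CSV import blanks it. The column names here are hypothetical, not the actual LO import layout:

```python
import csv
import io

def cleanup_csv(records):
    """Write a minimal LO import file that explicitly blanks 'street 2'
    for records whose RE address has only one line."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["lo_id", "street1", "street2"])
    writer.writeheader()
    for r in records:
        if not r.get("street2"):  # RE says there is no second address line
            writer.writerow({"lo_id": r["lo_id"],
                             "street1": r["street1"],
                             "street2": ""})
    return buf.getvalue()
```

Records that legitimately have a second address line in RE are skipped, since the connector keeps those in sync correctly.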


It is odd that Omatic indicates they cannot fix this issue. It could be that the problem is actually within the LO WSDL; however, the XML schema shows the field can be nil. It's been a while since I used/developed against the LO WSDL, but it may be interesting to confirm the issue is actually on the LO side. I assume Omatic has already confirmed and is working/escalating within LO technical support to resolve it (assuming that is what the issue actually is).


Our temporary solution will be to manually bulk update all RE records which do not have a second address line every month or so, and hope the issue is eventually resolved.


Again, thanks for letting us know.


Dennis Dennis@ladnierfamily.com


Dennis,


Thanks so much for the detailed response and description of your temporary solution! I am glad to know that raising this issue helped you identify and address it for your organization.


Unfortunately, the response from Omatic support was to suggest that I enter this issue on their IOM user voice page so it can be seen by developers and perhaps added to an upcoming Omatic release. This response may mean that they have identified that the issue is on their side and do not plan to fix it unless there is user support for a fix. I found only one topic on the ImportOmatic page concerning overwriting with nulls/blanks, so either many people are choosing not to export, are not including address in their exports, or, worse, also don't know that this is happening.


We did some research on the Blackbaud side and found that the RELO Connector had a similar issue some time ago and fixed it as a defect in their September 2017 release (https://kb.blackbaud.com/articles/Article/108451), so I do hope that Omatic will respond quickly to do the same.


If you would like to see or add votes to the user voice entries concerning the ability to overwrite existing data, the links are below. 


IOM Connector: https://iom.uservoice.com/forums/599935-importomatic-connectors/suggestions/39585742-luminate-connector-overwrite-existing-data-with


IOM: https://iom.uservoice.com/forums/18118-importomatic/suggestions/8342820-overwrite-existing-values-with-blank-null


Thanks so much again for your feedback and help!


Regards,

Deb


Deb, for a company whose business is 'integration', it is unfortunate that a known defect which affects customers' critical business data requires customer 'votes' to resolve. I am sure, though, that there are valid business reasons (i.e. prioritization of limited resources, financial ROI for the fix, etc.).


Anyway, it seems like we have a workaround.


Thanks, Dennis

Update: We raised the issue with our customer success contact; she saw our point of view and took the matter to product development. They are now working on a fix - the user voice issue now reflects a Planned status - and suggested this workaround in the interim:

 

- Add a dictionary to Address Line 2 that replaces [blank] with some sort of special character, like a dash.

- Create a query in LO that searches the “Street” field for values that equal (not just contain) that character.

- Work through the results of the query to manually remove the second address line from those records, or use a global change option if one is available in LO.
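The dictionary-plus-query workaround above boils down to a sentinel substitution, which could be sketched like this (the dash sentinel comes from their suggestion; the function names are hypothetical):

```python
SENTINEL = "-"  # the special character the dictionary substitutes for a blank

def apply_dictionary(street2):
    """Mimic the suggested IOM dictionary: replace a blank Address Line 2
    with a sentinel so the value survives the export and is queryable in LO."""
    return SENTINEL if not (street2 or "").strip() else street2

def needs_cleanup(lo_street2):
    """LO-side query condition: the field must *equal* the sentinel,
    not merely contain it, so real addresses with dashes are untouched."""
    return lo_street2 == SENTINEL
```

The equals-not-contains check is the important detail: it keeps legitimate second lines like "Building A - rear" out of the cleanup results.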

 

Just thought I'd pass on their recommendation in case it is of any use in addition to your current workaround. Thanks again!

 

Regards,

Deb

Deb, thanks for the update!  Dennis
