CSV Import Tool
complete
Eugenio
complete
The feature is now available in DaDaBIK 11 Elba!
Eugenio
planned
Don Worth
I'm in the process of migrating several tables from EasyNetSites to DaDaBIK. In every case I am loading .csv files into empty tables, so update and delete are not an issue for me. I've used MySQL Workbench to do CREATE TABLE and LOAD DATA LOCAL INFILE for each table. The biggest problems I had involved converting fields that contained fixed responses (from a dropdown or checkboxes) from ENS's format: e.g. turning "California,Washington,Oregon" into "~California~Washington~Oregon~", then creating an external lookup table of responses and replacing the strings with their row numbers in the lookup table ("~5~17~24~"). I also had to convert date formats for date fields, as ENS used mm/dd/yyyy. Most of this was possible with SQL statements, except for the select-multiple fields in the library locations and subject areas, which required writing a program to massage the data before it could be loaded.
For me, the ability to load a newly created DaDaBIK database (created in the DATA section of DaDaBIK) from a .csv file is most important (and really an absolute requirement for customers who aren't starting from scratch with their data but are migrating from another product or have been working with Excel databases). Learning SQL and using MySQL Workbench is a pretty big barrier for many of my colleagues who are coming from EasyNetSites, which is meant to shield people from having to be technical. If you could start with a quick .csv load into an empty table, you would probably satisfy the majority of users.
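A minimal Python sketch of the kind of data massaging described above, assuming the lookup table has already been created and mapped to row numbers; the field contents, the lookup mapping, and the function names are illustrative only, not part of any existing tool:

```python
from datetime import datetime

# Hypothetical lookup table: option label -> row number in the external lookup table.
LOCATION_LOOKUP = {"California": 5, "Washington": 17, "Oregon": 24}

def ens_multivalue_to_ids(value, lookup):
    """Turn 'California,Washington,Oregon' into '~5~17~24~' via the lookup table."""
    labels = [label.strip() for label in value.split(",") if label.strip()]
    return "~" + "~".join(str(lookup[label]) for label in labels) + "~" if labels else ""

def ens_date_to_iso(value):
    """Convert an ENS mm/dd/yyyy date into the yyyy-mm-dd format MySQL expects."""
    return datetime.strptime(value, "%m/%d/%Y").strftime("%Y-%m-%d")

print(ens_multivalue_to_ids("California,Washington,Oregon", LOCATION_LOOKUP))  # ~5~17~24~
print(ens_date_to_iso("07/04/2019"))                                           # 2019-07-04
```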
Eugenio
If anyone has additional suggestions or comments about this feature, which is still only in a preliminary design phase, you are welcome to post them here.
Eugenio
I am thinking about it these days. The general idea is that, for each table, you could import a CSV/Excel file and the records would be inserted based on a match between the field names in the table and the names in the heading (first row of the file).
The values should be validated according to the settings you have in the form configurator.
The insert process is straightforward; the update and delete aspects are less obvious: if I import a row with the same ID as an already existing row, the existing row should probably be updated.
Maybe the best solution is to check whether the ID column is available in the CSV and, if so, whether it's non-blank and the corresponding record is already in the database: in that case execute an update, in all other cases an insert.
Maybe we would also need a "delete existing records if no match is found" option.
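A rough sketch of that decision logic, assuming the rows arrive as dicts keyed by the CSV heading; the function name, the "id" default, and the delete_unmatched flag are illustrative, not DaDaBIK's actual API:

```python
def plan_import(rows, existing_ids, id_column="id", delete_unmatched=False):
    """Decide, per CSV row, whether to insert or update; optionally delete unmatched records.

    rows: list of dicts keyed by the CSV heading (first row of the file).
    existing_ids: set of ID values already present in the table (as strings).
    """
    seen, actions = set(), []
    for row in rows:
        row_id = (row.get(id_column) or "").strip()
        if row_id and row_id in existing_ids:
            actions.append(("update", row))   # non-blank ID that matches an existing record
            seen.add(row_id)
        else:
            actions.append(("insert", row))   # no ID column, blank ID, or unknown ID
    if delete_unmatched:
        # The optional "delete existing records if no match is found" behaviour.
        actions.extend(("delete", missing) for missing in existing_ids - seen)
    return actions
```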
Frank R. Wölffel
Eugenio:
Exactly what I need!
If there is an ID and no match => insert
ID and a match => update
ID in the database but no entry in the CSV => delete
or, for security reasons, with a keyword:
ID and the keyword "yes" in a "delete" column => delete
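A small sketch of that per-row decision, keyword included; the "delete" column name and the "yes" keyword are just the ones from the example above:

```python
def row_action(row, existing_ids, id_column="id"):
    """Return 'insert', 'update' or 'delete' for one CSV row, following the rules above."""
    row_id = (row.get(id_column) or "").strip()
    if row_id and (row.get("delete") or "").strip().lower() == "yes":
        return "delete"                # ID plus the keyword "yes" in the "delete" column
    if row_id and row_id in existing_ids:
        return "update"                # ID with a matching record in the database
    return "insert"                    # ID with no match (or no ID at all)
```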
LurkingKiwi
Eugenio: I think you need to explicitly detail whether or how you support the various "features" of C(haracter)SV, such as embedded line breaks (CRLF), whether quotes are required for text fields, which characters are available as separators and quotes, etc.
Also, for select-single fields, mapping of strings to foreign key values.
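For reference, a short sketch of the dialect handling being asked about, using Python's standard csv module (a quoted field may contain the separator and an embedded CRLF) together with a simple string-to-foreign-key mapping for a select-single field; the column names and the mapping are made up:

```python
import csv
import io

# A quoted field may contain the separator and an embedded line break (CRLF).
sample = 'id,state,notes\r\n1,California,"first line\r\nsecond line, with a comma"\r\n'

# Hypothetical mapping from the label in the CSV to the foreign key value stored in the database.
STATE_TO_FK = {"California": 5, "Washington": 17, "Oregon": 24}

reader = csv.DictReader(io.StringIO(sample, newline=""), delimiter=",", quotechar='"')
for row in reader:
    row["state"] = STATE_TO_FK[row["state"]]   # select-single: label -> foreign key value
    print(row)                                  # notes still contains the embedded line break
```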
Eugenio
LurkingKiwi: if you check the current CSV import procedure (which at the moment only works for a new installation) you will see that those details are already handled: you can choose separators and (optional) quotes.
I was also thinking about adding, for each field, an optional callback function that transforms the CSV value into a corresponding value for the database (changing the format, for example, if needed); this could also cover the needs related to select-single fields.
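A minimal sketch of that callback idea, assuming callbacks are simply registered per field name and applied to each row before validation and insert; none of these names come from DaDaBIK itself:

```python
from datetime import datetime

# Optional per-field callbacks: each one turns the raw CSV string into the value to store.
CALLBACKS = {
    "birth_date": lambda v: datetime.strptime(v, "%m/%d/%Y").strftime("%Y-%m-%d"),
    "state": lambda v: {"California": 5, "Washington": 17, "Oregon": 24}[v],  # select-single -> FK id
}

def transform_row(row):
    """Apply the registered callback to each field; fields without one pass through unchanged."""
    return {field: CALLBACKS.get(field, lambda v: v)(value) for field, value in row.items()}

print(transform_row({"first_name": "Ada", "birth_date": "12/10/1815", "state": "Oregon"}))
# {'first_name': 'Ada', 'birth_date': '1815-12-10', 'state': 24}
```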
Frank R. Wölffel
Eugenio:
We also have to think about select-single fields with hard-coded options.
Eugenio
Frank R. Wölffel: Yes, the values need to be validated before inserting.
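A tiny sketch of that check for a field with hard-coded options (the option list is just an example):

```python
ALLOWED_STATUS = {"open", "closed", "pending"}   # hard-coded options of a select-single field

def validate_status(value):
    """Reject any CSV value that is not one of the hard-coded options before inserting."""
    if value not in ALLOWED_STATUS:
        raise ValueError(f"invalid value {value!r}, expected one of {sorted(ALLOWED_STATUS)}")
    return value
```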