FrontBase is now available for **free**. Please see the FrontBase website [[http:~~/~~/www.frontbase.com>>url:http://www.frontbase.com||shape="rect"]] for details.

----

**A first experience with FrontBase, WO 5.3.1 and Mac OS X 10.4.6**

Make sure you have the FrontBase JDBC connector accessible from your WebObjects applications. I placed mine in /Library/Java/Extensions/.
Make sure that FrontBasePlugIn.framework is located in /Library/Java/Frameworks.
Make sure that you have added the framework to your project, or you will not be able to establish a connection to your database, even though you'll be able to see your database from EOModeler.
If you haven't yet assigned a user/password combination to your database, you must still include user information for the connection to be established. Use _system to get started. The URL for a basic connection is: jdbc:FrontBase:~/~/localhost/my_database_name
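You can verify the same connection from the sql92 command-line tool that ships with FrontBase (my_database_name is a placeholder, as above; these are the same statements the backup script further down this page pipes into sql92):

{{code language="sql"}}
-- inside /Library/FrontBase/bin/sql92; _system initially has no password
CONNECT TO my_database_name USER _system;
-- ... run queries, inspect tables, etc. ...
DISCONNECT CURRENT;
{{/code}}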

The documentation available at the FrontBase site is very complete. The only gotcha was that I was left with the impression that the framework was only necessary for WebObjects 4.5 and below. This is not the case: you have to install and include it for WO 5 projects too.

----

**Useful SQL commands in FrontBase**

{{code language="sql"}}

-- writes an ascii backup of your database
write all output(dir='/path/to/export/dir', type='FrontBase', content=true);

-- creates a database from the files exported by the previous command
SCRIPT <path-to-export-directory>/schema.sql;

-- renames a table
alter table name Item to ItemArchive;

{{/code}}

If you don't use type='FrontBase', you end up losing precision on your timestamp columns, which can really suck.
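These can be typed interactively in sql92, or piped in from a file the same way the backup script below does (commands.sql is just a placeholder name):

{{code}}
/Library/FrontBase/bin/sql92 < commands.sql
{{/code}}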

----

**Delimited Export From FrontBase**
Out of the box FrontBase doesn't provide a pure delimited file export. But it comes close, and with a little effort you can have a delimited export that you can import into a different database. This script creates a tab-delimited backup of your database.

{{code}}

#!/bin/bash

# The name of the database to back up must be the first argument to the script.
# The backup output directory (ABSOLUTE PATH!) must be the second argument.

if [ "$#" -lt "2" ]; then
    echo "Usage:"
    echo "$0 <DATABASE_NAME> <BACKUP_OUTPUT_DIRECTORY_ABSOLUTE_PATH>"
    exit 1
fi

mkdir -p "$2"

# assemble the SQL commands for a FrontBase ascii backup
echo "CONNECT TO $1 USER _system;" > backup_$1.sql
echo "WRITE ALL OUTPUT(DIR='$2', CSEP='\t', RSEP='\n', CONTENT=TRUE);" >> backup_$1.sql
echo "DISCONNECT CURRENT;" >> backup_$1.sql

# run the backup
/Library/FrontBase/bin/sql92 < backup_$1.sql

rm backup_$1.sql

# convert the exported files to plain delimited format
for FILE in "$2"/*_*; do
    # read table name (first line), dropping the _SYSTEM. schema prefix
    TABLE=`head -n 1 "$FILE"`
    TABLE=`echo $TABLE | sed 's/_SYSTEM.//g'`
    echo $TABLE

    # skip FrontBase's internal tables
    if [[ $TABLE == BLACK_AND_WHITE_LIST* ]]; then
        rm -f "$FILE"
        continue
    fi

    if [[ $TABLE == INFORMATION_SCHEMA* ]]; then
        rm -f "$FILE"
        continue
    fi

    # read number of columns (second line)
    NUMCOLUMNS=`head -n 2 "$FILE" | tail -n 1`

    COLUMNS=""

    # read the list of columns (one per line, quoted) and assemble into one csv line
    for ((i=1; i <= $NUMCOLUMNS; i++)); do
        LINENUM=$((i + 2))
        LINE=`head -n $LINENUM "$FILE" | tail -n 1`
        COLUMN=`echo $LINE | cut -d\" -f 2`
        COLUMNS="${COLUMNS},${COLUMN}"
    done

    # remove leading comma from column list
    COLUMNS=`sed 's/^.//' <<<"$COLUMNS"`
    # replace commas with tabs in the header
    COLUMNS=`tr ',' '\t' <<<"$COLUMNS"`

    OUTFILE="$2/${TABLE}.tab"

    # write the header with column names to the file
    echo "$COLUMNS" > "$OUTFILE"

    # delete the column list from the file and escape double quotes.
    # if you are manually escaping text, then remove the part: | sed 's/"/""/g'
    DELETELINES=$(($NUMCOLUMNS + 2))
    sed "1,${DELETELINES}d" "$FILE" | sed 's/"/""/g' >> "$OUTFILE"

    rm -f "$FILE"
done

echo ""
echo "The backup files have been written to $2"
echo ""

{{/code}}
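A hypothetical invocation, assuming you saved the script as fbexport.sh (the name is mine):

{{code}}
chmod +x fbexport.sh
# first argument: database name; second: ABSOLUTE path to the output directory
./fbexport.sh MyDatabase /Users/me/fb-export
{{/code}}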

There are a few caveats with this backup:

1. FrontBase's delimited export doesn't do any quoting or escaping as far as I know. The script above handles embedded quotes, but that is all. So if your data has embedded line breaks, tabs, or double quotes, you will need to deal with those manually. The easiest way is to edit the data in the DB before backing up: first run a query to escape double quotes so that " becomes "", then run a query to surround the data with double quotes (see the sketch after this list). These changes will make the export file valid but won't alter the data once it's imported.
1. FrontBase's export pads CHAR (fixed-length) fields with zero bytes (0x0), which are not allowed by Postgres' import. You can alter your schema to use VARCHAR instead before exporting, or just strip the zero bytes if you know that it won't mess up any other data you have. You can strip them after exporting by running the command shown after this list. I don't know if this works with binary data.
1. There may be issues with character encoding.
1. There may be issues with timestamp precision. Is type='FrontBase' needed?

{{code}}
tr -d '\0' < MyTable.tab > MyTableStripped.tab
mv MyTableStripped.tab MyTable.tab
{{/code}}
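For the first caveat, the pre-export cleanup queries might look roughly like the sketch below. The table and column names are made up, and REPLACE() is an assumption on my part, so verify that your FrontBase version provides it before relying on this (the || concatenation is standard SQL92):

{{code language="sql"}}
-- hypothetical table and column; REPLACE() is assumed, not verified
-- escape embedded double quotes: " becomes ""
UPDATE Article SET body = REPLACE(body, '"', '""');
-- then surround each value with double quotes
UPDATE Article SET body = '"' || body || '"';
{{/code}}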

----

**Migration to PostgreSQL**

Using the delimited export above, you can migrate a FrontBase database to PostgreSQL. This is an import script that does that:

{{code}}

#!/bin/bash

# NOTE: you will be prompted for your DB password, perhaps interrupting a very long process
# that would otherwise be automated; you can avoid this by setting the auth method to trust
# in pg_hba.conf or creating a .pgpass file in your home folder.

# The database must exist already. If it doesn't, create it by running: createdb "MyDatabase"

# This will only import the data, not the schema. The schema must already be defined.
# However, for better speed you may want to wait to define the indexes, foreign keys,
# and constraints until after the import is complete.

# If an error occurs in the import phase all changes will be rolled back; the DB will have
# no data and you can fiddle with things and try again.

if [ "$#" -lt "2" ]; then
    echo "Usage: $0 <database_name> <path_to_directory_with_tab_files>"
    exit 1
fi

DBNAME="$1"
OUTPUTDIR="$2"

cd "$OUTPUTDIR"

# generate import.txt, sequences.txt, vacuum.txt, and analyze.txt from the .tab files
function createScripts {
    rm -f import.txt
    rm -f sequences.txt
    rm -f vacuum.txt
    rm -f analyze.txt

    echo "SET CONSTRAINTS ALL DEFERRED;" >> import.txt
    echo "" >> import.txt

    for FILE in *.tab; do
        TABLE=`sed 's/\.tab$//' <<<"$FILE"`

        if [[ $TABLE == INFORMATION_SCHEMA* ]]; then
            continue
        fi

        echo "Preparing sql for $TABLE"

        COLUMNS=`head -n 1 "$FILE"`
        # replace spaces or tabs with commas in the header
        COLUMNS=`sed 's/[[:blank:]]/,/g' <<<"$COLUMNS"`

        echo "DELETE FROM ${TABLE};" >> import.txt
        echo "COPY ${TABLE}(${COLUMNS}) FROM '${OUTPUTDIR}/${FILE}' WITH DELIMITER E'\t' CSV HEADER;" >> import.txt
        echo "" >> import.txt

        # this assumes your primary keys are named ${TABLE}PK - the table name followed by "PK" -
        # and your sequences ${TABLE}_seq. If not, modify this, or reset the sequences manually later.
        echo "SELECT SETVAL('${TABLE}_seq', (SELECT MAX(${TABLE}PK) FROM ${TABLE}));" >> sequences.txt
        echo "" >> sequences.txt

        echo "VACUUM ${TABLE};" >> vacuum.txt
        echo "ANALYZE ${TABLE};" >> analyze.txt
    done
}

createScripts

# import all data in a single transaction, stopping on the first error
/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=on -a --username postgres --dbname ${DBNAME} --single-transaction --file "${OUTPUTDIR}/import.txt"
if [ $? -ne 0 ]; then
    exit 1
fi

/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=on -a --username postgres --dbname ${DBNAME} --file "${OUTPUTDIR}/vacuum.txt"
if [ $? -ne 0 ]; then
    exit 1
fi

/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=on -a --username postgres --dbname ${DBNAME} --file "${OUTPUTDIR}/analyze.txt"
if [ $? -ne 0 ]; then
    exit 1
fi

# sequence names may vary, so don't stop on errors here
/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=off -a --username postgres --dbname ${DBNAME} --file "${OUTPUTDIR}/sequences.txt"

echo "Done."
echo "Now add indexes, foreign keys, and constraints if you haven't yet."

{{/code}}
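A hypothetical run, assuming the script was saved as import.sh and the .tab files are in /Users/me/fb-export (both names are mine). Remember that the target database and its schema must exist first:

{{code}}
createdb "MyDatabase"
# load your schema into MyDatabase here (without indexes and constraints, for speed), then:
./import.sh MyDatabase /Users/me/fb-export
{{/code}}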

----

**Migration from MySQL**

In most cases the datatypes are exactly the same. Here are the FrontBase equivalents for the common MySQL datatypes that differ:

|=MySQL|=FrontBase
|TEXT|CLOB
|DATETIME|TIMESTAMP
|INT|INTEGER
|MEDIUMBLOB|BLOB
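
For example, a hypothetical MySQL table definition translates like this:

{{code language="sql"}}
-- MySQL
CREATE TABLE article (id INT, body TEXT, created DATETIME, attachment MEDIUMBLOB);

-- FrontBase equivalent
CREATE TABLE article (id INTEGER, body CLOB, created TIMESTAMP, attachment BLOB);
{{/code}}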