FrontBase is now available for **free**. Please see the FrontBase website [[http://www.frontbase.com]] for details.

----

**A first experience with FrontBase, WO 5.3.1 and MacOSX 10.4.6**

* Make sure you have the FrontBase JDBC connector accessible from your WebObjects applications. I placed mine in /Library/Java/Extensions/.
* Make sure that FrontBasePlugIn.framework is located in /Library/Java/Frameworks.
* Make sure that you have added the framework to your project, or you will not be able to establish a connection to your database, even though you will be able to see the database from EOModeler.
* If you haven't yet assigned a user/password combination to your database, you must still include user information for the connection to be established; use the default user _system to get started. The URL for a basic connection is: jdbc:FrontBase://localhost/my_database_name
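
As a quick sanity check that the database is reachable and that the _system user works, you can connect with the sql92 command-line tool that ships with FrontBase. This is a minimal sketch; my_database_name is a placeholder:

{{code}}

#! /bin/bash

# connect, then immediately disconnect; an error here points to a bad
# database name or user rather than a WebObjects configuration problem
/Library/FrontBase/bin/sql92 <<EOF
CONNECT TO my_database_name USER _system;
DISCONNECT CURRENT;
EOF

{{/code}}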

The documentation available at the FrontBase site is very complete. The only gotcha was that it left me with the impression the framework was only necessary for WebObjects 4.5 and below. That is not the case; you have to install and include it for WO 5 projects too.

----

**Useful SQL commands in FrontBase**

{{code value="sql"}}

-- writes an ASCII backup of your database
WRITE ALL OUTPUT(DIR='/path/to/export/dir', TYPE='FrontBase', CONTENT=TRUE);

-- creates a database from the files exported by the previous command
SCRIPT <path-to-export-directory>/schema.sql;

-- renames a table
ALTER TABLE NAME Item TO ItemArchive;

{{/code}}

If you don't use type='FrontBase', you end up losing precision on your timestamp columns, which can really suck.

----

**Delimited Export From FrontBase**

Out of the box, FrontBase doesn't provide a pure delimited file export. But it comes close, and with a little effort you can have a delimited export that you can import into a different database. The script below creates a tab-delimited backup of your database.

{{code}}

#! /bin/bash

# The name of the database to back up must be the first argument to the script.
# The backup output directory (ABSOLUTE PATH!) must be the second argument.

if [ "$#" -lt "2" ]; then
    echo "Usage:"
    echo "$0 <DATABASE_NAME> <BACKUP_OUTPUT_DIRECTORY_ABSOLUTE_PATH>"
    exit 1
fi

mkdir -p "$2"

# build a small SQL script that tells FrontBase to dump the database content
echo "CONNECT TO $1 USER _system;" > backup_$1.sql
echo "WRITE ALL OUTPUT(DIR='$2', CSEP='\t', RSEP='\n', CONTENT=TRUE);" >> backup_$1.sql
echo "DISCONNECT CURRENT;" >> backup_$1.sql

# run the backup
/Library/FrontBase/bin/sql92 < backup_$1.sql

rm backup_$1.sql

# convert the exported files to plain delimited format
for FILE in `ls $2/*_*`; do
    # read the table name (first line of the export file)
    TABLE=`head -n 1 "$FILE"`
    TABLE=`echo $TABLE | sed 's/_SYSTEM.//g'`
    echo $TABLE

    # skip FrontBase system tables
    if [[ $TABLE == BLACK_AND_WHITE_LIST* ]]; then
        rm -f "$FILE"
        continue
    fi

    if [[ $TABLE == INFORMATION_SCHEMA* ]]; then
        rm -f "$FILE"
        continue
    fi

    # read the number of columns (second line of the export file)
    NUMCOLUMNS=`head -n 2 "$FILE" | tail -n 1`

    COLUMNS=""

    # read the list of column names and assemble them into one comma-separated line
    for ((i=1; i <= $NUMCOLUMNS; i++)); do
        LINENUM=$((i + 2))
        LINE=`head -n $LINENUM "$FILE" | tail -n 1`
        COLUMN=`echo $LINE | cut -d\" -f 2`
        COLUMNS="${COLUMNS},${COLUMN}"
    done

    # remove the leading comma from the column list
    COLUMNS=`sed 's/^.//' <<<$COLUMNS`
    # replace the commas with tabs in the header
    COLUMNS=`echo "$COLUMNS" | tr ',' '\t'`

    OUTFILE="$2/${TABLE}.tab"

    # write a tab-delimited header with the column names to the output file
    echo "$COLUMNS" > "$OUTFILE"

    # delete the column list from the file and escape double quotes.
    # if you are manually escaping text, then remove the part: | sed 's/"/""/g'
    DELETELINES=$(($NUMCOLUMNS + 2))
    sed "1,${DELETELINES}d" "$FILE" | sed 's/"/""/g' >> "$OUTFILE"

    rm -f "$FILE"
done

echo ""
echo "The backup files have been written to $2"
echo ""

{{/code}}
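
A minimal usage example, assuming the script above is saved as fb_tab_export.sh (both the database name and the output directory are placeholders; the output path must be absolute):

{{code}}

chmod +x fb_tab_export.sh
./fb_tab_export.sh my_database_name /tmp/fb_backup

{{/code}}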

There are a few caveats with this backup:

1. FrontBase's delimited export doesn't do any quoting or escaping as far as I know. The script above handles embedded quotes, but that is all. So if your data has embedded line breaks, tabs, or double quotes, you will need to deal with that manually. The easiest way is to edit the data in the DB before backing up: first run a query to escape double quotes so that " becomes "", then run a query to surround the data with double quotes (see the sketch after this list). These changes will make the export file valid but won't alter the data once it's imported.
1. FrontBase's export pads CHAR (fixed-length) fields with zero bytes (0x0), which are not allowed by Postgres' import. You can alter your schema to use VARCHAR instead before exporting, or just strip the zero bytes if you know that it won't mess up any other data you have. You can strip them after exporting by running this command:

{{code}}

tr -d '\0' < MyTable.tab > MyTableStripped.tab
mv MyTableStripped.tab MyTable.tab

{{/code}}

1. I don't know if this works with binary data.
1. There may be issues with character encoding.
1. There may be issues with timestamp precision. Is type='FrontBase' needed?
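
Here is a minimal sketch of the pre-export cleanup described in caveat 1, driven through sql92 the same way the backup script works. The table Item and column description are hypothetical, and the sketch assumes your FrontBase version provides a REPLACE string function (check the FrontBase SQL reference before relying on it):

{{code}}

#! /bin/bash

# ASSUMPTIONS: Item and description are placeholder names, and a REPLACE
# string function is available in your FrontBase version.
/Library/FrontBase/bin/sql92 <<EOF
CONNECT TO my_database_name USER _system;
-- escape embedded double quotes: " becomes ""
UPDATE Item SET description = REPLACE(description, '"', '""');
-- surround each value with double quotes so the delimited file stays valid
UPDATE Item SET description = '"' || description || '"';
COMMIT;
DISCONNECT CURRENT;
EOF

{{/code}}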

----

**Migration to PostgreSQL**

Using the delimited export above, you can migrate a FrontBase database to PostgreSQL. This import script does that:

{{code}}

#! /bin/bash

# NOTE: you will be prompted for your DB password, perhaps interrupting a very long process
# that would otherwise be automated; you can avoid this by setting the auth method to trust
# in pg_hba.conf or by creating a .pgpass file in your home folder.

# The database must exist already. If it doesn't, create it by running: createdb "MyDatabase"

# This will only import the data, not the schema; the schema must already be defined.
# However, for better speed you may want to wait to define the indexes, foreign keys,
# and constraints until after the import is complete.

# If an error occurs in the import phase all changes will be rolled back; the DB will have
# no data and you can fiddle with things and try again.

if [ "$#" -lt "2" ]; then
    echo "Usage: $0 <database_name> <path_to_directory_with_tab_files>"
    exit 1
fi

DBNAME="$1"
OUTPUTDIR="$2"

cd "$OUTPUTDIR"

function createScripts {

    rm -f import.txt
    rm -f sequences.txt
    rm -f vacuum.txt
    rm -f analyze.txt

    echo "SET CONSTRAINTS ALL DEFERRED;" >> import.txt
    echo "" >> import.txt

    for FILE in `ls *.tab`; do
        TABLE=`sed 's/.tab$//' <<<$FILE`

        if [[ $TABLE == INFORMATION_SCHEMA* ]]; then
            continue
        fi

        echo "Preparing sql for $TABLE"

        # read the tab-delimited header and turn it into a comma-separated column list
        COLUMNS=`head -n 1 $FILE | tr '\t' ','`

        echo "DELETE FROM ${TABLE};" >> import.txt
        echo "COPY ${TABLE}(${COLUMNS}) FROM '${OUTPUTDIR}/${FILE}' WITH DELIMITER E'\t' CSV HEADER;" >> import.txt
        echo "" >> import.txt

        # this assumes your primary keys are named ${TABLE}PK - the table name followed
        # by "PK" - and your sequences ${TABLE}_seq. If not, modify this, or reset the
        # sequences manually later.
        echo "SELECT SETVAL('${TABLE}_seq', (SELECT MAX(${TABLE}PK) FROM ${TABLE}));" >> sequences.txt
        echo "" >> sequences.txt

        echo "VACUUM ${TABLE};" >> vacuum.txt
        echo "ANALYZE ${TABLE};" >> analyze.txt
    done
}

createScripts

# load the data in a single transaction, stopping at the first error
/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=on -a --username postgres --dbname ${DBNAME} --single-transaction --file "${OUTPUTDIR}/import.txt"
if [ $? -ne 0 ]; then
    exit 1
fi

/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=on -a --username postgres --dbname ${DBNAME} --file "${OUTPUTDIR}/vacuum.txt"
if [ $? -ne 0 ]; then
    exit 1
fi

/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=on -a --username postgres --dbname ${DBNAME} --file "${OUTPUTDIR}/analyze.txt"
if [ $? -ne 0 ]; then
    exit 1
fi

# sequence updates run last; ON_ERROR_STOP is off so one bad sequence name won't abort the rest
/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=off -a --username postgres --dbname ${DBNAME} --file "${OUTPUTDIR}/sequences.txt"

echo "Done."
echo "Now add indexes, foreign keys, and constraints if you haven't yet."

{{/code}}
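
For example, assuming the import script above is saved as pg_import.sh and the export from the previous section was written to /tmp/fb_backup:

{{code}}

/Library/PostgreSQL8/bin/createdb --username postgres MyDatabase
./pg_import.sh MyDatabase /tmp/fb_backup

{{/code}}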

----

**Migration from MySQL**

In most cases the datatypes are exactly the same. Here are the FrontBase equivalents for the common MySQL datatypes that differ:

TEXT -> CLOB

DATETIME -> TIMESTAMP

INT -> INTEGER

MEDIUMBLOB -> BLOB
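
As an illustration (the table and its columns are hypothetical), here is a MySQL table definition alongside its FrontBase equivalent using the mappings above:

{{code value="sql"}}

-- MySQL (hypothetical table)
CREATE TABLE article (
    id INT NOT NULL,
    body TEXT,
    published_at DATETIME,
    attachment MEDIUMBLOB,
    PRIMARY KEY (id)
);

-- FrontBase equivalent
CREATE TABLE article (
    id INTEGER NOT NULL,
    body CLOB,
    published_at TIMESTAMP,
    attachment BLOB,
    PRIMARY KEY (id)
);

{{/code}}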