FrontBase is now available for free. Please see the FrontBase website http://www.frontbase.com for details.
A first experience with FrontBase, WO 5.3.1 and Mac OS X 10.4.6
Make sure you have the FrontBase JDBC connector accessible from your WebObjects applications. I placed mine in /Library/Java/Extensions/
Make sure that the FrontBasePlugIn.framework is located in /Library/Java/Frameworks
Make sure that you have added the framework to your project, or you will not be able to establish a connection to your database, even though you will still be able to see it from EOModeler.
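For reference, a minimal sketch of those install steps (the connector jar name and the layout of the FrontBase download are assumptions; adjust the source paths to match your distribution):

# copy the JDBC connector and the plugin framework into place
# (frontbasejdbc.jar is the usual name of the connector jar; verify against your download)
sudo cp frontbasejdbc.jar /Library/Java/Extensions/
sudo cp -R FrontBasePlugIn.framework /Library/Java/Frameworks/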
Even if you haven't yet assigned a user/password combination to your database, you must still include user information for the connection to be established; use _system to get started. The URL for a basic connection is: jdbc:FrontBase://localhost/my_database_name
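As a quick sanity check (a sketch; my_database_name is a placeholder), you can confirm the user information works from FrontBase's sql92 tool before involving WebObjects at all:

# if CONNECT succeeds here, the same database name and user
# should work in the JDBC URL above
/Library/FrontBase/bin/sql92 <<EOF
CONNECT TO my_database_name USER _system;
DISCONNECT CURRENT;
EOF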
The documentation available at the FrontBase site is very complete. The only gotcha was that it left me with the impression the framework was only necessary for WebObjects 4.5 and below. This is not the case: you have to install and include it for WO 5 projects too.
Useful SQL commands in FrontBase
// writes an ascii backup of your database
write all output(dir='/path/to/export/dir', type = 'FrontBase', content=true);

// creates a database from the exported files that result from the previous command
SCRIPT <path-to-export-directory>/schema.sql;

// to rename a table
alter table name Item to ItemArchive;
If you don't use type='FrontBase', you end up losing precision on your timestamp columns, which can really suck.
Delimited Export From FrontBase
Out of the box, FrontBase doesn't provide a pure delimited file export. But it comes close, and with a little effort you can have a delimited export that you can import into a different database. This script will create a tab-delimited backup of your database.
#! /bin/bash
# The name of the database to backup must be the first argument to the script
# The backup output directory (ABSOLUTE PATH!) must be the second parameter

if [ "$#" -lt "2" ]; then
    echo "Usage:"
    echo "$0 <DATABASE_NAME> <BACKUP_OUTPUT_DIRECTORY_ABSOLUTE_PATH>"
    exit 1
fi

mkdir -p "$2"

echo "CONNECT TO $1 USER _system;" > backup_$1.sql
echo "WRITE ALL OUTPUT(DIR='$2', CSEP='\t', RSEP='\n', CONTENT=TRUE);" >> backup_$1.sql
echo "DISCONNECT CURRENT;" >> backup_$1.sql

# run the backup
/Library/FrontBase/bin/sql92 < backup_$1.sql
rm backup_$1.sql

# convert files to plain delimited format
for FILE in `ls $2/*_*`; do
    #echo "$FILE"

    # read table name (first line)
    TABLE=`head -n 1 "$FILE"`
    TABLE=`echo $TABLE | sed 's/_SYSTEM\.//g'`
    echo $TABLE
    if [[ $TABLE == BLACK_AND_WHITE_LIST* ]]; then
        rm -f "$FILE"
        continue
    fi
    if [[ $TABLE == INFORMATION_SCHEMA* ]]; then
        rm -f "$FILE"
        continue
    fi

    # read number of columns (second line)
    NUMCOLUMNS=`head -n 2 "$FILE" | tail -n 1`

    COLUMNS=""
    # read list of columns and assemble into one line (csv)
    for ((i=1; i <= $NUMCOLUMNS; i++)); do
        LINENUM=$((i + 2))
        LINE=`head -n $LINENUM "$FILE" | tail -n 1`
        COLUMN=`echo $LINE | cut -d\" -f 2`
        COLUMNS="${COLUMNS},${COLUMN}"
    done

    # remove leading comma from column list
    COLUMNS=`sed 's/^.//' <<<$COLUMNS`
    # replace commas with tabs in the header
    COLUMNS=`sed $'s/,/\t/g' <<<$COLUMNS`

    OUTFILE="$2/${TABLE}.tab"

    # write header with column names to the file
    echo "$COLUMNS" > "$OUTFILE"

    # delete column list from file and escape double quotes.
    # if you are manually escaping text, then remove the part: | sed 's/"/""/g'
    DELETELINES=$(($NUMCOLUMNS + 2))
    sed "1,${DELETELINES}d" $FILE | sed 's/"/""/g' >> "$OUTFILE"
    rm -f "$FILE"
done

echo ""
echo "The delimited backup files have been written to $2"
echo ""
There are a few caveats with this backup. One is that the exported files can contain stray null (\0) characters, which other databases may reject on import; you can strip them from an affected table file like this:

tr -d '\0' < MyTable.tab > MyTableStripped.tab
mv MyTableStripped.tab MyTable.tab
Migration to PostgreSQL
Using the delimited export above, you can migrate a FrontBase database to PostgreSQL. Here is an import script that does that:
#! /bin/bash
# NOTE: you will be prompted for your DB password, perhaps interrupting a very long process
# that would otherwise be automated; you can avoid this by setting the auth method to trust
# in pg_hba.conf or creating a .pgpass file in your home folder.
#
# The database must exist already. If it doesn't, create it by running: createdb "MyDatabase"
#
# This will only import the data, not the schema. The schema must already be defined.
# However, for better speed you may want to wait to define the indexes, foreign keys,
# and constraints until after the import is complete.
#
# If an error occurs in the import phase all changes will be rolled back; the DB will have
# no data and you can fiddle with things and try again.

if [ $# -lt "2" ]; then
    echo "Usage: $0 <database_name> <path_to_directory_with_tab_files>"
    exit 1
fi

DBNAME="$1"
OUTPUTDIR="$2"
cd "$OUTPUTDIR"

function createScripts {
    rm -f import.txt
    rm -f sequences.txt
    rm -f vacuum.txt
    rm -f analyze.txt

    echo "SET CONSTRAINTS ALL DEFERRED;" >> import.txt
    echo "" >> import.txt

    for FILE in `ls *.tab`; do
        TABLE=`sed 's/\.tab$//' <<<$FILE`
        if [[ $TABLE == INFORMATION_SCHEMA* ]]; then
            continue
        fi
        echo "Preparing sql for $TABLE"

        COLUMNS=`head -n 1 $FILE`
        # replace spaces or tabs with commas in the header
        COLUMNS=`sed 's/[[:space:]]/,/g' <<<$COLUMNS`

        echo "DELETE FROM ${TABLE};" >> import.txt
        echo "COPY ${TABLE}(${COLUMNS}) FROM '${OUTPUTDIR}/${FILE}' WITH DELIMITER E'\t' CSV HEADER;" >> import.txt
        echo "" >> import.txt

        # this assumes your primary keys are named ${TABLE}PK - the table name followed by "pk".
        # if not, modify this, or reset the sequences manually later.
        echo "SELECT SETVAL('${TABLE}_seq', (SELECT MAX(${TABLE}PK) FROM ${TABLE}));" >> sequences.txt
        echo "" >> sequences.txt

        echo "VACUUM ${TABLE};" >> vacuum.txt
        echo "ANALYZE ${TABLE};" >> analyze.txt
    done
}

createScripts

/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=on -a --user postgres --dbname ${DBNAME} --single-transaction --file "${OUTPUTDIR}/import.txt"
if [ $? -ne 0 ]; then
    exit 1
fi

/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=on -a --user postgres --dbname ${DBNAME} --file "${OUTPUTDIR}/vacuum.txt"
if [ $? -ne 0 ]; then
    exit 1
fi

/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=on -a --user postgres --dbname ${DBNAME} --file "${OUTPUTDIR}/analyze.txt"
if [ $? -ne 0 ]; then
    exit 1
fi

/Library/PostgreSQL8/bin/psql --set ON_ERROR_STOP=off -a --user postgres --dbname ${DBNAME} --file "${OUTPUTDIR}/sequences.txt"

echo "Done."
echo "Now add indexes, foreign keys, and constraints if you haven't yet."
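Putting the two scripts together, a typical migration run might look like this (the script file names are placeholders for wherever you saved the export and import scripts above):

# export from FrontBase, create the target database, then import
./frontbase_backup.sh MyDatabase /tmp/mydb_export
createdb MyDatabase        # then define the schema, minus indexes and constraints
./postgres_import.sh MyDatabase /tmp/mydb_export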
Migration from MySQL
In most cases the datatypes are exactly the same. Here are the FrontBase equivalents for the common MySQL datatypes that differ (a translation sketch follows the list):
TEXT --> CLOB
DATETIME --> TIMESTAMP
INT --> INTEGER
MEDIUMBLOB --> BLOB
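If you have a MySQL schema dump, a rough sketch of that translation with sed (my_schema.sql is a hypothetical file name; review the output by hand, since the patterns can also match column names):

# [[:<:]] and [[:>:]] are word boundaries in the BSD sed that ships with Mac OS X
sed -e 's/[[:<:]]TEXT[[:>:]]/CLOB/g' \
    -e 's/[[:<:]]DATETIME[[:>:]]/TIMESTAMP/g' \
    -e 's/[[:<:]]MEDIUMBLOB[[:>:]]/BLOB/g' \
    -e 's/[[:<:]]INT[[:>:]]/INTEGER/g' \
    my_schema.sql > my_schema_frontbase.sql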