In my previous post, I discussed some approaches to configuring PostgreSQL databases and accessing the data in them with ArcSDE 9.3. For this post, I will describe some of my ongoing experiences with getting data into ArcSDE 9.3.
I am investigating two main ways of loading spatial data into, or making it available through, ArcSDE 9.3 for PostgreSQL. The first is the traditional method of importing a feature class via ArcCatalog. (You can also create data in this manner but I haven’t played with that yet.) The second is to register an existing PostGIS table as a layer using the “sdelayer -o register” command-line tool. I will discuss this latter option first.
My first experience in trying to register an existing layer met with abject failure. I was greeted with an error message stating that I had to be the owner of the table in order to register it. I was already aware of this fact via the online documentation and I was, indeed, the owner of the table.
I hacked at this numerous times, poring over PostgreSQL and ArcSDE log files to no avail. I did find some cryptic messages referring to tables in a “postgres” schema (which I didn’t have), but generally found nothing helpful.
Next I turned to ArcCatalog and decided to import a shapefile and see how well that worked. I set up an ArcSDE connection as depicted below.
I was able to connect just fine but my first attempt to import a shapefile failed with a message saying it could not create a feature class named “arcsde_test1.postgres.fema_regions”. At that point, I realized that ArcCatalog was assuming that I was importing the data into the schema of the user I was connected as. The “postgres” user is the equivalent of SQL Server’s “sa”. PostgreSQL doesn’t create a schema for it automatically and I don’t typically create one for it in my databases. In this case, because I was testing, I did (I’m not sure whether that’s a good practice or not). I did so using the following SQL in pgAdmin III:
CREATE SCHEMA AUTHORIZATION postgres;
This creates a schema for the user that will own the objects in the schema. Once this was in place, I was able to import my shapefile with no issues. I did make sure to use the PG_GEOMETRY configuration keyword to ensure that the PostGIS data type was used.
This got me wondering about my earlier registration problem. I went back and used shp2pgsql to import a shapefile into the newly created postgres schema and then attempted to use the sdelayer utility to register it with ArcSDE. This time it registered! What I learned is that, in addition to being the owner of the table, the table must be in the schema of the user that you are using to register the table. In my case, it wasn’t enough that the “postgres” user already owned the table. The table also had to be in the postgres user schema in my database.
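For reference, the sequence looks roughly like this. The database, table, SRID, and password values here are illustrative, not the exact ones I used, and you should adjust the shp2pgsql flags (such as -g for the geometry column name) to suit your data:

```shell
# Load the shapefile into the postgres user's own schema so that both the
# table owner and the containing schema match the registering login.
shp2pgsql -s 4326 -g shape fema_regions.shp postgres.fema_regions | \
    psql -d arcsde_test1 -U postgres

# Register the resulting table as an ArcSDE layer. "shape" is the geometry
# column and "gid" the integer key column created by shp2pgsql.
sdelayer -o register -l fema_regions,shape -e a+ -C gid,USER \
    -i sde:postgresql:localhost -D arcsde_test1 -u postgres -p XXXXX
```

The key point is that the schema in the qualified table name matches the -u login; registering a table in someone else’s schema fails with the “user must be the table owner” error even when the login owns the table.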
At this point I had one layer which had been loaded via ArcCatalog and using PostGIS geometries and I had another copy of the same data set loaded via shp2pgsql and registered via “sdelayer -o register”. It was time to check the behavior of each. Because I was using PostGIS geometry, I wanted to test both layers in ArcMap and in uDig.
First, the layer registered with the sdelayer command-line tool. This layer did not initially draw in ArcMap after it was registered. With a little investigation in ArcCatalog, I realized the X and Y offsets, the scaling factor, and the XY cluster tolerance were not set. I will not go into an ArcSDE tuning discussion here but suffice it to say I was able to rectify this problem. I have used the “sdelayer -o register” approach in the past with Oracle Spatial and did not have to take this step, so I am chalking it up to a “beta-ism”. I will submit this as feedback to ESRI. The next two images show this data set, in ArcMap (top) and in uDig (bottom). Both applications are reading the same PostGIS data set, though bear in mind that the additional ArcSDE tuning will need to be maintained separately.
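I made the fix in ArcCatalog, but if memory serves the same offsets can be set from the command line with the “alter” operation of sdelayer. This is a sketch only; the -x values (false X origin, false Y origin, and system units) and the connection parameters are placeholders, and you should check the sdelayer reference for the exact flags in your release:

```shell
# Set the layer's false X/Y origin and coordinate precision so ArcMap can
# draw the registered layer (values shown are illustrative only).
sdelayer -o alter -l fema_regions,shape -x -180,-90,1000000 \
    -i sde:postgresql:localhost -D arcsde_test1 -u postgres -p XXXXX
```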
With this having been accomplished, it was now time to see what I could do with the data. Since I believe in jumping in with both feet, I went straight to editing. My test was pretty simple: with both ArcMap and uDig pointed at the layer, I did the following:
1. Add a new feature using ArcMap and save it (the big triangle in the Northwest).
2. Refresh the display in uDig to show the new feature.
3. Modify the feature in uDig and commit the change.
4. Refresh the display in ArcMap to show the update.
As can be seen from the series of screen captures, this test worked flawlessly. I have also been able to render the data in both desktops and perform analysis on it. It is important to remember that this test only addresses PostGIS layers that have been registered with ArcSDE via the “sdelayer -o register” command line tool. In my next post I will dig a little deeper with the layer I loaded via ArcCatalog.
Great stuff! In spite of going through command references, ESRI forums, etc., this blog was the key to getting my PostgreSQL spatial table registered with SDE.
However, like Ryan, I am also not able to get the map displayed. I am not able to see the rows when I switch to Table view in ArcCatalog. I am guessing it’s some DB access issue or something. Any pointers to log/debug files would be really helpful as well.
thanks
-Shrini
Hi Bill, I found your postgres/sde articles a couple days ago and they have been a tremendous help. One problem I am running into is getting the data to display in ArcCatalog or ArcMap. I was able to register the layer successfully but it will not display. I noticed you said that you had to add the XY offset, tolerance, etc., but when I view the properties for this feature class those parameters are set.
Any ideas what I could be missing?
Thanks,
-Ryan
Thanks for the answer. I have another question: when I create a layer in PostGIS and register it with “sdelayer -o register”, why can’t ArcMap show the layer completely when I do “zoom to this layer”? When I connect to the same layer directly through PostGIS, it displays very well. Thanks for your answer.
Juan,
That error code indicates that your new record violates a check constraint. I would suggest examining the check constraints defined for your layer and verifying your data against them.
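One way to list them is to query the PostgreSQL catalog from psql. The database and table names below are placeholders for your own:

```shell
# Show every check constraint defined on the table, with its definition.
psql -d arcsde_test1 -c "
  SELECT conname, pg_get_constraintdef(oid)
  FROM   pg_constraint
  WHERE  conrelid = 'public.mytable'::regclass
  AND    contype  = 'c';"
```

SQL state 23514 is PostgreSQL’s check_violation code, so one of the constraints this query returns is the one your new feature is tripping.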
Bill
Hi Bill: I tried to do everything you specify here, but when I add a new feature using ArcMap and save it, I get this error: Unable to save edits. Underlying DBMS error[ERROR: el nuevo registro para la relaci::SQL state:23514]. Please give me an answer to this error. Thanks. Greetings, Juan
PG_GEOMETRY causes ArcSDE to use the PostGIS spatial type, whereas ST_GEOMETRY is ESRI’s spatial type. I tend to use PG_GEOMETRY because it increases my interoperability options with non-ESRI platforms. It also gives me the option of doing some of my analysis in psql outside of the ESRI stack if I need to.
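As a small illustration of that interoperability, a PG_GEOMETRY column is a native PostGIS geometry, so standard PostGIS functions work on it directly in psql. The database, schema, table, and column names here are illustrative:

```shell
# Rank the five largest features by area, entirely outside the ESRI stack.
psql -d arcsde_test1 -c "
  SELECT gid, ST_Area(shape) AS area
  FROM   postgres.fema_regions
  ORDER  BY area DESC
  LIMIT  5;"
```

With ST_GEOMETRY, by contrast, the data is only readable through ESRI’s own type and tools.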
Thanks, interesting information. I have managed to create my own PostgreSQL/ArcSDE instance and load the data using ArcCatalog.
I have a question: what’s the difference between PG_GEOMETRY and ST_GEOMETRY? They serve the same purpose, but what are the considerations in choosing between them?
thanks
yocki@hotmail.com
Khavin,
did you amend the pg_hba.conf so remote machines can directly access your postgres instance?
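For anyone in the same spot, the relevant pg_hba.conf record looks something like the following (the subnet and auth method are illustrative; reload PostgreSQL after editing):

```
# Allow hosts on the local subnet to reach all databases with password auth.
host    all    all    192.168.1.0/24    md5
```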
Kate
To my knowledge, you can’t…
http://duncanjg.wordpress.com/2008/07/05/raster-data-in-postgis-2/
Bill
I am new to open-source GIS software. How does one import raster images into a PostgreSQL/PostGIS database using a front end like Quantum GIS, uDig, or gvSIG?
I have configured the database and spatially enabled it with PostGIS. I have already replicated the PostGIS template and loaded vector data via Quantum GIS. I have no issues here. I can connect via uDig and Quantum GIS and extract data themes.
Any comments would be appreciated.
Ethan,
Is your data set in a user schema that is owned by login that you are using to connect?
I’m having some trouble getting a direct connect to work. I can connect fine using port 5151, but when I switch to direct connect I get “Failed to connect…Bad Login User”. I’ve tried using the sde account and the postgres account. My connection service string is “sde:postgresql:test-sql”. If I change this at all I get “Failed to connect…Invalid parameter value passed to function”.
The documentation I’ve found isn’t particularly helpful: http://webhelp.esri.com/arcgisdesktop/9.3/index.cfm?TopicName=Setting_up_a_direct_connection_to_PostgreSQL
Thanks!
Glad to help!
Bingo. Thanks!
I did the same thing at first. My layer was originally in the public schema and it didn’t work. It needs to be in the sde schema (since you’re using that login).
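Assuming the table in question, moving it out of public is a one-liner in psql (the database and table names below are taken from the command above; adjust for your own setup):

```shell
# Move the table into the sde user's schema so the registering login owns
# both the table and its containing schema.
psql -d gisdb -c "ALTER TABLE public.cocorahs SET SCHEMA sde;"
```

After the move, re-run the register command with the sde-qualified table name.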
Interesting. I am using:
sdelayer -o register -l gisdb.public.cocorahs,gisdb.public.coco_geo -e p -u sde -C id,USER -D gisdb
cocorahs is my table and the PostGIS geometry column is coco_geo. Both are owned by sde.
The error I am getting is:
Error: The user must be the table owner (-96).
Error: Cannot Create Layer.
Sure, here it is:
sdelayer -o register -l milbases,shape -e a+ -C gid,USER -i sde:postgresql:localhost -D arcsde_test1 -u postgres -p XXXXX
My table is in the user schema for the postgres user in the arcsde_test1 database and postgres also owns the table. The table was loaded with shp2pgsql which is the same way I load many data sets for use in GeoServer and SharpMap.
I would be interested in seeing more information on getting the sdelayer command to register the geometry column from an existing PostGIS database. I keep running into the same table owner issue even though sde owns the table. Can you post an example of the sdelayer command you issued.
Julian,
Glad to help. My understanding is that you will have full geodatabase capability with PostgreSQL. I have successfully created a raster catalog in ArcSDE 9.3 for PostgreSQL. I’ll show that in an upcoming post.
Bill
Thanks for this, Bill.
I might be jumping the gun here, but, will image catalogs be able to be added into ArcSDE 9.3 using PostgreSQL?
Bill, I’ve been beating my head against the same “sdelayer -o register” problem. You saved me another night of frustration! Thanks!
Still can’t get SDE to load into any of my other PostGIS databases to test direct connect…
Thanks, Paul. I’m pulling together notes for Part 3 now. It’s nice to finally get my hands dirty with it.
Great stuff, Bill, keep it coming!