In this post I am going to explain, at a high level, how you can develop your own Facebook integration. Before reading this post, you will want to read the post on what the Facebook integration provides, so you understand the goal of this integration. This post is intended for developers.

First, you need to get in the mindset of thinking about one fan page at a time. If you want to expand to multiple pages, just loop over the logic. Also, the process will run several times a day, ideally once every hour, sending data each time it runs.


Client Library: http://wiki.developers.facebook.com/index.php/Category:Client_Libraries

FQL Wiki: http://wiki.developers.facebook.com/index.php/FQL

Part 1: Fans

To get the fan count you will want to use Facebook's FQL, running a query like this: "SELECT fan_count FROM page WHERE page_id=$pageId"
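The fan-count lookup can be sketched in Python (the original integration appears to be PHP, so everything here is illustrative: the endpoint is Facebook's old `fql.query` REST method, and the function name and `api_key` handling are assumptions; real calls also needed a request signature, which the client libraries handled):

```python
import urllib.parse

# Facebook's old REST endpoint for running FQL queries.
FQL_ENDPOINT = "https://api.facebook.com/method/fql.query"

def fan_count_request_url(page_id, api_key, fmt="json"):
    """Build the request URL for the fan-count FQL query.

    Hypothetical helper: a production call would also carry a
    signature/session parameter per the client library docs.
    """
    query = "SELECT fan_count FROM page WHERE page_id=%s" % page_id
    params = {"query": query, "api_key": api_key, "format": fmt}
    return FQL_ENDPOINT + "?" + urllib.parse.urlencode(params)
```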

Once you have the fan count, save it to a file (or DB) and upload it ONCE a day into a numeric event with the Omniture XML Data Insertion API.

To get the fan changes, compare the fan count saved by the last run of the process to the count from the current execution. The difference gives you the number of new fans since the last run. Once you have the fan changes, send them into a numeric event.
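The save-and-compare step above can be sketched as follows, assuming a local JSON file as the saved state (the file name and function are hypothetical):

```python
import json
import os

def fan_delta(current_count, state_file="fan_count.json"):
    """Compare the current fan count with the last saved one and
    return the number of new fans since the previous run.

    Also overwrites the state file with the current count so the
    next run has something to compare against.
    """
    previous = None
    if os.path.exists(state_file):
        with open(state_file) as f:
            previous = json.load(f).get("fan_count")
    with open(state_file, "w") as f:
        json.dump({"fan_count": current_count}, f)
    if previous is None:
        return 0  # first run: nothing to compare against yet
    return current_count - previous
```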

Part 2: Posts, Comments, Likes

This is where things get tricky. You can't pull the total number of comments or likes from the API, because the Facebook table that holds the data stores it by post_id. But we are lucky enough to have a created_time for each post_id.

From that you will use a query similar to this:

$query = 'SELECT post_id, updated_time, created_time, comments.count, likes.count ' .
    'FROM stream WHERE source_id = ' . $pageId .
    ' AND created_time >= ' . $created_time_start .
    ' AND created_time <= ' . $created_time_end . ' LIMIT 50';

You will notice that the above query limits based on the created time start and created time end. Basically, we can't look at all posts throughout history, so we need a limit. After a lot of testing we found that looking back 7 days was fairly safe. Yes, we will lose some likes and comments, but those are far out on the long tail.
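One way to generate those created-time bounds is to split the 7-day lookback into unix-timestamp windows, one per query. A Python sketch (the one-day chunk size is an assumption, in the spirit of the post's advice to break queries into chunks):

```python
import time

ONE_DAY = 24 * 60 * 60
SEVEN_DAYS = 7 * ONE_DAY

def seven_day_windows(now=None, chunk=ONE_DAY):
    """Yield (start, end) unix-timestamp pairs covering the last 7 days,
    one chunk per FQL query so no single result set gets too large."""
    if now is None:
        now = int(time.time())
    start = now - SEVEN_DAYS
    while start < now:
        end = min(start + chunk, now)
        yield (start, end)
        start = end
```

Each (start, end) pair would be substituted for $created_time_start and $created_time_end in the query above.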

The results from the query are keyed by post_id, and that is how you want to store them. Iterate through each post from the past 7 days, comparing the number of comments and likes against the data saved from the last run of the process. Again, find the difference in comments and likes (the same way you did with the fans), then upload the difference into incrementer events. Finally, posts are a little simpler: if a post_id does not exist in the last saved data, it is a new post, so you increment the post counter and upload it into another numeric event.
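The per-post diffing described above can be sketched as follows (the dictionary shapes are assumptions, standing in for whatever file or DB storage you use):

```python
def diff_posts(previous, current):
    """Compare stored per-post counts with the current run.

    `previous` and `current` map post_id -> {"comments": n, "likes": n}.
    Returns the increments to send to the numeric/incrementer events.
    """
    new_comments = new_likes = new_posts = 0
    for post_id, counts in current.items():
        old = previous.get(post_id)
        if old is None:
            # Post not seen before: it is new, and all of its
            # comments and likes are new as well.
            new_posts += 1
            new_comments += counts["comments"]
            new_likes += counts["likes"]
        else:
            new_comments += counts["comments"] - old["comments"]
            new_likes += counts["likes"] - old["likes"]
    return {"comments": new_comments, "likes": new_likes, "posts": new_posts}
```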

I know this is high level, but the logic is fairly complex; even if I tried to cover every detail I can think of, you would still need to work out a lot of the details yourself.

Things to watch out for:

  • You can't pull huge sets of data from the FB API, so split it up into chunks as in the query above. In the end you should be executing several FQL queries for each run of the process on a single fan page.
  • Facebook limits you to 100 requests every 600 seconds (10 minutes), so make sure you catch the exception that is thrown when you exceed the limit.
  • Don't try to run the process only once a day; it is too unreliable.
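A client-side guard for the 100-requests-per-600-seconds limit might look like the sketch below (hypothetical; you should still catch the API's rate-limit exception as a backstop, since other processes may share the quota):

```python
import time

class RateLimiter:
    """Throttle calls to stay under a max_calls-per-window limit,
    e.g. Facebook's 100 requests per 600 seconds."""

    def __init__(self, max_calls=100, window=600):
        self.max_calls = max_calls
        self.window = window
        self.calls = []  # timestamps of calls made within the window

    def wait(self, now=None, sleep=time.sleep):
        """Block (via sleep) if another call would exceed the limit,
        then record the call."""
        now = time.time() if now is None else now
        # Drop timestamps that have aged out of the window.
        self.calls = [t for t in self.calls if now - t < self.window]
        if len(self.calls) >= self.max_calls:
            # Sleep until the oldest call falls out of the window.
            sleep(self.window - (now - self.calls[0]))
        self.calls.append(now)
```

Calling `limiter.wait()` before each FQL request keeps the loop under the limit without hand-tuning delays.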


I mentioned in my last post that you can contact your Account Manager and get consulting hours with Engineering Services. If you purchase as little as 10 hours, I am confident you will save a significant number of hours of development time, not to mention the time you will save in the long term once you have a robust process in place.

As always, post your comments or e-mail me at pearcea (at) adobe.com. It is your comments and e-mails that keep me posting and give me ideas for future posts.


Thanks for the Facebook integration development secrets. I followed your steps completely and discovered that it was impossible to pull huge sets of data from the FB API. You were correct. It was much easier to split it up into chunks. I might think about the consulting hours with Engineering Services as well. It sounds like it might be a great investment.


Hey, do you think I could see how you did it in code? I would really appreciate it.

Rudi Shumpert

Pearce, Good stuff here! I'll have to give this a try soon. -Rudi