Tuesday, April 5, 2011

Python: DISTINCT on GQuery result set (GQL, GAE)

Imagine you have an entity in the Google App Engine datastore that stores links for anonymous users. You would like to perform the following SQL query, which is not supported:

SELECT DISTINCT user_hash FROM links

Instead you could use:

user = db.GqlQuery("SELECT user_hash FROM links")

How can Python be used most efficiently to filter the result so that it returns a DISTINCT result set? And how can the DISTINCT result set be counted?

From Stack Overflow:
  • One option would be to put the results into a set object:

    http://www.python.org/doc/2.6/library/sets.html#sets.Set

    The resulting set will consist only of the distinct values passed into it; a sketch applying this to the GQL result appears after the answers below.

    Failing that, building up a new list containing only the unique objects would work. Something like:

    unique_results = []
    for obj in user:
        if obj not in unique_results:
            unique_results.append(obj)
    

    That for loop can be condensed into a list comprehension as well.

  • A set is good way to deal with that:

    >>> a = ['google.com', 'livejournal.com', 'livejournal.com', 'google.com', 'stackoverflow.com']
    >>> b = set(a)
    >>> b
    set(['livejournal.com', 'google.com', 'stackoverflow.com'])
    >>>
    

    One suggestion with regard to the first answer: sets and dicts are better at retrieving unique results quickly, since membership testing in a list is O(n) versus O(1) for the other two types. So if you want to store additional data, or build something like the unique_results list mentioned above, it may be better to do something like:

    >>> unique_results = {}
    >>> for item in a:
    ...     unique_results[item] = ''
    ...
    >>> unique_results
    {'livejournal.com': '', 'google.com': '', 'stackoverflow.com': ''}
    
    Federico Elles : A set object is an unordered collection of distinct hashable objects. (...) New in version 2.4. http://www.python.org/doc/2.5.2/lib/types-set.html
    sudarkoff : A set is okay if the number of records is relatively small. But if you have gazillions of records in the datastore, it would be quite inefficient! A much better strategy would be to pre-calculate and store the result at insert/update time (sketched below).
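
Tying the answers back to the original question, here is a minimal sketch of the set-based approach (the Links model, its properties, and fetching whole entities rather than a single column are assumptions for illustration):

    from google.appengine.ext import db

    # Hypothetical model for the "links" kind used in the question.
    class Links(db.Model):
        user_hash = db.StringProperty()
        url = db.StringProperty()

    # Fetch the entities and collect user_hash values into a set,
    # which keeps only the distinct values.
    links = db.GqlQuery("SELECT * FROM Links")
    distinct_hashes = set(link.user_hash for link in links)

    # Counting the DISTINCT result set is then just len().
    print len(distinct_hashes)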
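
And a rough sketch of the pre-calculation strategy from the last comment, assuming a hypothetical UserHash kind whose key name is the user_hash itself, so each distinct value is written exactly once at insert/update time:

    from google.appengine.ext import db

    # Hypothetical kind: the key name is the user_hash, so writing
    # the same hash twice never creates a second entity.
    class UserHash(db.Model):
        pass

    def record_user_hash(user_hash):
        # Call this whenever a link is inserted or updated;
        # get_or_insert is idempotent, so duplicates are ignored.
        UserHash.get_or_insert(user_hash)

    # The DISTINCT values and their count then come straight from
    # UserHash, with no post-filtering of the links query in Python.
    distinct_count = UserHash.all().count()  # count() caps at 1000 on older SDKs

Keying the entity by the hash itself pushes the uniqueness check onto the datastore instead of Python, which is what makes it workable for large numbers of records.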
