demian85 / gnip

License: MIT License
Connect to the Gnip streaming API and manage rules

Programming Languages

javascript
184084 projects - #8 most used programming language

Projects that are alternatives of or similar to gnip

Sarcasm Detection
Detecting Sarcasm on Twitter using both traditional machine learning and deep learning techniques.
Stars: ✭ 73 (+160.71%)
Mutual labels:  twitter, tweets
congresstweets
Datasets of the daily Twitter output of Congress.
Stars: ✭ 76 (+171.43%)
Mutual labels:  twitter, tweets
Twitter Sentiment Analysis
This script can tell you the sentiments of people regarding any event happening in the world by analyzing tweets related to that event
Stars: ✭ 94 (+235.71%)
Mutual labels:  twitter, tweets
Guffer
Guffer tweets based on a daily schedule
Stars: ✭ 12 (-57.14%)
Mutual labels:  twitter, tweets
Dmarchiver
A tool to archive the direct messages, images and videos from your private conversations on Twitter
Stars: ✭ 204 (+628.57%)
Mutual labels:  twitter, tweets
Twweet Cli
🐦 Tweet right from your cli without even opening your browser.
Stars: ✭ 47 (+67.86%)
Mutual labels:  twitter, tweets
Twint
An advanced Twitter scraping & OSINT tool written in Python that doesn't use Twitter's API, allowing you to scrape a user's followers, following, Tweets and more while evading most API limitations.
Stars: ✭ 12,102 (+43121.43%)
Mutual labels:  twitter, tweets
Tweetscraper
TweetScraper is a simple crawler/spider for Twitter Search without using API
Stars: ✭ 694 (+2378.57%)
Mutual labels:  twitter, tweets
Scrape Twitter
🐦 Access Twitter data without an API key. [DEPRECATED]
Stars: ✭ 166 (+492.86%)
Mutual labels:  twitter, tweets
Laravel Twitter Streaming Api
Easily work with the Twitter Streaming API in a Laravel app
Stars: ✭ 153 (+446.43%)
Mutual labels:  twitter, tweets
Tweets
🐦 Tweet every 24 pull request
Stars: ✭ 8 (-71.43%)
Mutual labels:  twitter, tweets
twitterstream
Twitter Streaming API Example with Kafka Streams in Scala
Stars: ✭ 49 (+75%)
Mutual labels:  tweets, twitter-streaming-api
Twitter Post Fetcher
Fetch your twitter posts without using the new Twitter 1.1 API. Pure JavaScript! By Jason Mayes
Stars: ✭ 886 (+3064.29%)
Mutual labels:  twitter, tweets
Twitterldatopicmodeling
Uses topic modeling to identify context between follower relationships of Twitter users
Stars: ✭ 48 (+71.43%)
Mutual labels:  twitter, tweets
Twitter
Twitter API for Laravel 5.5+, 6.x, 7.x & 8.x
Stars: ✭ 755 (+2596.43%)
Mutual labels:  twitter, tweets
Tta Elastic
Official Trump Twitter Archive V2 source
Stars: ✭ 104 (+271.43%)
Mutual labels:  twitter, tweets
Yotter
Youtube and Twitter with privacy.
Stars: ✭ 376 (+1242.86%)
Mutual labels:  twitter, tweets
Linqtotwitter
LINQ Provider for the Twitter API (C# Twitter Library)
Stars: ✭ 401 (+1332.14%)
Mutual labels:  twitter, tweets
Real Time Sentiment Tracking On Twitter For Brand Improvement And Trend Recognition
A real-time interactive web app based on data pipelines using streaming Twitter data, automated sentiment analysis, and MySQL&PostgreSQL database (Deployed on Heroku)
Stars: ✭ 127 (+353.57%)
Mutual labels:  twitter, tweets
Twitterdelete
💀 Delete your old, unpopular tweets.
Stars: ✭ 231 (+725%)
Mutual labels:  twitter, tweets

NodeJS Gnip module

Connect to the Gnip streaming API and manage rules. You must have a Gnip account with a data source available, such as Twitter PowerTrack.

Currently, this module only supports the JSON activity stream format, so you must enable data normalization in your admin panel.

Gnip.Stream

This class is an EventEmitter and allows you to connect to the stream and start receiving data.

Constructor options

  • timeout As recommended in the Gnip docs (http://support.gnip.com/apis/powertrack/api_reference.html), this option sets a read timeout on the client. The recommended value is >= 30 seconds, so the constructor throws an error if a smaller timeout is provided. Defaults to 35 seconds.
  • backfillMinutes Number of minutes to backfill after connecting to the stream. Optional. Value should be 0 - 5.
  • partition Partition of the Firehose stream you want to connect to. Only required for Firehose streams.
  • parser Parser library for incoming JSON data. Optional; defaults to the excellent json-bigint library. See the construction sketch after this list.
    Matching tag IDs are sent to us as big integers, which can't be reliably parsed by the native JSON library in Node.js. More information on this issue can be found on Stack Overflow.
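
As an illustrative sketch of these options (endpoint and credentials are placeholders, the timeout is assumed to be in milliseconds, and json-bigint is passed explicitly only to show the option, since it is already the default):

const Gnip = require('gnip');
const JSONbig = require('json-bigint');

const stream = new Gnip.Stream({
  url: 'https://gnip-stream.twitter.com/stream/powertrack/accounts/xxx/publishers/twitter/prod.json',
  user: 'xxx',
  password: 'xxx',
  timeout: 35000,       // read timeout (assumed milliseconds); must be >= 30 seconds
  backfillMinutes: 2,   // optional, 0 - 5
  parser: JSONbig       // optional, json-bigint is already the default
});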

API methods

  • start() Connect to the stream and start receiving data. At this point you should have registered at least one event listener for any of these events: 'data', 'object' or 'tweet'.

  • end() Terminates the connection.

Events

  • ready Emitted when the connection has been successfully established
  • data Emitted for each data chunk (decompressed)
  • error Emitted when any type of error occurs. An error is raised if the response status code is not 20x; response bodies containing an {error: String} object are also reported here.
  • object Emitted for each JSON object.
  • tweet Emitted for each tweet.
  • delete Emitted for each deleted tweet.
  • end Emitted when the connection is terminated. This event is always emitted when an error occurs and the connection is closed.
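
For illustration, given a stream instance like the one above, handlers for the less common events might look like this (the logging is arbitrary and the payload shape depends on your normalization settings):

stream.on('object', function(obj) {
  // every JSON activity, including non-tweet system messages
  console.log('activity:', obj);
});
stream.on('delete', function(del) {
  // payload describing the deleted tweet
  console.log('deleted:', del);
});
stream.on('end', function() {
  // also fired after an 'error' closes the connection
  console.log('stream closed');
});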

Gnip.Rules

This class allows you to manage an unlimited number of tracking rules.

Constructor options

  • user GNIP account username.
  • password GNIP account password.
  • url GNIP Rules endpoint url e.g. https://gnip-api.twitter.com/rules/${streamType}/accounts/${account}/publishers/twitter/${label}.json
  • batchSize The batch size used when adding/deleting rules in bulk. (Defaults to 5000)
  • parser A custom parser handler/library for incoming JSON data, just like the parser option of the Gnip.Stream constructor. Optional; defaults to the json-bigint library (see the Gnip.Stream parser option above for details).
  • cacheFile Internally, Gnip.Rules uses a file to cache the current state of the rules configuration; the default path is inside the package directory. This optional setting lets you change that path, since the default location can cause problems in applications where node_modules lives on a read-only filesystem (e.g. AWS Lambda). See the sketch after this list.
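
For example, a sketch of a Rules instance tuned for a read-only deployment (credentials are placeholders and the /tmp path is just one possible writable location):

const Gnip = require('gnip');

const rules = new Gnip.Rules({
  url: 'https://gnip-api.twitter.com/rules/powertrack/accounts/xxx/publishers/twitter/prod.json',
  user: 'xxx',
  password: 'xxx',
  batchSize: 5000,                  // default batch size
  cacheFile: '/tmp/gnip-rules.json' // writable path, e.g. on AWS Lambda
});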

API methods

  • getAll(callback) Get cached rules.

  • update(rules: Array, callback) Creates or replaces the live tracking rules.
    Rules are sent in batches of options.batchSize, so you can pass an unlimited number of rules.
    The current tracking rules are stored in a local JSON file so you can update the existing rules efficiently without having to remove them all. The callback receives an object as the 2nd argument containing the number of added and deleted rules (see the sketch below).

  • clearCache(callback) Clears cached rules.
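
A quick sketch of the cached workflow, assuming a rules instance constructed as above (the rule values are placeholders):

rules.getAll(function(err, cachedRules) {
  if (err) throw err;
  console.log('cached rules:', cachedRules);
});

rules.update(['#hashtag', {value: 'keyword', tag: 'example'}], function(err, result) {
  if (err) throw err;
  // result contains the number of added and deleted rules
  console.log(result);
});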

The following methods use the Gnip API directly and ignore the local cache. Avoid them if you are working with a very large number of rules!

  • live.update(rules: Array, callback)
  • live.add(rules: Array, callback)
  • live.remove(rules: Array, callback)
  • live.getAll(callback)
  • live.getByIds(ids: Array, callback)
  • live.removeAll(callback)
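
An illustrative sketch of the live methods, again assuming a rules instance as above (each call goes straight to the Gnip API, so use sparingly with large rule sets):

rules.live.getAll(function(err, liveRules) {
  if (err) throw err;
  console.log('rules currently active on Gnip:', liveRules);
});

rules.live.remove([{value: 'keyword'}], function(err) {
  if (err) throw err;
  console.log('rule removed');
});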

Gnip.Search

This class is an EventEmitter and allows you to connect to either the 30 day or full archive search API and start receiving data.

Constructor options

  • user GNIP account username.
  • password GNIP account password.
  • url GNIP Search endpoint url e.g. https://gnip-api.twitter.com/search/30day/accounts/{ACCOUNT_NAME}/{LABEL}.json
  • query Rule to match tweets.
  • fromDate The oldest date from which tweets will be gathered. Date given in the format 'YYYYMMDDHHMM'. Optional.
  • toDate The most recent date to which tweets will be gathered. Date given in the format 'YYYYMMDDHHMM'. Optional.
  • maxResults The maximum number of search results to be returned by a request. A number between 10 and 500. Optional.
  • tag Used to segregate rules and their matching data into different logical groups. Optional.
  • bucket The unit of time for which count data will be provided. Options: "day", "hour", "minute". Optional, for /counts calls.
  • rateLimiter A limiter object, used to control the rate of collection. Optional. If unspecified, a rate limit of 30 requests a minute will be shared between Search streams. If you have a non-standard rate limit, you should pass this parameter.
const RateLimiter = require('limiter').RateLimiter;
// Allow 60 requests per minute
const limiter = new RateLimiter(60, 'minute');
const stream = new Gnip.Search({
  rateLimiter: limiter,
  ...
});

API methods

  • start() Start receiving data. At this point you should have registered at least one event listener for 'object' or 'tweet'.

  • end() Terminates the connection.

Events

  • ready Emitted when tweets have started to be collected.
  • error Emitted when a recoverable (non-fatal) error occurs.
  • object Emitted for each JSON object.
  • tweet Emitted for each tweet.
  • end Emitted when the connection is terminated. If the stream has ended due to a fatal error, the error object will be passed.

Gnip.Usage

This class allows you to track activity consumption across Gnip products.

Constructor options

const usage = new Gnip.Usage({
	url : 'https://gnip-api.twitter.com/metrics/usage/accounts/{ACCOUNT_NAME}.json',
	user : 'xxx',
	password : 'xxx'
});

API methods

usage.get({ bucket: 'day', fromDate: '201612010000', toDate: '201612100000' }, function(err, body) {
	...
});

Installation

npm install gnip

Example Usage

const Gnip = require('gnip');

const stream = new Gnip.Stream({
  url : 'https://gnip-stream.twitter.com/stream/powertrack/accounts/xxx/publishers/twitter/prod.json',
  user : 'xxx',
  password : 'xxx',
  backfillMinutes: 5 // optional
});
stream.on('ready', function() {
  console.log('Stream ready!');
});
stream.on('tweet', function(tweet) {
  console.log(tweet);
});
stream.on('error', function(err) {
  console.error(err);
});

const rules = new Gnip.Rules({
  url : 'https://gnip-api.twitter.com/rules/powertrack/accounts/xxx/publishers/twitter/prod.json',
  user : 'xxx',
  password : 'xxx',
  batchSize: 1234 // not required, defaults to 5000
});

const newRules = [
  '#hashtag', 
  'keyword', 
  '@user',
  {value: 'keyword as object'},
  {value: '@demianr85', tag: 'rule tag'}
];

rules.update(newRules, function(err) {
  if (err) throw err;
  stream.start();
});

const search = new Gnip.Search({
  url : 'https://gnip-api.twitter.com/search/30day/accounts/xxx/xxx.json',
  user : 'xxx',
  password : 'xxx',
  query : '@user'
});

search.on('tweet', function(tweet) {
  console.log(tweet);
});

search.on('error', function(err) {
  console.error(err);
});

search.on('end', function(err) {
  if (err) console.error(err);
});

// search counts usage
const counts = new Gnip.Search({
  url : 'https://gnip-api.twitter.com/search/30day/accounts/xxx/xxx/counts.json',
  user : 'xxx',
  password : 'xxx',
  query : '@user',
  bucket: 'day'
});

counts.on('object', function(object) {
  console.log(object.results);
  counts.end();
});

More details and tests soon...
