Is your feature request related to a problem? Please describe.
There is a limit on the number of load jobs per day (per table and per project limit of 1,000 load jobs). When a pandas DataFrame is used as part of a data processing pipeline, this limit can easily be reached. See: googleapis/python-bigquery-pandas#238
Describe the solution you'd like
I'd like to see an `insert_rows_from_dataframe` method on the client to complement the existing `insert_rows` method. It would transform the (column-oriented) DataFrame into (row-oriented) JSON and upload the rows via calls to `tabledata.insertAll` (in batches), which is subject to streaming quotas rather than the load-job limit.
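A minimal sketch of what such a method could look like, assuming the client exposes a JSON-row insert call that wraps the streaming API (the function name, the `chunk_size` default, and the `insert_rows_json` helper used here are illustrative assumptions, not a proposal for the exact API):

```python
import pandas as pd


def insert_rows_from_dataframe(client, table, dataframe, chunk_size=500):
    """Stream a pandas DataFrame into a BigQuery table in batches.

    Converts the column-oriented DataFrame into row-oriented dicts and
    uploads them in chunks, collecting the per-chunk error lists that
    the streaming insert returns.
    """
    all_errors = []
    # to_dict(orient="records") yields one JSON-serializable dict per row.
    rows = dataframe.to_dict(orient="records")
    for start in range(0, len(rows), chunk_size):
        chunk = rows[start : start + chunk_size]
        # Hypothetical client call wrapping the streaming insert endpoint.
        errors = client.insert_rows_json(table, chunk)
        all_errors.append(errors)
    return all_errors
```

Batching keeps each request under the streaming API's payload limits, and returning the per-chunk error lists mirrors how `insert_rows` already reports partial failures.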