Fan Li lifan0127
lifan0127 / paper-qa-zotero.py
Created March 8, 2023 02:53
Streamlining Literature Reviews with Paper QA and Zotero
import os
os.environ['OPENAI_API_KEY'] = '<Your OpenAI API Key>'
# See here on how to find your Zotero info: https://github.com/urschrei/pyzotero#quickstart
ZOTERO_USER_ID = '<Your Zotero User ID>'
ZOTERO_API_KEY = '<Your Zotero API Key>'
ZOTERO_COLLECTION_ID = '<Your Zotero Collection ID>'
question = 'What predictive models are used in materials discovery?'
# The following prompt instruction is injected to limit the number of keywords per query
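The snippet above only sets credentials; as a sketch (not part of the gist), here is how those three Zotero values map onto the Zotero Web API v3 endpoints that pyzotero wraps. The function and variable names are illustrative assumptions, not taken from the original code.

```python
# Sketch: how the Zotero credentials above are used against the Web API v3.
# Endpoint layout: https://api.zotero.org/users/<userID>/collections/<collectionKey>/items
def zotero_collection_items_url(user_id, collection_id):
    """Build the Web API URL for the items in one collection."""
    return (f"https://api.zotero.org/users/{user_id}"
            f"/collections/{collection_id}/items")

def zotero_headers(api_key):
    """The API key travels in a request header, not in the URL."""
    return {"Zotero-API-Key": api_key, "Zotero-API-Version": "3"}
```

In practice pyzotero hides all of this behind `zotero.Zotero(user_id, 'user', api_key)` and `zot.collection_items(collection_id)`, which is what the quickstart linked in the comment describes.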
lifan0127 / base.css
Last active August 29, 2015 14:21 — forked from planetoftheweb/base.css
@import url(http://fonts.googleapis.com/css?family=Roboto+Slab:700|Exo+2:300,600);
/* Eric Meyer's Reset CSS v2.0 - http://cssreset.com */
html,body,div,span,applet,object,iframe,h1,h2,h3,h4,h5,h6,p,blockquote,pre,a,abbr,acronym,address,big,cite,code,del,dfn,em,img,ins,kbd,q,s,samp,small,strike,strong,sub,sup,tt,var,b,u,i,center,dl,dt,dd,ol,ul,li,fieldset,form,label,legend,table,caption,tbody,tfoot,thead,tr,th,td,article,aside,canvas,details,embed,figure,figcaption,footer,header,hgroup,menu,nav,output,ruby,section,summary,time,mark,audio,video{border:0;font-size:100%;font:inherit;vertical-align:baseline;margin:0;padding:0}article,aside,details,figcaption,figure,footer,header,hgroup,menu,nav,section{display:block}body{line-height:1}ol,ul{list-style:none}blockquote,q{quotes:none}blockquote:before,blockquote:after,q:before,q:after{content:none}table{border-collapse:collapse;border-spacing:0}
/* Solarized Palette - http://ethanschoonover.com/solarized ---------
lightgray : #819090;
gray : #70
lifan0127 / server.R
Last active August 29, 2015 14:20 — forked from trestletech/server.R
library(shiny)
library(dplyr)
library(lubridate)
# Load libraries and functions needed to create SQLite databases.
library(RSQLite)
library(RSQLite.extfuns)
saveSQLite <- function(data, name){
  path <- dplyr:::db_location(filename=paste0(name, ".sqlite"))
Portfolio Manager ID,Property Name,Address,Postal Code,Primary Property Type - EPA Calculated,Property Floor Area (Buildings and Parking) (ft²),Year Built,Number of Buildings,Philadelphia Building ID,Electricity Use - Grid Purchase and Generated from Onsite Renewable Systems (kBtu),Natural Gas Use (kBtu),Fuel Oil #2 Use (kBtu),District Steam Use (kBtu),ENERGY STAR Score,Site EUI (kBtu/ft²),Source EUI (kBtu/ft²),Water Use (All Water Sources) (kgal),Total GHG Emissions (MtCO2e),Notes
3965468,Allegheny Business Center,2233 West Allegheny Avenue,19132,Adult Education,87756,1915,1,884558100,1161103.6,774906.4,Not Available,Not Available,Not Available,22.1,50.8,5.6,188.1,
3999203,Art Institute of Philadelphia,1618 - 1622 Chestnut Street,19103,Adult Education,79480,1931,1,772509500,7329927.9,Not Available,Not Available,1073596,Not Available,105.7,305.8,1917.7,1052.5,
3875830,Center One,9880 Bustleton Ave,19115,Ambulatory Surgical Center,79485,1980,1,778025205,7748344.9,Not Available,Not Available,Not Available,Not A
lifan0127 / data.csv
Last active August 29, 2015 14:16 — forked from jkeirstead/data.csv
category,value,sector
UK production emissions,632,UK
Carbon flows from EU,88,EU
Carbon flows to EU,-61,EU
Carbon flows from other Annex 1,82,Annex 1
Carbon flows to other Annex 1,-39,Annex 1
Carbon flows from non-Annex 1,104,Other non-Annex 1
Carbon flows from non-Annex 1,64,China
Carbon flows to non-Annex 1,-25,Non-Annex 1
UK consumption emissions,845,UK
Title: Word cloud
Author: Fereshteh Karimeddini <fereshteh@rstudio.com>
AuthorUrl: http://www.rstudio.com/
License: MIT
DisplayMode: Showcase
Tags: wordcloud text-mining actionbutton
Type: Shiny
# Load the required packages
library(RCurl)
library(RJSONIO)
library(XML)
# Define a helper function that extracts the weather forecast from a URL
fromurl <- function(finalurl) {
  # Fetch the page first, then parse the JSON data into raw
  web <- getURL(finalurl)
  raw <- fromJSON(web)
  high <- raw$forecast$simpleforecast$forecastday[[2]]$high['celsius']
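The R function above fetches a forecast JSON payload and drills into nested fields for tomorrow's high temperature. A minimal Python sketch of the same nested-extraction step, run against a sample payload whose field names are assumed from the R code (R's `[[2]]` is 1-indexed, so it becomes index `1` in Python):

```python
import json

def forecast_high_celsius(payload, day=1):
    """Extract a day's high (Celsius) from forecast JSON.
    Mirrors raw$forecast$simpleforecast$forecastday[[2]]$high['celsius']."""
    raw = json.loads(payload)
    return raw["forecast"]["simpleforecast"]["forecastday"][day]["high"]["celsius"]

# Hypothetical payload shaped like the response the R code targets:
sample = json.dumps({
    "forecast": {"simpleforecast": {"forecastday": [
        {"high": {"celsius": "21"}},
        {"high": {"celsius": "24"}},
    ]}}
})
```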