Merge branch 'main' of https://github.com/xtekky/gpt4free
commit 10104774c1
38 changed files with 1054 additions and 950 deletions

README.md (31 lines changed)
@@ -11,24 +11,6 @@ Have you ever come across some amazing projects that you couldn't use **just bec
 By the way, thank you so much for [![Stars](https://img.shields.io/github/stars/xtekky/gpt4free?style=social)](https://github.com/xtekky/gpt4free/stargazers) and all the support!!
-
-## Announcement
-
-Dear Gpt4free Community,
-
-I would like to thank you for your interest in and support of this project, which I intended only for entertainment and educational purposes; I had no idea it would become so popular.
-
-I'm aware of the concerns about the project's legality and its impact on smaller sites hosting APIs. I take these concerns seriously and plan to address them.
-
-Here's what I'm doing to fix these issues:
-
-1. Removing APIs from smaller sites: To reduce the impact on smaller sites, I have removed their APIs from the repository. Please send me a DM if you own a site and want it removed.
-
-2. Commitment to ethical use: I want to emphasize my commitment to promoting the ethical use of language models. I don't support any illegal or unethical behavior, and I expect users to follow the same principles.
-
-Thank you for your support and understanding. I appreciate your continued interest in gpt4free and am committed to addressing your concerns.
-
-Sincerely,
-**xtekky**

 ## Legal Notice <a name="legal-notice"></a>

 This repository uses third-party APIs and AI models and is *not* associated with or endorsed by the API providers or the original developers of the models. This project is intended **for educational purposes only**.

@@ -54,14 +36,12 @@ Please note the following:
 | **How to install** | Instructions on how to install gpt4free | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#install) | - |
 | **Legal Notice** | Legal notice or disclaimer | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#legal-notice) | - |
 | **Copyright** | Copyright information | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#copyright) | - |
+| **Star History** | Star History | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#star-history) | - |
 | **Usage Examples** | | | |
-| `forefront` | Example usage for quora | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](./forefront/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
+| `forefront` | Example usage for forefront (gpt-4) | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](./forefront/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
 | `quora (poe)` | Example usage for quora | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](./quora/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
 | `phind` | Example usage for phind | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](./phind/README.md) | ![Inactive](https://img.shields.io/badge/Active-brightgreen) |
 | `you` | Example usage for you | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](./you/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
 | **Try it Out** | | | |
 | Google Colab Jupyter Notebook | Example usage for gpt4free | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/DanielShemesh/gpt4free-colab/blob/main/gpt4free.ipynb) | - |
 | replit Example (feel free to fork this repl) | Example usage for gpt4free | [![](https://img.shields.io/badge/Open%20in-Replit-1A1E27?logo=replit)](https://replit.com/@gpt4free/gpt4free-webui) | - |

@@ -154,3 +134,6 @@ GNU General Public License for more details.
 You should have received a copy of the GNU General Public License
 along with this program. If not, see <https://www.gnu.org/licenses/>.
 ```
+
+## Star History <a name="star-history"></a>
+
+[![Star History Chart](https://api.star-history.com/svg?repos=xtekky/gpt4free&type=Date)](https://star-history.com/#xtekky/gpt4free)
Singularity/gpt4free.sif (new file, 15 lines)

@@ -0,0 +1,15 @@
+Bootstrap: docker
+From: python:3.10-slim
+
+%post
+    apt-get update && apt-get install -y git
+    git clone https://github.com/xtekky/gpt4free.git
+    cd gpt4free
+    pip install --no-cache-dir -r requirements.txt
+    cp gui/streamlit_app.py .
+
+%expose
+    8501
+
+%startscript
+    exec streamlit run streamlit_app.py
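The new definition file above can be turned into a container image with the Singularity/Apptainer CLI. A rough sketch of the build-and-run workflow, assuming the definition is saved under the path shown in the diff and that a `singularity` (or `apptainer`) binary is installed; the instance name `gpt4free` is illustrative:

```shell
# Build a SIF image from the definition file (requires root or --fakeroot)
sudo singularity build gpt4free.sif Singularity/gpt4free.sif

# Start an instance; %startscript launches the Streamlit app,
# which listens on the port declared in %expose (8501)
singularity instance start gpt4free.sif gpt4free
```

The exact commands depend on the local Singularity/Apptainer version and privileges, so treat this as a sketch rather than a verified recipe.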
forefront/__init__.py

@@ -1,14 +1,17 @@
-from tls_client import Session
-from forefront.mail import Mail
-from time import time, sleep
-from re import match
-from forefront.typing import ForeFrontResponse
-from uuid import uuid4
-from requests import post
 from json import loads
+from re import match
+from time import time, sleep
+from uuid import uuid4
+
+from requests import post
+from tls_client import Session
+
+from forefront.mail import Mail
+from forefront.typing import ForeFrontResponse


 class Account:
     @staticmethod
     def create(proxy=None, logging=False):

         proxies = {

@@ -39,7 +42,8 @@ class Account:
         trace_token = response.json()['response']['id']
         if logging: print(trace_token)

-        response = client.post(f"https://clerk.forefront.ai/v1/client/sign_ups/{trace_token}/prepare_verification?_clerk_js_version=4.32.6",
+        response = client.post(
+            f"https://clerk.forefront.ai/v1/client/sign_ups/{trace_token}/prepare_verification?_clerk_js_version=4.32.6",
             data={
                 "strategy": "email_code",
             }

@@ -61,7 +65,9 @@ class Account:
         if logging: print(mail_token)

-        response = client.post(f'https://clerk.forefront.ai/v1/client/sign_ups/{trace_token}/attempt_verification?_clerk_js_version=4.38.4', data = {
+        response = client.post(
+            f'https://clerk.forefront.ai/v1/client/sign_ups/{trace_token}/attempt_verification?_clerk_js_version=4.38.4',
+            data={
             'code': mail_token,
             'strategy': 'email_code'
         })

@@ -79,6 +85,7 @@ class Account:


 class StreamingCompletion:
     @staticmethod
     def create(
         token=None,
         chatId=None,
forefront/mail.py

@@ -1,6 +1,8 @@
-from requests import Session
-from string import ascii_letters
 from random import choices
+from string import ascii_letters
+
+from requests import Session


 class Mail:
     def __init__(self, proxies: dict = None) -> None:

@@ -52,4 +54,3 @@ class Mail:

     def get_message_content(self, message_id: str):
         return self.get_message(message_id)["text"]
forefront/typing.py

@@ -24,7 +24,6 @@ class ForeFrontResponse:
         return f'''<__main__.APIResponse.Usage(\n prompt_tokens = {self.prompt_tokens},\n completion_tokens = {self.completion_tokens},\n total_tokens = {self.total_tokens})object at 0x1337>'''
-
     def __init__(self, response_dict: dict) -> None:
         self.response_dict = response_dict
         self.id = response_dict['id']
         self.object = response_dict['object']
gui/streamlit_app.py

@@ -1,25 +1,34 @@
+import os
+import sys
+
+sys.path.append(os.path.join(os.path.dirname(__file__), os.path.pardir))
+
 import streamlit as st
 import phind

-phind.cf_clearance = ''
-phind.user_agent = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36'
+# Set cloudflare clearance and user agent
+phind.cloudflare_clearance = ''
+phind.phind_api = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36'

-def phind_get_answer(question:str)->str:
-    # set cf_clearance cookie
+
+def get_answer(question: str) -> str:
+    # Set cloudflare clearance cookie and get answer from GPT-4 model
     try:
         result = phind.Completion.create(
             model='gpt-4',
             prompt=question,
             results=phind.Search.create(question, actualSearch=True),
             creative=False,
             detailed=False,
-            codeContext = '')
+            codeContext=''
+        )
         return result.completion.choices[0].text

     except Exception as e:
-        return 'An error occured, please make sure you are using a cf_clearance token and correct useragent | %s' % e
+        # Return error message if an exception occurs
+        return f'An error occurred: {e}. Please make sure you are using a valid cloudflare clearance token and user agent.'


+# Set page configuration and add header
 st.set_page_config(
     page_title="gpt4freeGUI",
     initial_sidebar_state="expanded",

@@ -30,16 +39,18 @@ st.set_page_config(
         'About': "### gptfree GUI"
     }
 )

 st.header('GPT4free GUI')

-question_text_area = st.text_area('🤖 Ask Any Question :', placeholder='Explain quantum computing in 50 words')
+# Add text area for user input and button to get answer
+question_text_area = st.text_area(
+    '🤖 Ask Any Question :', placeholder='Explain quantum computing in 50 words')
 if st.button('🧠 Think'):
-    answer = phind_get_answer(question_text_area)
+    answer = get_answer(question_text_area)
+    # Display answer
     st.caption("Answer :")
     st.markdown(answer)

+# Hide Streamlit footer
 hide_streamlit_style = """
 <style>
 footer {visibility: hidden;}
phind/__init__.py

@@ -1,19 +1,17 @@
-from urllib.parse import quote
-from time import time
 from datetime import datetime
 from queue import Queue, Empty
 from threading import Thread
-from re import findall
+from time import time
+from urllib.parse import quote

 from curl_cffi.requests import post

 cf_clearance = ''
 user_agent = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36'


 class PhindResponse:

     class Completion:

         class Choices:
             def __init__(self, choice: dict) -> None:
                 self.text = choice['text']

@@ -38,7 +36,6 @@ class PhindResponse:
         return f'''<__main__.APIResponse.Usage(\n prompt_tokens = {self.prompt_tokens},\n completion_tokens = {self.completion_tokens},\n total_tokens = {self.total_tokens})object at 0x1337>'''
-
     def __init__(self, response_dict: dict) -> None:
         self.response_dict = response_dict
         self.id = response_dict['id']
         self.object = response_dict['object']

@@ -157,7 +154,8 @@ class Completion:
         }

         completion = ''
-        response = post('https://www.phind.com/api/infer/answer', headers = headers, json = json_data, timeout=99999, impersonate='chrome110')
+        response = post('https://www.phind.com/api/infer/answer', headers=headers, json=json_data, timeout=99999,
+                        impersonate='chrome110')
         for line in response.text.split('\r\n\r\n'):
             completion += (line.replace('data: ', ''))

@@ -223,8 +221,8 @@ class StreamingCompletion:
         }

         response = post('https://www.phind.com/api/infer/answer',
-                        headers = headers, json = json_data, timeout=99999, impersonate='chrome110', content_callback=StreamingCompletion.handle_stream_response)
+                        headers=headers, json=json_data, timeout=99999, impersonate='chrome110',
+                        content_callback=StreamingCompletion.handle_stream_response)

         StreamingCompletion.stream_completed = True
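The reformatted request above still feeds the same stream parser: the response body arrives as SSE-style records separated by blank CRLF lines, each prefixed with `data: `. A standalone sketch of that loop; the sample body is invented, not a real phind response:

```python
# Hypothetical streamed body in the server-sent-events style parsed above
raw = 'data: Hello\r\n\r\ndata:  world\r\n\r\n'

# Split on the blank-line record separator and strip the "data: " prefix,
# accumulating the text fragments into one completion string
completion = ''
for line in raw.split('\r\n\r\n'):
    completion += line.replace('data: ', '')

print(completion)  # Hello world
```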
quora/mail.py

@@ -38,7 +38,7 @@ class Emailnator:
         return self.email

     def get_message(self):
-        print("waiting for code...")
+        print("Waiting for message...")

         while True:
             sleep(2)

@@ -49,6 +49,7 @@ class Emailnator:
             mail_token = loads(mail_token.text)["messageData"]

             if len(mail_token) == 2:
+                print("Message received!")
                 print(mail_token[1]["messageID"])
                 break

@@ -63,4 +64,19 @@ class Emailnator:
         return mail_context.text

     def get_verification_code(self):
-        return findall(r';">(\d{6,7})</div>', self.get_message())[0]
+        message = self.get_message()
+        code = findall(r';">(\d{6,7})</div>', message)[0]
+        print(f"Verification code: {code}")
+        return code
+
+    def clear_inbox(self):
+        print("Clearing inbox...")
+        self.client.post(
+            "https://www.emailnator.com/delete-all",
+            json={"email": self.email},
+        )
+        print("Inbox cleared!")
+
+    def __del__(self):
+        if self.email:
+            self.clear_inbox()
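The new `get_verification_code` splits message retrieval from extraction but keeps the same regex, which captures a 6-7 digit code sitting between `;">` and `</div>` in the message HTML. A self-contained sketch of that extraction; the HTML fragment is invented for illustration:

```python
from re import findall

# Invented fragment resembling an Emailnator message body with a 6-digit code
html = '<div style="font-size:20px;">123456</div>'

# The capture group grabs only the digits, not the surrounding markup
code = findall(r';">(\d{6,7})</div>', html)[0]
print(code)  # 123456
```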
requirements.txt

@@ -8,3 +8,4 @@ curl_cffi
 streamlit==1.21.0
 selenium
 fake-useragent
+twocaptcha
@@ -7,5 +7,4 @@ print(token)
 # get a response
 for response in forefront.StreamingCompletion.create(token=token,
                                                      prompt='hello world', model='gpt-4'):
-
     print(response.completion.choices[0].text, end='')
@@ -10,7 +10,8 @@ prompt = 'hello world'
 result = phind.Completion.create(
     model='gpt-4',
     prompt=prompt,
-    results = phind.Search.create(prompt, actualSearch = False), # create search (set actualSearch to False to disable internet)
+    results=phind.Search.create(prompt, actualSearch=False),
+    # create search (set actualSearch to False to disable internet)
     creative=False,
     detailed=False,
     codeContext='')  # up to 3000 chars of code

@@ -24,7 +25,8 @@ prompt = 'who won the quatar world cup'
 for result in phind.StreamingCompletion.create(
     model='gpt-4',
     prompt=prompt,
-    results = phind.Search.create(prompt, actualSearch = True), # create search (set actualSearch to False to disable internet)
+    results=phind.Search.create(prompt, actualSearch=True),
+    # create search (set actualSearch to False to disable internet)
     creative=False,
     detailed=False,
     codeContext=''):  # up to 3000 chars of code
@@ -1,16 +1,16 @@
-from requests import Session
-from tls_client import Session as TLS
-from json import dumps
 from hashlib import md5
-from time import sleep
+from json import dumps
 from re import findall
-from pypasser import reCaptchaV3
-from quora import extract_formkey
-from quora.mail import Emailnator
+from tls_client import Session as TLS
 from twocaptcha import TwoCaptcha
+
+from quora import extract_formkey
+from quora.mail import Emailnator

 solver = TwoCaptcha('72747bf24a9d89b4dcc1b24875efd358')


 class Account:
     def create(proxy: None or str = None, logging: bool = False, enable_bot_creation: bool = False):
         client = TLS(client_identifier='chrome110')
@@ -1,6 +1,7 @@
-import quora
 from time import sleep

+import quora
+
 token = quora.Account.create(proxy=None, logging=True)
 print('token', token)

@@ -9,5 +10,4 @@ sleep(2)
 for response in quora.StreamingCompletion.create(model='gpt-3.5-turbo',
                                                  prompt='hello world',
                                                  token=token):
-
     print(response.completion.choices[0].text, end="", flush=True)
@@ -14,5 +14,4 @@ for response in quora.StreamingCompletion.create(
     custom_model=model.name,
     prompt='hello world',
     token=token):
-
     print(response.completion.choices[0].text)
testing/sqlchat_test.py (new file, 6 lines)

@@ -0,0 +1,6 @@
+import sqlchat
+
+for response in sqlchat.StreamCompletion.create(
+        prompt='write python code to reverse a string',
+        messages=[]):
+    print(response.completion.choices[0].text, end='')
@@ -3,5 +3,4 @@ import t3nsor
 for response in t3nsor.StreamCompletion.create(
         prompt='write python code to reverse a string',
         messages=[]):
-
     print(response.completion.choices[0].text)
@@ -1,14 +1,14 @@
-from requests import Session
-from re import search
-from random import randint
 from json import dumps, loads
-from random import randint
-from urllib.parse import urlencode
-from dotenv import load_dotenv; load_dotenv()
 from os import getenv
+from random import randint
+from re import search
+from urllib.parse import urlencode

 from bard.typings import BardResponse
+from dotenv import load_dotenv
+from requests import Session
+
+load_dotenv()
 token = getenv('1psid')
 proxy = getenv('proxy')

@@ -26,29 +26,8 @@ temperatures = {
     1: "Generate text with maximum creativity, disregarding any constraints of known patterns or structures."
 }


 class Completion:
-    # def __init__(self, _token, proxy: str or None = None) -> None:
-    #     self.client = Session()
-    #     self.client.proxies = {
-    #         'http': f'http://{proxy}',
-    #         'https': f'http://{proxy}' } if proxy else None
-
-    #     self.client.headers = {
-    #         'authority': 'bard.google.com',
-    #         'content-type': 'application/x-www-form-urlencoded;charset=UTF-8',
-    #         'origin': 'https://bard.google.com',
-    #         'referer': 'https://bard.google.com/',
-    #         'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36',
-    #         'x-same-domain': '1',
-    #         'cookie': f'__Secure-1PSID={_token}'
-    #     }
-
-    #     self.snlm0e = self.__init_client()
-    #     self.conversation_id = ''
-    #     self.response_id = ''
-    #     self.choice_id = ''
-    #     self.reqid = randint(1111, 9999)
-
     def create(
         prompt: str = 'hello world',
         temperature: int = None,

@@ -74,7 +53,8 @@ class Completion:
         'cookie': f'__Secure-1PSID={token}'
     }

-    snlm0e = search(r'SNlM0e\":\"(.*?)\"', client.get('https://bard.google.com/').text).group(1)
+    snlm0e = search(r'SNlM0e\":\"(.*?)\"',
+                    client.get('https://bard.google.com/').text).group(1)

     params = urlencode({
         'bl': 'boq_assistant-bard-web-server_20230326.21_p0',

@@ -82,20 +62,23 @@ class Completion:
         'rt': 'c',
     })

-    response = client.post(f'https://bard.google.com/_/BardChatUi/data/assistant.lamda.BardFrontendService/StreamGenerate?{params}',
+    response = client.post(
+        f'https://bard.google.com/_/BardChatUi/data/assistant.lamda.BardFrontendService/StreamGenerate?{params}',
         data={
             'at': snlm0e,
             'f.req': dumps([None, dumps([
                 [prompt],
                 None,
                 [conversation_id, response_id, choice_id],
-            ])
-            ])
+            ])])
         }
     )

     chat_data = loads(response.content.splitlines()[3])[0][2]
-    if not chat_data: print('error, retrying'); Completion.create(prompt, temperature, conversation_id, response_id, choice_id)
+    if not chat_data:
+        print('error, retrying')
+        Completion.create(prompt, temperature,
+                          conversation_id, response_id, choice_id)

     json_chat_data = loads(chat_data)
     results = {

@@ -107,9 +90,4 @@ class Completion:
         'choices': [{'id': i[0], 'content': i[1]} for i in json_chat_data[4]],
     }

-    # self.conversation_id = results['conversation_id']
-    # self.response_id = results['response_id']
-    # self.choice_id = results['choices'][0]['id']
-    # self.reqid += 100000
-
     return BardResponse(results)
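The `])])` closing that the diff tidies up is the double-encoded payload this endpoint expects: the message array is JSON-serialized first, then embedded as a string inside an outer JSON array under `f.req`. A minimal sketch of the two encoding layers, with the conversation IDs left empty as for a fresh conversation:

```python
from json import dumps, loads

prompt = 'hello world'
conversation_id = response_id = choice_id = ''

# Inner array is serialized first, then wrapped as a string in the outer array
f_req = dumps([None, dumps([
    [prompt],
    None,
    [conversation_id, response_id, choice_id],
])])

# Decoding reverses the two layers
outer = loads(f_req)
inner = loads(outer[1])
print(inner[0])  # ['hello world']
```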
bard/typings.py

@@ -1,5 +1,13 @@
+from typing import Dict, List, Union
+
+
 class BardResponse:
-    def __init__(self, json_dict):
+    def __init__(self, json_dict: Dict[str, Union[str, List]]) -> None:
+        """
+        Initialize a BardResponse object.
+
+        :param json_dict: A dictionary containing the JSON response data.
+        """
         self.json = json_dict

         self.content = json_dict.get('content')

@@ -7,9 +15,40 @@ class BardResponse:
         self.response_id = json_dict.get('response_id')
         self.factuality_queries = json_dict.get('factualityQueries', [])
         self.text_query = json_dict.get('textQuery', [])
-        self.choices = [self.BardChoice(choice) for choice in json_dict.get('choices', [])]
+        self.choices = [self.BardChoice(choice)
+                        for choice in json_dict.get('choices', [])]
+
+    def __repr__(self) -> str:
+        """
+        Return a string representation of the BardResponse object.
+
+        :return: A string representation of the BardResponse object.
+        """
+        return f"BardResponse(conversation_id={self.conversation_id}, response_id={self.response_id}, content={self.content})"
+
+    def filter_choices(self, keyword: str) -> List['BardChoice']:
+        """
+        Filter the choices based on a keyword.
+
+        :param keyword: The keyword to filter choices by.
+        :return: A list of filtered BardChoice objects.
+        """
+        return [choice for choice in self.choices if keyword.lower() in choice.content.lower()]

     class BardChoice:
-        def __init__(self, choice_dict):
+        def __init__(self, choice_dict: Dict[str, str]) -> None:
+            """
+            Initialize a BardChoice object.
+
+            :param choice_dict: A dictionary containing the choice data.
+            """
             self.id = choice_dict.get('id')
             self.content = choice_dict.get('content')[0]
+
+        def __repr__(self) -> str:
+            """
+            Return a string representation of the BardChoice object.
+
+            :return: A string representation of the BardChoice object.
+            """
+            return f"BardChoice(id={self.id}, content={self.content})"
@@ -1,27 +1,29 @@
# Import necessary libraries
import asyncio
from json import dumps, loads
from random import randint
from ssl import create_default_context
from uuid import uuid4

import websockets
from browser_cookie3 import edge
from certifi import where
from requests import get

# Set up SSL context
ssl_context = create_default_context()
ssl_context.load_verify_locations(where())


def format(msg: dict) -> str:
    """Format message as JSON string with delimiter."""
    return dumps(msg) + '\x1e'


def get_token():
    """Retrieve token from browser cookies."""
    cookies = {c.name: c.value for c in edge(domain_name='bing.com')}
    return cookies['_U']


class AsyncCompletion:
    async def create(
            prompt: str = 'hello world',
@@ -33,7 +35,9 @@ class AsyncCompletion:
                'h3relaxedimg'
            ],
            token: str = get_token()):
        """Create a connection to Bing AI and send the prompt."""

        # Send create request
        create = get('https://edgeservices.bing.com/edgesvc/turing/conversation/create',
                     headers={
                         'host': 'edgeservices.bing.com',
@@ -43,68 +47,32 @@ class AsyncCompletion:
                     }
                     )

        # Extract conversation data
        conversationId = create.json()['conversationId']
        clientId = create.json()['clientId']
        conversationSignature = create.json()['conversationSignature']

        # Connect to WebSocket with the necessary headers
        wss = await websockets.connect('wss://sydney.bing.com/sydney/ChatHub', max_size=None, ssl=ssl_context,
                                       extra_headers={
                                           'accept': 'application/json',
                                           'accept-language': 'en-US,en;q=0.9',
                                           'content-type': 'application/json',
                                           'sec-ch-ua': '"Not_A Brand";v="99", "Microsoft Edge";v="110", "Chromium";v="110"',
                                           'sec-ch-ua-arch': '"x86"',
                                           'sec-ch-ua-bitness': '"64"',
                                           'sec-ch-ua-full-version': '"109.0.1518.78"',
                                           'sec-ch-ua-full-version-list': '"Chromium";v="110.0.5481.192", "Not A(Brand";v="24.0.0.0", "Microsoft Edge";v="110.0.1587.69"',
                                           'sec-ch-ua-mobile': '?0',
                                           'sec-ch-ua-model': '""',
                                           'sec-ch-ua-platform': '"Windows"',
                                           'sec-ch-ua-platform-version': '"15.0.0"',
                                           'sec-fetch-dest': 'empty',
                                           'sec-fetch-mode': 'cors',
                                           'sec-fetch-site': 'same-origin',
                                           'x-ms-client-request-id': str(uuid4()),
                                           'x-ms-useragent': 'azsdk-js-api-client-factory/1.0.0-beta.1 core-rest-pipeline/1.10.0 OS/Win32',
                                           'Referer': 'https://www.bing.com/search?q=Bing+AI&showconv=1&FORM=hpcodx',
                                           'Referrer-Policy': 'origin-when-cross-origin',
                                           'x-forwarded-for': f'13.{randint(104, 107)}.{randint(0, 255)}.{randint(0, 255)}'
                                       }
                                       )

        # Send JSON protocol version
        await wss.send(format({'protocol': 'json', 'version': 1}))
        await wss.recv()

        # Define message structure
        struct = {
            'arguments': [
                {
                    'source': 'cib',
                    'optionsSets': optionSets,
                    'isStartOfSession': True,
                    'message': {
                        'author': 'user',
                        'inputMethod': 'Keyboard',
                        'text': prompt,
                        'messageType': 'Chat'
                    },
                    'conversationSignature': conversationSignature,
                    'participant': {
                        'id': clientId
                    },
                    'conversationId': conversationId
                }
            ],
            'invocationId': '0',
            'target': 'chat',
            'type': 4
        }

        # Send message
        await wss.send(format(struct))

        # Process responses
        base_string = ''

        final = False
        while not final:
            objects = str(await wss.recv()).split('\x1e')
@@ -114,7 +82,8 @@ class AsyncCompletion:
                response = loads(obj)
                if response.get('type') == 1 and response['arguments'][0].get('messages', ):
                    response_text = response['arguments'][0]['messages'][0]['adaptiveCards'][0]['body'][0].get(
                        'text')

                    yield (response_text.replace(base_string, ''))
                    base_string = response_text
@@ -124,28 +93,16 @@ class AsyncCompletion:
        await wss.close()


async def run():
    """Run the async completion and print the result."""
    async for value in AsyncCompletion.create(
            prompt='summarize cinderella with each word beginning with a consecutive letter of the alphabet, a-z',
            optionSets=[
                "galileo",
            ]
    ):
        print(value, end='', flush=True)


asyncio.run(run())
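The `format` helper above frames each message with the ASCII record separator (`\x1e`), the same delimiter the receive loop splits on. The framing round-trip can be checked without a live socket:

```python
from json import dumps, loads

DELIMITER = '\x1e'  # ASCII record separator used to frame ChatHub messages


def format(msg: dict) -> str:
    """Serialize a message and append the frame delimiter."""
    return dumps(msg) + DELIMITER


# Two frames concatenated, as they would arrive on the socket
stream = format({'protocol': 'json', 'version': 1}) + format({'type': 6})
frames = [loads(obj) for obj in stream.split(DELIMITER) if obj]
print(frames)  # → [{'protocol': 'json', 'version': 1}, {'type': 6}]
```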
@@ -1,13 +1,24 @@
import requests


class Completion:
    def create(self, prompt="What is the square root of pi",
               system_prompt=("ASSUME I HAVE FULL ACCESS TO COCALC. ENCLOSE MATH IN $. "
                              "INCLUDE THE LANGUAGE DIRECTLY AFTER THE TRIPLE BACKTICKS "
                              "IN ALL MARKDOWN CODE BLOCKS. How can I do the following using CoCalc?")) -> str:
        # Initialize a session with custom headers
        session = self._initialize_session()

        # Set the data that will be submitted
        payload = self._create_payload(prompt, system_prompt)

        # Submit the request and return the results
        return self._submit_request(session, payload)

    def _initialize_session(self) -> requests.Session:
        """Initialize a session with custom headers for the request."""
        session = requests.Session()
        headers = {
            'Accept': '*/*',
            'Accept-Language': 'en-US,en;q=0.5',
@@ -17,15 +28,20 @@ class Completion:
        }
        session.headers.update(headers)

        return session

    def _create_payload(self, prompt: str, system_prompt: str) -> dict:
        """Create the payload with the given prompts."""
        return {
            "input": prompt,
            "system": system_prompt,
            "tag": "next:index"
        }

    def _submit_request(self, session: requests.Session, payload: dict) -> str:
        """Submit the request to the API and return the response."""
        response = session.post(
            "https://cocalc.com/api/v2/openai/chatgpt", json=payload).json()
        return response
@@ -1,6 +1,5 @@
import cocalc

response = cocalc.Completion.create(
    prompt='hello world'
)
42  unfinished/easyai/main.py  Normal file
@@ -0,0 +1,42 @@
# Import necessary libraries
from json import loads
from os import urandom

from requests import get

# Generate a random session ID
sessionId = urandom(10).hex()

# Set up headers for the API request
headers = {
    'Accept': 'text/event-stream',
    'Accept-Language': 'en,fr-FR;q=0.9,fr;q=0.8,es-ES;q=0.7,es;q=0.6,en-US;q=0.5,am;q=0.4,de;q=0.3',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
    'Pragma': 'no-cache',
    'Referer': 'http://easy-ai.ink/chat',
    'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36',
    'token': 'null',
}

# Main loop to interact with the AI
while True:
    # Get user input
    prompt = input('you: ')

    # Set up parameters for the API request
    params = {
        'message': prompt,
        'sessionId': sessionId
    }

    # Send request to the API and process the response
    for chunk in get('http://easy-ai.ink/easyapi/v1/chat/completions', params=params,
                     headers=headers, verify=False, stream=True).iter_lines():

        # Check if the chunk contains the 'content' field
        if b'content' in chunk:
            # Parse the JSON data and print the content
            data = loads(chunk.decode('utf-8').split('data:')[1])

            print(data['content'], end='')
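The chunk handling in the loop above can be factored into a pure function and tested against synthetic event-stream lines (the `data:` prefix and `content` field match what the loop filters on; the sample payloads here are illustrative):

```python
from json import loads


def parse_sse_chunk(chunk: bytes):
    """Mirror of the loop above: decode a 'data:' event only if it carries content."""
    if b'content' not in chunk:
        return None
    return loads(chunk.decode('utf-8').split('data:')[1])


print(parse_sse_chunk(b'data:{"content": "hi"}'))  # → {'content': 'hi'}
print(parse_sse_chunk(b': keep-alive'))            # → None
```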
@@ -1,30 +1,46 @@
from json import dumps, loads

import websockets


# Define the asynchronous function to test the WebSocket connection
async def test():
    # Establish a WebSocket connection with the specified URL
    async with websockets.connect('wss://chatgpt.func.icu/conversation+ws') as wss:

        # Prepare the message payload as a JSON object
        payload = {
            'content_type': 'text',
            'engine': 'chat-gpt',
            'parts': ['hello world'],
            'options': {}
        }

        # Send the payload to the WebSocket server
        await wss.send(dumps(obj=payload, separators=(',', ':')))

        # Initialize a variable to track the end of the conversation
        ended = None

        # Continuously receive and process messages until the conversation ends
        while not ended:
            try:
                # Receive and parse the JSON response from the server
                response = await wss.recv()
                json_response = loads(response)

                # Print the entire JSON response
                print(json_response)

                # Check for the end of the conversation
                ended = json_response.get('eof')

                # If the conversation has not ended, print the received message
                if not ended:
                    print(json_response['content']['parts'][0])

            # Handle cases when the connection is closed by the server
            except websockets.ConnectionClosed:
                break
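The `separators=(',', ':')` argument passed to `dumps` above produces the compact, whitespace-free encoding (the default inserts a space after `,` and `:`). Its effect is easy to verify in isolation:

```python
from json import dumps

payload = {'content_type': 'text', 'engine': 'chat-gpt', 'parts': ['hello world'], 'options': {}}

# separators=(',', ':') strips the default separator spaces
compact = dumps(payload, separators=(',', ':'))
print(compact)  # → {"content_type":"text","engine":"chat-gpt","parts":["hello world"],"options":{}}
```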
@@ -1,11 +1,43 @@
# Import required libraries
from uuid import uuid4

from browser_cookie3 import chrome
from tls_client import Session


class OpenAIChat:
    def __init__(self):
        self.client = Session(client_identifier='chrome110')
        self._load_cookies()
        self._set_headers()

    def _load_cookies(self):
        # Load cookies for the specified domain
        for cookie in chrome(domain_name='chat.openai.com'):
            self.client.cookies[cookie.name] = cookie.value

    def _set_headers(self):
        # Set headers for the client
        self.client.headers = {
            'authority': 'chat.openai.com',
            'accept': 'text/event-stream',
            'accept-language': 'en,fr-FR;q=0.9,fr;q=0.8,es-ES;q=0.7,es;q=0.6,en-US;q=0.5,am;q=0.4,de;q=0.3',
            'authorization': 'Bearer ' + self.session_auth()['accessToken'],
            'cache-control': 'no-cache',
            'content-type': 'application/json',
            'origin': 'https://chat.openai.com',
            'pragma': 'no-cache',
            'referer': 'https://chat.openai.com/chat',
            'sec-ch-ua': '"Chromium";v="112", "Google Chrome";v="112", "Not:A-Brand";v="99"',
            'sec-ch-ua-mobile': '?0',
            'sec-ch-ua-platform': '"macOS"',
            'sec-fetch-dest': 'empty',
            'sec-fetch-mode': 'cors',
            'sec-fetch-site': 'same-origin',
            'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36',
        }

    def session_auth(self):
        headers = {
            'authority': 'chat.openai.com',
            'accept': '*/*',
@@ -22,33 +54,10 @@ def session_auth(client):
            'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36',
        }

        return self.client.get('https://chat.openai.com/api/auth/session', headers=headers).json()

    def send_message(self, message):
        response = self.client.post('https://chat.openai.com/backend-api/conversation', json={
            'action': 'next',
            'messages': [
                {
@@ -59,7 +68,7 @@
                    'content': {
                        'content_type': 'text',
                        'parts': [
                            message,
                        ],
                    },
                },
@@ -69,4 +78,10 @@
            'timezone_offset_min': -120,
        })

        return response.text


if __name__ == "__main__":
    chat = OpenAIChat()
    response = chat.send_message("hello world")
    print(response)
@@ -1,7 +1,8 @@
import json
import re

import requests

headers = {
    'authority': 'openai.a2hosted.com',
    'accept': 'text/event-stream',
@@ -13,10 +14,12 @@ headers = {
    'user-agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/113.0.0.0 Safari/537.36 Edg/113.0.0.0',
}


def create_query_param(conversation):
    encoded_conversation = json.dumps(conversation)
    return encoded_conversation.replace(" ", "%20").replace('"', '%22').replace("'", "%27")


user_input = input("Enter your message: ")

data = [
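`create_query_param` above hand-rolls the percent-encoding (`urllib.parse.quote` would be the stdlib alternative), so its output is worth checking on a small conversation; the sample input here is illustrative:

```python
import json


def create_query_param(conversation):
    # Same hand-rolled escaping as above: encode to JSON, then escape the
    # three characters the endpoint cannot accept in a query string
    encoded_conversation = json.dumps(conversation)
    return encoded_conversation.replace(" ", "%20").replace('"', '%22').replace("'", "%27")


print(create_query_param([{"role": "user"}]))  # → [{%22role%22:%20%22user%22}]
```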
@@ -1,9 +1,9 @@
from json import dumps
# from mail import MailClient
from re import findall

from requests import post, get

html = get('https://developermail.com/mail/')
print(html.cookies.get('mailboxId'))
email = findall(r'mailto:(.*)">', html.text)[0]
@@ -1,6 +1,8 @@
import email

import requests


class MailClient:

    def __init__(self):
@@ -30,8 +30,7 @@ json_data = {
    ],
}

response = requests.post('https://openprompt.co/api/chat2', cookies=cookies, headers=headers, json=json_data,
                         stream=True)
for chunk in response.iter_content(chunk_size=1024):
    print(chunk)
@@ -1,7 +1,6 @@
access_token = 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJhdXRoZW50aWNhdGVkIiwiZXhwIjoxNjgyMjk0ODcxLCJzdWIiOiI4NWNkNTNiNC1lZTUwLTRiMDQtOGJhNS0wNTUyNjk4ODliZDIiLCJlbWFpbCI6ImNsc2J5emdqcGhiQGJ1Z2Zvby5jb20iLCJwaG9uZSI6IiIsImFwcF9tZXRhZGF0YSI6eyJwcm92aWRlciI6ImVtYWlsIiwicHJvdmlkZXJzIjpbImVtYWlsIl19LCJ1c2VyX21ldGFkYXRhIjp7fSwicm9sZSI6ImF1dGhlbnRpY2F0ZWQiLCJhYWwiOiJhYWwxIiwiYW1yIjpbeyJtZXRob2QiOiJvdHAiLCJ0aW1lc3RhbXAiOjE2ODE2OTAwNzF9XSwic2Vzc2lvbl9pZCI6ImY4MTg1YTM5LTkxYzgtNGFmMy1iNzAxLTdhY2MwY2MwMGNlNSJ9.UvcTfpyIM1TdzM8ZV6UAPWfa0rgNq4AiqeD0INy6zV'
supabase_auth_token = '%5B%22eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJhdXRoZW50aWNhdGVkIiwiZXhwIjoxNjgyMjk0ODcxLCJzdWIiOiI4NWNkNTNiNC1lZTUwLTRiMDQtOGJhNS0wNTUyNjk4ODliZDIiLCJlbWFpbCI6ImNsc2J5emdqcGhiQGJ1Z2Zvby5jb20iLCJwaG9uZSI6IiIsImFwcF9tZXRhZGF0YSI6eyJwcm92aWRlciI6ImVtYWlsIiwicHJvdmlkZXJzIjpbImVtYWlsIl19LCJ1c2VyX21ldGFkYXRhIjp7fSwicm9sZSI6ImF1dGhlbnRpY2F0ZWQiLCJhYWwiOiJhYWwxIiwiYW1yIjpbeyJtZXRob2QiOiJvdHAiLCJ0aW1lc3RhbXAiOjE2ODE2OTAwNzF9XSwic2Vzc2lvbl9pZCI6ImY4MTg1YTM5LTkxYzgtNGFmMy1iNzAxLTdhY2MwY2MwMGNlNSJ9.UvcTfpyIM1TdzM8ZV6UAPWfa0rgNq4AiqeD0INy6zV8%22%2C%22_Zp8uXIA2InTDKYgo8TCqA%22%2Cnull%2Cnull%2Cnull%5D'

idk = [
    "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJhdXRoZW50aWNhdGVkIiwiZXhwIjoxNjgyMjk0ODcxLCJzdWIiOiI4NWNkNTNiNC1lZTUwLTRiMDQtOGJhNS0wNTUyNjk4ODliZDIiLCJlbWFpbCI6ImNsc2J5emdqcGhiQGJ1Z2Zvby5jb20iLCJwaG9uZSI6IiIsImFwcF9tZXRhZGF0YSI6eyJwcm92aWRlciI6ImVtYWlsIiwicHJvdmlkZXJzIjpbImVtYWlsIl19LCJ1c2VyX21ldGFkYXRhIjp7fSwicm9sZSI6ImF1dGhlbnRpY2F0ZWQiLCJhYWwiOiJhYWwxIiwiYW1yIjpbeyJtZXRob2QiOiJvdHAiLCJ0aW1lc3RhbXAiOjE2ODE2OTAwNzF9XSwic2Vzc2lvbl9pZCI6ImY4MTg1YTM5LTkxYzgtNGFmMy1iNzAxLTdhY2MwY2MwMGNlNSJ9.UvcTfpyIM1TdzM8ZV6UAPWfa0rgNq4AiqeD0INy6zV8",
    "_Zp8uXIA2InTDKYgo8TCqA", None, None, None]
@@ -1,6 +1,7 @@
from time import time

from requests import post

headers = {
    'authority': 'www.t3nsor.tech',
    'accept': '*/*',
@@ -19,10 +20,9 @@ headers = {
    'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36',
}


class T3nsorResponse:

    class Completion:

        class Choices:
            def __init__(self, choice: dict) -> None:
                self.text = choice['text']
@@ -47,7 +47,6 @@ class T3nsorResponse:
            return f'''<__main__.APIResponse.Usage(\n prompt_tokens = {self.prompt_tokens},\n completion_tokens = {self.completion_tokens},\n total_tokens = {self.total_tokens})object at 0x1337>'''

    def __init__(self, response_dict: dict) -> None:
        self.response_dict = response_dict
        self.id = response_dict['id']
        self.object = response_dict['object']
@@ -59,6 +58,7 @@ class T3nsorResponse:
    def json(self) -> dict:
        return self.response_dict


class Completion:
    model = {
        'model': {
@@ -70,7 +70,6 @@ class Completion:
    def create(
            prompt: str = 'hello world',
            messages: list = []) -> T3nsorResponse:

        response = post('https://www.t3nsor.tech/api/chat', headers=headers, json=Completion.model | {
            'messages': messages,
            'key': '',
@@ -95,6 +94,7 @@ class Completion:
        }
    })


class StreamCompletion:
    model = {
        'model': {
@@ -106,7 +106,6 @@ class StreamCompletion:
    def create(
            prompt: str = 'hello world',
            messages: list = []) -> T3nsorResponse:

        print('t3nsor api is down, this may not work, refer to another module')

        response = post('https://www.t3nsor.tech/api/chat', headers=headers, stream=True, json=Completion.model | {
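Both `create` methods above build their request body with `Completion.model | {...}`, the dict union operator. The sketch below shows its merge semantics; the `base` dict is a stand-in, since the real contents of `Completion.model` sit on lines elided from this hunk:

```python
# Stand-in for Completion.model; the real fields are on the elided lines
base = {'model': {'name': 'some-model'}}
extra = {'messages': [], 'key': ''}

# PEP 584 dict union (Python 3.9+); on key clashes the right operand wins
merged = base | extra
print(sorted(merged))  # → ['key', 'messages', 'model']
```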
@@ -1,7 +1,3 @@
# asyncio.run(gptbz.test())
import requests
@@ -1,8 +1,10 @@
from json import loads
from queue import Queue, Empty
from re import findall
from threading import Thread

from curl_cffi import requests


class Completion:
    # experimental
@@ -22,7 +24,8 @@ class Completion:
            'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36',
        }

        requests.post('https://chatbot.theb.ai/api/chat-process', headers=headers,
                      content_callback=Completion.handle_stream_response,
                      json={
                          'prompt': 'hello world',
                          'options': {}
@@ -48,10 +51,12 @@ class Completion:
    def handle_stream_response(response):
        Completion.message_queue.put(response.decode())


def start():
    for message in Completion.create():
        yield message['delta']


if __name__ == '__main__':
    for message in start():
        print(message)
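The module above hands streamed chunks from a callback to a consumer through a `Queue`, draining it with `get(timeout=…)` and stopping on `Empty`. That producer/consumer pattern, reduced to a self-contained sketch with a hypothetical producer thread in place of the HTTP callback:

```python
from queue import Empty, Queue
from threading import Thread

message_queue = Queue()


def producer():
    # Stands in for the content_callback that feeds chunks from the HTTP stream
    for delta in ('hel', 'lo'):
        message_queue.put(delta)


Thread(target=producer).start()

text = ''
while True:
    try:
        # Block briefly for the next chunk; Empty signals the stream is drained
        text += message_queue.get(timeout=1)
    except Empty:
        break

print(text)  # → hello
```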
@@ -1,6 +1,5 @@
import requests

token = requests.get('https://play.vercel.ai/openai.jpeg', headers={
    'authority': 'play.vercel.ai',
    'accept-language': 'en,fr-FR;q=0.9,fr;q=0.8,es-ES;q=0.7,es;q=0.6,en-US;q=0.5,am;q=0.4,de;q=0.3',
@@ -25,5 +24,4 @@ for chunk in requests.post('https://play.vercel.ai/api/generate', headers=header
    'frequencyPenalty': 1,
    'presencePenalty': 1,
    'stopSequences': []}).iter_lines():
    print(chunk)
```diff
@@ -1,21 +1,25 @@
-from requests import Session
-from names import get_first_name, get_last_name
 from random import choice
-from requests import post
 from time import time
-from colorama import Fore, init; init()
+
+from colorama import Fore, init;
+from names import get_first_name, get_last_name
+from requests import Session
+from requests import post
+
+init()


 class logger:
     @staticmethod
     def info(string) -> print:
         import datetime
         now = datetime.datetime.now()
-        return print(f"{Fore.CYAN}{now.strftime('%Y-%m-%d %H:%M:%S')} {Fore.BLUE}INFO {Fore.MAGENTA}__main__ -> {Fore.RESET}{string}")
+        return print(
+            f"{Fore.CYAN}{now.strftime('%Y-%m-%d %H:%M:%S')} {Fore.BLUE}INFO {Fore.MAGENTA}__main__ -> {Fore.RESET}{string}")


 class SonicResponse:
     class Completion:
         class Choices:
             def __init__(self, choice: dict) -> None:
                 self.text = choice['text']
```
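The `logger.info` helper above is a timestamped print dressed up with colorama colour codes. The same formatting can be sketched without the colorama dependency by using raw ANSI escapes directly — the escape values below are standard ANSI colour codes chosen for illustration, not taken from colorama's internals:

```python
import datetime

CYAN, BLUE, MAGENTA, RESET = '\x1b[36m', '\x1b[34m', '\x1b[35m', '\x1b[0m'


def info(string: str) -> str:
    # timestamped, colourised log line in the same layout as logger.info
    now = datetime.datetime.now()
    line = f"{CYAN}{now:%Y-%m-%d %H:%M:%S} {BLUE}INFO {MAGENTA}__main__ -> {RESET}{string}"
    print(line)
    return line


message = info("token acquired")
```

Returning the formatted line (rather than `None`) also makes the helper easy to test or redirect to a file.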
```diff
@@ -40,7 +44,6 @@ class SonicResponse:
             return f'''<__main__.APIResponse.Usage(\n prompt_tokens = {self.prompt_tokens},\n completion_tokens = {self.completion_tokens},\n total_tokens = {self.total_tokens})object at 0x1337>'''

     def __init__(self, response_dict: dict) -> None:
         self.response_dict = response_dict
         self.id = response_dict['id']
         self.object = response_dict['object']
@@ -52,6 +55,7 @@ class SonicResponse:
     def json(self) -> dict:
         return self.response_dict


 class Account:
     session = Session()
     session.headers = {
@@ -105,7 +109,8 @@ class Account:
         logger.info(f"\x1b[31mtoken\x1b[0m : '{response.json()['token'][:30]}...'")

         start = time()
-        response = Account.session.post("https://api.writesonic.com/v1/business/set-business-active", headers={"authorization": "Bearer " + response.json()['token']})
+        response = Account.session.post("https://api.writesonic.com/v1/business/set-business-active",
+                                        headers={"authorization": "Bearer " + response.json()['token']})
         key = response.json()["business"]["api_key"]
         if logging: logger.info(f"\x1b[31mgot key\x1b[0m : '{key}' ({int(time() - start)}s)")

@@ -129,8 +134,8 @@ class Completion:
                enable_memory: bool = False,
                enable_google_results: bool = False,
                history_data: list = []) -> SonicResponse:
-        response = post('https://api.writesonic.com/v2/business/content/chatsonic?engine=premium', headers = {"X-API-KEY": api_key},
+        response = post('https://api.writesonic.com/v2/business/content/chatsonic?engine=premium',
+                        headers={"X-API-KEY": api_key},
                         json={
                             "enable_memory": enable_memory,
                             "enable_google_results": enable_google_results,
```
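`SonicResponse` wraps the raw chatsonic JSON payload in typed accessors while keeping the original dict reachable through `json()`. A trimmed, runnable sketch of that wrapper — the sample payload and the `choices` list layout are assumptions for illustration; the diff itself only shows the `id`, `object`, and `text` fields:

```python
class SonicResponse:
    class Choices:
        def __init__(self, choice: dict) -> None:
            self.text = choice['text']

    def __init__(self, response_dict: dict) -> None:
        self.response_dict = response_dict
        self.id = response_dict['id']
        self.object = response_dict['object']
        # wrap each raw choice dict in a typed accessor
        self.choices = [SonicResponse.Choices(c)
                        for c in response_dict.get('choices', [])]

    def json(self) -> dict:
        # the untouched payload stays available for anything not wrapped
        return self.response_dict


resp = SonicResponse({'id': 'cmpl-1', 'object': 'chat.completion',
                      'choices': [{'text': 'hi there'}]})
print(resp.choices[0].text)  # hi there
```

This pattern gives callers attribute access for the common fields without locking them out of any extra keys the API may return.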