AiPy, Python Use, AI Freedom Me!
What is AiPy? It's an LLM + Python program that can operate and control everything.


The world will never be the same with AiPy


Download & Installation
Extract the package and it runs and works right away. Run the update.bat file in the extracted package to upgrade to the latest version.
Learn the magic of AiPy through use cases
AI Travel Planning
AI Game Development
AI Document Processing
AI Life Assistant
User Reviews












Why AiPy Is So Different
FAQ
Q: What is AiPy?
Q: What is the difference between AiPy and current AI products?
Q: Is AiPy a new large model, or a wrapper around an existing one?
Q: What is the AiPy paradigm and what is Python-Use? Is it an Agent-like product? How is it different from Manus, MCP, etc.?
A: AiPy is a product we developed based on a new paradigm, Python-Use, to enable more universal and rapid utilization of large models for various tasks.
The traditional classic paradigm for large model AI Agents involves developing a large number of tool agents and then relying on their collaboration to accomplish various tasks. This approach depends on the development, deployment, and installation of more and more agents. However, from a certain perspective, developing and deploying more agents actually limits the full potential of large models. In contrast, AiPy (Python-Use), a new paradigm, takes a different approach: it's a way of "enabling AI to use Python and Python to use AI." This means that the large model understands and breaks down user tasks, then achieves automatic coding and code execution through API Calling and Packages Calling. It can also continuously improve and iterate through a feedback mechanism, ultimately enabling AI to interact with the environment and complete tasks.
Therefore, we propose the concept: "The real general AI Agent is NO Agents!" AiPy (Python-Use) implements the new paradigm of "No Agents, Code is Agent." Python uses data, Python uses computers, Python uses the network, Python uses the Internet of Things, Python uses everything, ultimately achieving true AI Think Do!
The specific code related to Python-Use has already been open-sourced: https://github.com/knownsec/aipyapp
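For readers who want a concrete picture, here is a minimal sketch of the plan, code, execute, feedback loop described above. It is an illustration only, not the actual aipyapp implementation; llm_complete() is a hypothetical stand-in for whatever LLM completion API you use.

```python
# Minimal, illustrative sketch of the Python-Use loop described above.
# This is NOT the actual aipyapp implementation; llm_complete() is a
# hypothetical stand-in for whatever LLM completion API you use.
import traceback

def llm_complete(messages: list[dict]) -> str:
    """Return Python source code produced by the model (provider-specific)."""
    raise NotImplementedError("plug in your LLM provider here")

def python_use(task: str, max_rounds: int = 3) -> None:
    messages = [
        {"role": "system",
         "content": "Plan the task, then reply with runnable Python code only."},
        {"role": "user", "content": task},
    ]
    for _ in range(max_rounds):
        code = llm_complete(messages)              # the model writes the code ("Code is Agent")
        try:
            exec(code, {"__name__": "__main__"})   # executed locally; data never leaves the machine
            break
        except Exception:
            # Feed the traceback back so the model can revise and try again.
            messages.append({"role": "assistant", "content": code})
            messages.append({"role": "user", "content": traceback.format_exc()})
```

The essential point is that the generated code runs in the local environment, so it can use any Python package, file, device, or network API directly, and errors flow back to the model as feedback for the next round.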
We believe this concept is the biggest difference between AiPy and Manus, MCP, and similar products. For users:
The biggest difference between AiPy and Manus is that AiPy itself is open-source and free. Users only need to bear the cost of tokens for calling APIs of large models (of course, you can also use free large models). Because it doesn't require the invocation of numerous agents, AiPy also consumes relatively fewer tokens for the same task. Another major advantage is that AiPy supports local deployment, eliminating the need for users to upload their sensitive data and documents to the cloud. This is because AiPy is only responsible for the corresponding code generation for the task, and all data processing is done locally, offering a secure and reliable advantage for handling very large files and sensitive data.
The biggest advantage of AiPy compared to MCP Server is that users don't need to rely on various custom-developed MCP Servers for different services, nor do they need to deploy, install, or use them. They also don't need to worry about the security risks posed by unreliable MCP Server providers. AiPy can achieve the invocation of various APIs and accomplish diverse functions through real-time coding. You can see the examples shown above or experience the power of AiPy for yourselves.
In summary, AiPy offers multiple deployment options, is no longer limited by the various restrictions of cloud-based hosts, and doesn't require the development, downloading, installation, or complex configuration of various tools. All you need to do is converse with the large model.
Q: Which large models does AiPy support? Does it support calling local models? Which models are recommended?
A: In theory, AiPy can call any general-purpose large model; you only need to set the model's API and model information in the configuration file. For local models, we currently support calls through the Ollama and LMStudio APIs.
Because of the Python-Use paradigm, much of AiPy's capability depends on the model itself: the stronger a model's coding and overall abilities, the better it performs on tasks. The token cost of API calls also matters, and on a cost-effectiveness basis we recommend DeepSeek; in our testing it completes most tasks at very little cost.
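As a rough illustration of what "set the API and model information" means in practice (this is not AiPy's own configuration format; see the aipyapp README for the exact config file schema), the sketch below reaches DeepSeek and a local Ollama model through their OpenAI-compatible endpoints:

```python
# Illustrative only: both DeepSeek (cloud) and Ollama (local) expose
# OpenAI-compatible endpoints, so the same client code works for either.
from openai import OpenAI

deepseek = OpenAI(base_url="https://api.deepseek.com", api_key="sk-...")  # your DeepSeek key
ollama = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")   # local server; key is ignored

for client, model in [(deepseek, "deepseek-chat"), (ollama, "qwen2.5-coder")]:
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user",
                   "content": "Write Python that lists the five largest files in the current directory."}],
    )
    print(reply.choices[0].message.content)
```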
Q: What can AiPy do at the moment?
Q: Can AiPy call the APIs of other products and services? How is this implemented? Does it support local private APIs?
A: Yes. AiPy supports calling the APIs of many Internet services, including search, maps and trip planning, social media, weather, and more. These can be built in, or you can supply the corresponding API key when the code is generated. API calls are handled by a feature we call "API Calling": the model chooses which API to call based on its understanding of the task, and you can also name the API explicitly in the task prompt.
A locally deployed AiPy also supports calling local, private APIs; you only need to add the corresponding API description and address to the configuration file.
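As a purely hypothetical illustration of that flow, once a private API's description and address are in the configuration, the generated code might look something like this (the endpoint, path, and fields are invented for the example):

```python
# Hypothetical example: calling an internal inventory API whose description
# and address were provided in the AiPy configuration. Nothing here is a
# real aipyapp interface; it only shows the kind of code the model generates.
import requests

BASE_URL = "http://192.168.1.20:8080"  # address taken from the API description in the config

def query_inventory(sku: str) -> dict:
    """Query the (hypothetical) internal inventory service for one SKU."""
    resp = requests.get(f"{BASE_URL}/api/v1/inventory", params={"sku": sku}, timeout=10)
    resp.raise_for_status()
    return resp.json()

print(query_inventory("A-1001"))
```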