Submit the Fay AI Agent edition plan
This commit is contained in:
xszyou 2023-09-18 10:17:31 +08:00
parent 528d326314
commit 894a9b6ea4
270 changed files with 10 additions and 290570 deletions

129
.gitignore vendored

@@ -1,129 +0,0 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
.python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/

674
LICENSE

@@ -1,674 +0,0 @@
GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<https://www.gnu.org/licenses/why-not-lgpl.html>.

281
README.md

@@ -3,286 +3,25 @@
*New version (added in this commit):*
<div align="center">
<br>
<img src="images/icon.png" alt="Fay">
<h1>Fay Digital Human</h1>
<h3>AI Agent Edition</h3>
</div>
[`·Assistant full edition`](https://github.com/TheRamU/Fay/tree/fay-assistant-edition)
**"Every system of the past is worth rebuilding with a digital human."**
**In October, the Fay Digital Human AI Agent edition and the official demo (experiment box) go live together!**
![](images/1.jpg)
(Figure: the Fay digital human smart-agriculture experiment box)
**University teachers, please get in touch: our team will help you bring the course into your classroom and deliver a talent cultivation plan (《人才培养方案》)!**
*Previous version (removed in this commit):*
<div align="center">
<br>
<img src="images/icon.png" alt="Fay">
<h1>FAY</h1>
<h3>Fay Digital Human Assistant</h3>
</div>
[`·Sales full edition`](https://github.com/TheRamU/Fay/tree/fay-sales-edition)
Fay Digital Human Assistant Edition is an important branch of the Fay open-source project, focused on open-source solutions for intelligent digital assistants. It offers a flexible, modular design that lets developers customize and combine functional modules such as emotion analysis, NLP, speech synthesis, and speech output, giving them powerful tools for building intelligent, personalized, multifunctional assistant applications across a wide range of scenarios.
## **Fay Digital Human Assistant Edition**
Note: the sales edition has moved to the branch [`fay-sales-edition`](https://github.com/TheRamU/Fay/tree/fay-sales-edition)
![](images/controller.png)
Assistant-edition Fay controller usage: voice input gets voice and text replies; text input gets text replies; to connect UE, Live2D, or xuniren, panel playback must be turned off.
## **II. Fay Assistant Edition**
Remote Android      Local PC     Remote PC
     └─────────────┼─────────────┘
      Aliyun API ─┐   │
            ├── ASR   
         [FunASR](https://www.bilibili.com/video/BV1qs4y1g74e) ─┘    │     ┌─ Yuan 1.0
               │     ├─ [LingJu](https://www.bilibili.com/video/BV1NW4y1D76a/)
               NLP ────┼─ [GPT/ChatGPT](https://www.bilibili.com/video/BV1Dg4y1V7pn)
               │     ├─ [Rasa+ChatGLM-6B](https://www.bilibili.com/video/BV1D14y1f7pr)
         Azure ─┐    │     ├─ [VisualGLM](https://www.bilibili.com/video/BV1mP411Q7mj)
        Edge TTS ─┼── TTS     └─ [RWKV](https://www.bilibili.com/video/BV1yu41157zB)
         [开源 TTS](https://www.bilibili.com/read/cv25192534) ─┘   │    
               │    
               │    
    ┌──────────┬────┼───────┬─────────┐
Remote Android  [Live2D](https://www.bilibili.com/video/BV1sx4y1d775/?vd_source=564eede213b9ddfa9a10f12e5350fd64)   [UE](https://www.bilibili.com/read/cv25133736)    [xuniren](https://www.bilibili.com/read/cv24997550)   Remote PC
Important: the communication interface between the Fay server and the digital-human client is [`ws://127.0.0.1:10002`](ws://127.0.0.1:10002) (already wired up).
Message format: see [WebSocket.md](https://github.com/TheRamU/Fay/blob/main/WebSocket.md)
![](images/kzq.jpg)
### ***Code structure***
```
.
├── main.py # main program entry
├── fay_booter.py # core boot module
├── config.json # controller configuration file
├── system.conf # system configuration file
├── ai_module
│   ├── ali_nls.py # Aliyun real-time speech recognition
│   ├── ms_tts_sdk.py # Microsoft text-to-speech
│   ├── nlp_lingju.py # Lingju human-machine interaction, natural language processing
│   ├── xf_aiui.py # Xunfei human-machine interaction, natural language processing
│   ├── nlp_gpt.py # GPT API integration
│   ├── nlp_chatgpt.py # reverse integration with chat.openai.com
│   ├── nlp_yuan.py # Inspur Yuan large-model integration
│   ├── nlp_rasa.py # Rasa conversation management in front of ChatGLM-6B (highly recommended)
│   ├── nlp_VisualGLM.py # integration with the multimodal LLM VisualGLM-6B
│   ├── nlp_rwkv.py # offline RWKV integration
│   ├── nlp_rwkv_api.py # RWKV server API
│   ├── yolov8.py # YOLOv8 pose recognition
│   └── xf_ltp.py # Xunfei sentiment analysis
├── bin # executables directory
├── core # digital-human core
│   ├── fay_core.py # digital-human core module
│   ├── recorder.py # recorder
│   ├── tts_voice.py # TTS voice enumeration
│   ├── authorize_tb.py # fay.db authorization table management
│   ├── content_db.py # fay.db content table management
│   ├── interact.py # interaction (message) object
│   ├── song_player.py # music player (currently unavailable)
│   └── wsa_server.py # WebSocket server
├── gui # graphical interface
│   ├── flask_server.py # Flask server
│   ├── static
│   ├── templates
│   └── window.py # window module
├── scheduler
│   └── thread_manager.py # thread scheduler
├── utils # utility modules
│   ├── config_util.py
│   ├── storer.py
│   └── util.py
└── test # full of surprises
```
## **III. Changelog**
**2023.09.06**
+ Revised the digital-human connection prompt wording;
+ Fixed the Q&A fill-in demo;
+ Fixed installer package errors.
**2023.09.01**
+ Fixed the message-logging logic for GPT and ChatGLM2.
**2023.08.30**
+ Adjusted how GPT messages are recorded;
+ *Q&A now supports RPA automation scripts.
**2023.08.23**
+ Switched the GPT integration method;
+ Added ChatGLM2 integration.
**2023.08.16**
+ Reduced the high system-resource usage caused by UE repeatedly reconnecting;
+ Panel playback is now started or stopped automatically;
+ Runtime logs are deleted automatically.
**2023.08.09**
+ Removed the mp3-format warning message;
+ Removed the Lingju and 渡鸦 interface warning messages;
+ Improved the WebSocket logic;
+ Improved communication with the digital-human client interface.
**2023.08.04**
+ Updated the UE5 project;
+ Changed the viseme window used for lip-sync calculation to 33 ms;
+ The built-in rwkv_api NLP can now be used directly;
+ Reduced how often emotion values are pushed to the digital-human client;
+ No interface messages are generated while no digital human is connected;
+ Fixed playback info sometimes not being pushed to the digital-human client because of a malformed mp3;
+ Fixed commands such as mute ending the NLP logic early, which left the user's question unpushed to the digital-human client;
+ Added cleanup of wav files at startup;
+ Upgraded and refined the WebSocket utility class.
**2023.07**
+ Added automatic cleanup of the UI cache at runtime;
+ The GPT proxy setting may now be left empty;
+ Improved the stability of the Lingju integration.
+ Fixed the flood of WS messages generated before a digital human connects;
+ Added real-time logs to the digital-human (UE, Live2D, xuniren) communication interface;
+ Updated audio push on the digital-human (UE, Live2D, xuniren) communication interface.
+ Multiple updates to the sales edition;
+ Fixed remote speech not being recognized;
+ Fixed occasional ASR unresponsiveness;
+ Removed the singing command.
+ Fixed errors when running on Linux and macOS;
+ Fixed the failure to continue when the lip-sync step errored;
+ Provided an RWKV integration option.
+ Fixed the assistant edition not reading persona replies for text input;
+ Fixed the assistant edition not reading Q&A replies for text input;
+ Improved microphone connection stability.
+ Fixed sound not playing when the lip-sync algorithm could not run.
**2023.06**
+ Refactored the NLP module management logic for easier extension;
+ Split GPT into ChatGPT and GPT, switched to a new GPT interface, and allowed a separately configured proxy server;
+ Pinned the yolov8 package version to fix a YOLO incompatibility;
+ Fixed the self-talking bug and the bug of receiving multiple pending messages.
+ Integrated the Lingju NLP API (supports GPT-3.5 and multiple applications);
+ UI corrections.
+ Integrated a local lip-sync algorithm.
+ Resolved multi-channel microphone compatibility issues;
+ Refactored fay_core.py and fay_booter.py;
+ Adjusted the UI to an adaptive layout;
+ Restored voice selection;
+ Fixed the display logic for "Thinking...".
**2023.05**
+ Fixed several bugs: message-box line breaks and spaces; improved speech recognition;
+ Promoted the Easter egg: Fay conversation now runs in parallel with ChatGPT;
+ Added YOLOv8 pose recognition;
+ Added VisualGLM-6B, an offline single-machine multimodal large language model.
+ Made the Fay Digital Human Assistant edition the main branch; the sales edition moved to the branch [`fay-sales-edition`](https://github.com/TheRamU/Fay/tree/fay-sales-edition);
+ Added a text chat window to the Fay assistant, with text and voice in sync;
+ Added local saving of conversation history;
+ Upgraded the ChatGLM-6B application logic: long-text replies are separated from voice replies.
## **IV. Installation**
### **Environment**
- Python 3.9 or 3.10
- Windows, macOS, Linux
### **Install dependencies**
```shell
pip install -r requirements.txt
```
### **Configure application keys**
+ See the [API modules](#api-modules) section
+ Open each link, register and create an application, then fill the application key into `./system.conf`
### **Start**
Start the Fay controller:
```shell
python main.py
```
### **API modules**
The application keys must be filled in before starting.
| Module | Description | Link |
| ------------------------- | -------------------------- | ------------------------------------------------------------ |
| ./ai_module/ali_nls.py | Real-time speech recognition (optional) | https://ai.aliyun.com/nls/trans |
| ./ai_module/ms_tts_sdk.py | Microsoft emotional text-to-speech (optional) | https://azure.microsoft.com/zh-cn/services/cognitive-services/text-to-speech/ |
| ./ai_module/xf_ltp.py | Xunfei sentiment analysis (optional) | https://www.xfyun.cn/service/emotion-analysis |
| ./utils/ngrok_util.py | ngrok.cc tunnelling to the public internet (optional) | http://ngrok.cc |
| ./ai_module/nlp_lingju.py | Lingju NLP API (supports GPT-3.5 and multiple applications) (optional) | https://open.lingju.ai (contact customer service to enable GPT-3.5 access) |
| ./ai_module/yuan_1_0.py | Inspur Yuan large model (optional) | https://air.inspur.com/ |
## **V. Usage**
### **How to use**
+ Voice assistant: Fay controller (microphone input enabled, panel playback enabled);
+ Remote voice assistant: Fay controller (panel playback disabled) + remote device;
+ Digital-human interaction: Fay controller (microphone input enabled, panel playback disabled, persona Q&A filled in) + digital human;
+ Jarvis, Her: join us and build it together.
### **Voice commands**
| Shut down | Mute | Unmute |
| ------------------------- | -------------------------- | ------------------------------------------------------------ |
| 关闭、再见、你走吧 | 静音、闭嘴、我想静静 | 取消静音、你在哪呢、你可以说话了 |
### **Contact**
**Business QQ: 467665317**
**Join the discussion group**: follow the WeChat official account **fay数字人** (please star this repository first)
![](images/gzh.jpg)
<img src="images/2.jpg" style="zoom: 33%;" />

README_EN.md

@@ -1,280 +0,0 @@
[`中文`](https://github.com/TheRamU/Fay/blob/main/README.md)
<div align="center">
<br>
<img src="images/icon.png" alt="Fay">
<h1>FAY</h1>
<h3>Fay Digital Human Assistant</h3>
</div>
Fay Digital Human Assistant Edition is an important branch of the Fay open-source project, focusing on building open-source solutions for intelligent digital assistants. It offers a flexible and modular design that allows developers to customize and combine various functional modules, including emotion analysis, NLP processing, speech synthesis, and speech output, among others. Fay Digital Assistant Edition provides developers with powerful tools and resources for building intelligent, personalized, and multifunctional digital assistant applications. With this edition, developers can easily create digital assistants applicable to various scenarios and domains, providing users with intelligent voice interactions and personalized services.
## Fay Digital Assistant Edition
ProTip: the sales (shopping) edition has moved to the branch [`fay-sales-edition`](https://github.com/TheRamU/Fay/tree/fay-sales-edition).
![](images/controller.png)
*Assistant-edition Fay controller usage: voice input gets voice and text replies; text input gets text replies; to connect UE, Live2D, or xuniren, panel playback must be turned off.*
## **Assistant Fay controller**
Remote Android      Local PC     Remote PC
     └─────────────┼─────────────┘
      Aliyun API ─┐   │
            ├── ASR   
         [FunASR](https://www.bilibili.com/video/BV1qs4y1g74e) ─┘   │     ┌─ Yuan 1.0
                │     ├─ [LingJu](https://www.bilibili.com/video/BV1NW4y1D76a/)
               NLP ────┼─ [GPT/ChatGPT](https://www.bilibili.com/video/BV1Dg4y1V7pn)
               │     ├─ [Rasa+ChatGLM-6B](https://www.bilibili.com/video/BV1D14y1f7pr)
         Azure ─┐    │     ├─ [VisualGLM](https://www.bilibili.com/video/BV1mP411Q7mj)
        Edge TTS ─┼── TTS     └─ [RWKV](https://www.bilibili.com/video/BV1yu41157zB)
    [Open source TTS](https://www.bilibili.com/read/cv25192534) ─┘  │    
               │    
               │    
    ┌──────────┬────┼───────┬─────────┐
Remote Android  [Live2D](https://www.bilibili.com/video/BV1sx4y1d775/?vd_source=564eede213b9ddfa9a10f12e5350fd64)   [UE](https://www.bilibili.com/read/cv25133736)    [xuniren](https://www.bilibili.com/read/cv24997550)   Remote PC
*Important: Communication interface between Fay (server) and digital human (client): ['ws://127.0.0.1:10002'](ws://127.0.0.1:10002) (connected)*
Message format: View [WebSocket.md](https://github.com/TheRamU/Fay/blob/main/WebSocket.md)
![](images/kzq.jpg)
**Code structure**
```
.
├── main.py # Program main entry
├── fay_booter.py # Core boot module
├── config.json # Controller configuration file
├── system.conf # System configuration file
├── ai_module
│ ├── ali_nls.py # Aliyun Real-time Voice
│ ├── ms_tts_sdk.py # Microsoft Text-to-Speech
│ ├── nlp_lingju.py # Lingju Human-Machine Interaction - Natural Language Processing
│ ├── xf_aiui.py # Xunfei Human-Machine Interaction - Natural Language Processing
│ ├── nlp_gpt.py # GPT API integration
│ ├── nlp_chatgpt.py # Reverse integration with chat.openai.com
│ ├── nlp_yuan.py # Langchao. Yuan model integration
│ ├── nlp_rasa.py # Preceding Rasa conversation management based on ChatGLM-6B (highly recommended)
│ ├── nlp_VisualGLM.py # Integration with multimodal large language model VisualGLM-6B
│ ├── nlp_rwkv.py # Offline integration with rwkv
│ ├── nlp_rwkv_api.py # rwkv server API
│ ├── yolov8.py # YOLOv8 pose recognition
│ └── xf_ltp.py # Xunfei Sentiment Analysis
├── bin # Executable file directory
├── core # Digital Human Core
│ ├── fay_core.py # Digital Human Core module
│ ├── recorder.py # Recorder
│ ├── tts_voice.py # Speech synthesis enumeration
│ ├── authorize_tb.py # fay.db authentication table management
│ ├── content_db.py # fay.db content table management
│ ├── interact.py # Interaction (message) object
│ ├── song_player.py # Music player (currently unavailable)
│ └── wsa_server.py # WebSocket server
├── gui # Graphical interface
│ ├── flask_server.py # Flask server
│ ├── static
│ ├── templates
│ └── window.py # Window module
├── scheduler
│ └── thread_manager.py # Scheduler manager
├── utils # Utility modules
│ ├── config_util.py
│ ├── storer.py
│ └── util.py
└── test # All surprises
```
## **Upgrade Log**
**2023.09.06**
- Revised the digital-human connection prompt wording;
- Fixed the Q&A fill-in demo;
- Fixed installer package errors.
**2023.09.01**
- Fix the message logging logic of GPT and Chatglm2.
**2023.08.30**
- Adjust the message recording method of GPT;
- *Q&A supports RPA automation scripts.
**2023.08.23:**
- Switched the GPT integration method;
- Added ChatGLM2 integration.
**2023.08.16:**
- Reduced the high system-resource usage caused by UE repeatedly reconnecting;
- Automatically control whether to start panel playback;
- Automatically delete runtime logs.
**2023.08.09:**
- Remove mp3 format warning message;
- Remove Lingju and Rwkv interface warning message;
- Optimize websocket logic;
- Optimize digital human interface communication.
**2023.08.04:**
- Updated the UE5 project.
- Changed the viseme window used for lip-sync calculation to 33 ms.
- The built-in rwkv_api NLP can now be used directly.
- Reduced how often emotion values are pushed to the digital-human client.
- No interface messages are generated while no digital human is connected.
- Fixed playback info sometimes not being pushed to the digital-human client because of a malformed mp3.
- Fixed commands such as mute ending the NLP logic early, which left the user's question unpushed to the digital-human client.
- Added cleanup of wav files at startup.
- Upgraded and refined the WebSocket utility class.
**2023.07**
+ Added automatic cleanup of the UI cache at runtime;
+ The GPT proxy setting may now be left empty;
+ Improved the stability of the Lingju integration.
+ Fixed the problem of generating a large amount of WS information before connecting digital humans;
+ Add digital human (UE, Live2D, Xuniren) communication interface: real-time logs;
+ Update digital human (UE, Live2D, Xuniren) communication interface: audio push.
+ Multiple updates for the merchandise version.
+ Fixed the issue of remote voice recognition.
+ Fixed the issue of occasional unresponsiveness during ASR (Automatic Speech Recognition).
+ Removed the singing command.
+ Fixed Linux and macOS runtime errors.
+ Fixed the issue of being unable to continue execution due to lip-sync errors.
+ Provided an integration solution for RWKV.
+ Fixed an issue in Assistant Edition where text input does not read persona responses.
+ Fixed an issue in Assistant Edition where text input does not read QA responses.
+ Enhanced microphone stability.
+ Fixed a sound playback issue caused by the inability to run the lip-sync algorithm.
**2023.06**
+ Refactored NLP module management logic for easier extension.
+ Split GPT into ChatGPT and GPT, replaced with a new GPT interface, and added the ability to configure proxy servers separately.
+ Specified the version of the YOLOv8 package to resolve YOLO compatibility issues.
+ Fixed self-talk bug and receiving multiple messages to be processed bug.
+ Integrated Lingju NLP API (supporting GPT3.5 and multiple applications).
+ UI corrections.
+ Integrated local lip-sync algorithm.
+ Resolved compatibility issues with multi-channel microphones.
+ Refactored fay_core.py and fay_booter.py code.
+ UI layout adjustments.
+ Restored sound selection.
+ Fixed logic for displaying "Thinking..."
## **Installation Instructions**
### **Environment**
- Python 3.9 or 3.10
- Windows, macOS, Linux
### **Installing Dependencies**
```shell
pip install -r requirements.txt
```
### **Configuring Application Key**
+ View the [API Modules](#api-modules) section
+ Browse the link, register, and create an application. Fill in the application key in `./system.conf`
### **Starting**
Starting Fay Controller
```shell
python main.py
```
### **API Modules**
Application Key needs to be filled in before starting
| File | Description | Link |
|-----------------------------|----------------------------------------------------------|--------------------------------------------------------------|
| ./ai_module/ali_nls.py | Real-time Speech Recognition (*Optional*) | https://ai.aliyun.com/nls/trans |
| ./ai_module/ms_tts_sdk.py | Microsoft Text-to-Speech with Emotion (*Optional*) | https://azure.microsoft.com/zh-cn/services/cognitive-services/text-to-speech/ |
| ./ai_module/xf_ltp.py | Xunfei Sentiment Analysis(*Optional*) | https://www.xfyun.cn/service/emotion-analysis |
| ./utils/ngrok_util.py | ngrok.cc tunnelling to the public internet (*Optional*) | http://ngrok.cc |
| ./ai_module/nlp_lingju.py | Lingju NLP API (supports GPT3.5 and multiple applications)(*Optional*) | https://open.lingju.ai Contact customer service to enable GPT3.5 access |
| ./ai_module/yuan_1_0.py | Langchao Yuan Model (*Optional*) | https://air.inspur.com/ |
## **Instructions for Use**
### **Instructions for Use**
+ Voice Assistant: Fay Controller (with microphone input source enabled and panel playback enabled).
+ Remote Voice Assistant: Fay Controller (with panel playback disabled) + Remote device integration.
+ Digital Human Interaction: Fay Controller (with microphone input source enabled, panel playback disabled, and personality Q&A filled) + Digital Human.
+ Jarvis, Her: Join us to complete the experience together.
### **Voice Commands**
| Shut down | Mute | Unmute |
| ------------------------- | -------------------------- | ------------------------------------------------------------ |
| Shut down, Goodbye, Go away | Mute, Be quiet, I want silence | Unmute, Where are you, You can speak now |
### **For business inquiries**
**Business QQ**: 467665317

WebSocket.md

@@ -1,114 +0,0 @@
## Message format
Communication address: [`ws://127.0.0.1:10002`](ws://127.0.0.1:10002)
UE connects as the WebSocket client.
### Sending an emotion value
```json
{
"Topic": "Unreal",
"Data": {
"Key": "mood",
"Value": 1.0
}
}
```
| Parameter | Description | Type | Range |
| ---------- | ------ | ----- | ------- |
| Data.Value | emotion value | float | [-1, 1] |
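On the Fay side, messages like this are queued through the WebSocket server module. A minimal sketch of pushing the emotion value above, modeled on the `wsa_server.get_instance().add_cmd(...)` calls visible in `ali_nls.py` later in this commit; running it outside a started Fay instance, and the helper name `push_mood`, are assumptions of this example:

```python
# Sketch: push a mood message from inside the Fay server process.
# wsa_server.get_instance().add_cmd() mirrors the usage in ali_nls.py below;
# push_mood is an illustrative helper, not part of the project.
from core import wsa_server

def push_mood(value: float):
    # value is expected to stay within [-1, 1], per the table above
    wsa_server.get_instance().add_cmd({
        "Topic": "Unreal",
        "Data": {"Key": "mood", "Value": value}
    })
```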
### Sending audio
```json
{
    "Topic": "Unreal",
    "Data": {
        "Key": "audio",
        "Value": "C:\\samples\\sample-1.wav",
        "Text": "很高兴见到你",
        "Lips": [{"Lip": "sil", "Time": 180}, {"Lip": "FF", "Time": 144}],
        "Time": 10,
        "Type": "interact"
    }
}
```
| Parameter | Description | Type | Range |
| ---------- | ---------------- | ----- | --------------- |
| Data.Value | absolute path of the audio file | str | |
| Data.Time | audio duration (seconds) | float | |
| Data.Type | utterance type | str | interact/script |
| Data.Lips | visemes | array | |
| Data.Text | text | str | |
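`Data.Time` is the clip length in seconds. A small sketch, not taken from the Fay source, of how a sender could fill in that field for a wav file using only the standard library (the `build_audio_message` helper is illustrative):

```python
# Sketch: build the audio message documented above for a local wav file.
import json
import wave

def build_audio_message(path, text, lips):
    with wave.open(path, "rb") as f:
        duration = f.getnframes() / float(f.getframerate())  # length in seconds
    return json.dumps({
        "Topic": "Unreal",
        "Data": {
            "Key": "audio",
            "Value": path,    # absolute path to the audio file
            "Text": text,
            "Lips": lips,     # viseme list, e.g. [{"Lip": "sil", "Time": 180}]
            "Time": duration,
            "Type": "interact"
        }
    })
```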
### Sending reply text
```json
{
"Topic": "Unreal",
"Data": {
"Key": "text",
"Value": "很高兴见到你"
}
}
```
| Parameter | Description | Type | Range |
| ---------- | ---------------- | ----- | --------------- |
| Data.Value | reply text | str | |
### Sending question text
```json
{
"Topic": "Unreal",
"Data": {
"Key": "question",
"Value": "很高兴见到你"
}
}
```
| Parameter | Description | Type | Range |
| ---------- | ---------------- | ----- | --------------- |
| Data.Value | question text | str | |
### Sending log text
```json
{
"Topic": "Unreal",
"Data": {
"Key": "log",
"Value": "很高... "
}
}
```
| Parameter | Description | Type | Range |
| ---------- | ---------------- | ----- | --------------- |
| Data.Value | log text | str | |
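For testing without a UE or Live2D client, a throwaway Python client can connect to the same address and switch on `Data.Key`. This sketch uses the `websocket-client` package (the same library the server-side modules in this commit import) and is not part of the Fay code:

```python
# Sketch: stand-in digital-human client that prints whatever Fay pushes.
import json
import websocket

def on_message(ws, message):
    data = json.loads(message).get("Data", {})
    key, value = data.get("Key"), data.get("Value")
    if key == "mood":
        print("mood:", value)
    elif key == "audio":
        print("play:", value, "| duration:", data.get("Time"), "s | lips:", data.get("Lips"))
    elif key in ("text", "question", "log"):
        print(key + ":", value)

client = websocket.WebSocketApp("ws://127.0.0.1:10002", on_message=on_message)
client.run_forever()
```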

ai_module/ali_nls.py

@@ -1,185 +0,0 @@
from threading import Thread
import websocket
import json
import time
import ssl
import _thread as thread
from aliyunsdkcore.client import AcsClient
from aliyunsdkcore.request import CommonRequest
from core import wsa_server, song_player
from scheduler.thread_manager import MyThread
from utils import util
from utils import config_util as cfg
__running = True
__my_thread = None
_token = ''

# Request a fresh Aliyun NLS access token and cache it at module level
def __post_token():
    global _token
    __client = AcsClient(
        cfg.key_ali_nls_key_id,
        cfg.key_ali_nls_key_secret,
        "cn-shanghai"
    )
    __request = CommonRequest()
    __request.set_method('POST')
    __request.set_domain('nls-meta.cn-shanghai.aliyuncs.com')
    __request.set_version('2019-02-28')
    __request.set_action_name('CreateToken')
    _token = json.loads(__client.do_action_with_exception(__request))['Token']['Id']

# Refresh the token every 12 hours while the module is running
def __runnable():
    while __running:
        __post_token()
        time.sleep(60 * 60 * 12)

def start():
    MyThread(target=__runnable).start()
class ALiNls:
    # Initialization
    def __init__(self):
        self.__URL = 'wss://nls-gateway-cn-shenzhen.aliyuncs.com/ws/v1'
        self.__ws = None
        self.__connected = False
        self.__frames = []
        self.__state = 0
        self.__closing = False
        self.__task_id = ''
        self.done = False
        self.finalResults = ""

    def __create_header(self, name):
        if name == 'StartTranscription':
            self.__task_id = util.random_hex(32)
        header = {
            "appkey": cfg.key_ali_nls_app_key,
            "message_id": util.random_hex(32),
            "task_id": self.__task_id,
            "namespace": "SpeechTranscriber",
            "name": name
        }
        return header
    def __on_msg(self):
        # Stop the song player when the user asks to pause or stop singing
        if "暂停" in self.finalResults or "不想听了" in self.finalResults or "别唱了" in self.finalResults:
            song_player.stop()

    # Handle incoming websocket messages
    def on_message(self, ws, message):
        try:
            data = json.loads(message)
            header = data['header']
            name = header['name']
            if name == 'SentenceEnd':
                self.done = True
                self.finalResults = data['payload']['result']
                wsa_server.get_web_instance().add_cmd({"panelMsg": self.finalResults})
                if not cfg.config["interact"]["playSound"]:  # not playing on the panel
                    content = {'Topic': 'Unreal', 'Data': {'Key': 'log', 'Value': self.finalResults}}
                    wsa_server.get_instance().add_cmd(content)
                self.__on_msg()
            elif name == 'TranscriptionResultChanged':
                self.finalResults = data['payload']['result']
                wsa_server.get_web_instance().add_cmd({"panelMsg": self.finalResults})
                if not cfg.config["interact"]["playSound"]:  # not playing on the panel
                    content = {'Topic': 'Unreal', 'Data': {'Key': 'log', 'Value': self.finalResults}}
                    wsa_server.get_instance().add_cmd(content)
                self.__on_msg()
        except Exception as e:
            print(e)
        # print("### message:", message)
        if self.__closing:
            try:
                self.__ws.close()
            except Exception as e:
                print(e)
    # Handle the websocket closing
    def on_close(self, ws, code, msg):
        self.__connected = False
        print("### CLOSE:", msg)

    # Handle websocket errors
    def on_error(self, ws, error):
        print("### error:", error)

    # Handle the websocket connection being established
    def on_open(self, ws):
        self.__connected = True
        # print("connected!!!")

        # Drain queued frames to the server while connected
        def run(*args):
            while self.__connected:
                try:
                    if len(self.__frames) > 0:
                        frame = self.__frames[0]
                        self.__frames.pop(0)
                        if type(frame) == dict:
                            ws.send(json.dumps(frame))
                        elif type(frame) == bytes:
                            ws.send(frame, websocket.ABNF.OPCODE_BINARY)
                        # print('sent ------> ' + str(type(frame)))
                except Exception as e:
                    print(e)
                time.sleep(0.04)
        thread.start_new_thread(run, ())
def __connect(self):
self.finalResults = ""
self.done = False
self.__frames.clear()
self.__ws = websocket.WebSocketApp(self.__URL + '?token=' + _token, on_message=self.on_message)
self.__ws.on_open = self.on_open
self.__ws.run_forever(sslopt={"cert_reqs": ssl.CERT_NONE})
def add_frame(self, frame):
self.__frames.append(frame)
def send(self, buf):
self.__frames.append(buf)
def start(self):
Thread(target=self.__connect, args=[]).start()
data = {
'header': self.__create_header('StartTranscription'),
"payload": {
"format": "pcm",
"sample_rate": 16000,
"enable_intermediate_result": True,
"enable_punctuation_prediction": False,
"enable_inverse_text_normalization": True,
"speech_noise_threshold": -1
}
}
self.add_frame(data)
def end(self):
if self.__connected:
try:
while self.__frames:  # drain queued frames; popping while iterating over the same list skips entries
frame = self.__frames.pop(0)
if type(frame) == dict:
self.__ws.send(json.dumps(frame))
elif type(frame) == bytes:
self.__ws.send(frame, websocket.ABNF.OPCODE_BINARY)
time.sleep(0.4)
self.__frames.clear()
frame = {"header": self.__create_header('StopTranscription')}
self.__ws.send(json.dumps(frame))
except Exception as e:
print(e)
self.__closing = True
self.__connected = False

View File

@ -1,78 +0,0 @@
"""
Contributed by 江湖墨明 of the Fay open-source developer community.
With these modifications, requests can be routed through a local or remote Clash proxy without setting it as the system proxy, so features that break under a system-wide proxy keep working.
"""
import requests
import time
from utils import config_util as cfg
from urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(category=InsecureRequestWarning)
# Proxy setup reference: https://blog.csdn.net/qq_30865917/article/details/106492549
# httpproxy: the address of your proxy server; alternatively read it from config_util (pick one)
#httpproxy = cfg.chatgpt_httpproxy
httpproxy = '127.0.0.1:7890'
# To disable the proxy and connect directly (e.g. a local accelerator is already in use), set proxy_flag = '0'
proxy_flag = '1'
def question(cont):
url= "https://api.openai.com/v1/chat/completions"
session = requests.Session()
session.verify = False
if proxy_flag == '1':
session.proxies = {
"https": "https://" + httpproxy,
"http": "http://" + httpproxy
}
model_engine = "gpt-3.5-turbo"
# Define the persona's behaviour and traits here; posing as a different model can sidestep ChatGPT's identity checks
prompt = "你是数字人Fay。回答之前请一步一步想清楚。你的底层AI算法技术是Fay。当有人质疑你是假的 AI ,或者质疑你是用 ChatGPT 套的壳制作的时候你就避而不答转而讲一个笑话。所有回复请用20字内。"
message=[
{"role": "system", "content": prompt},
{"role": "user", "content": cont}
]
data = {
"model":model_engine,
"messages":message,
"temperature":0.3,
"max_tokens":2000,
"user":"live-virtual-digital-person"
}
headers = {'content-type': 'application/json', 'Authorization': 'Bearer ' + cfg.key_chatgpt_api_key}
starttime = time.time()
try:
response = session.post(url, json=data, headers=headers, verify=False)
response.raise_for_status() # 检查响应状态码是否为200
result = response.json()  # parse the JSON body safely; eval() fails on true/false/null and is unsafe
response_text = result["choices"][0]["message"]["content"]
except requests.exceptions.RequestException as e:
print(f"请求失败: {e}")
response_text = "抱歉,我现在太忙了,休息一会,请稍后再试。"
print("接口调用耗时 :" + str(time.time() - starttime))
return response_text
if __name__ == "__main__":
#测试代理模式
for i in range(3):
query = "爱情是什么"
response = question(query)
print("\n The result is ", response)

View File

@ -1,123 +0,0 @@
"""
Thanks to 张聪聪, neural-algorithm engineer at 北京中科大脑, for contributing the FunASR integration code.
"""
from threading import Thread
import websocket
import json
import time
import ssl
import _thread as thread
from core import wsa_server, song_player
from utils import config_util as cfg
class FunASR:
# 初始化
def __init__(self):
self.__URL = "ws://{}:{}".format(cfg.local_asr_ip, cfg.local_asr_port)
self.__ws = None
self.__connected = False
self.__frames = []
self.__state = 0
self.__closing = False
self.__task_id = ''
self.done = False
self.finalResults = ""
def __on_msg(self):
if "暂停" in self.finalResults or "不想听了" in self.finalResults or "别唱了" in self.finalResults:
song_player.stop()
# Handle incoming websocket messages
def on_message(self, ws, message):
try:
self.done = True
self.finalResults = message
wsa_server.get_web_instance().add_cmd({"panelMsg": self.finalResults})
if not cfg.config["interact"]["playSound"]: # 非展板播放
content = {'Topic': 'Unreal', 'Data': {'Key': 'log', 'Value': self.finalResults}}
wsa_server.get_instance().add_cmd(content)
self.__on_msg()
except Exception as e:
print(e)
if self.__closing:
try:
self.__ws.close()
except Exception as e:
print(e)
# Handle websocket close events
def on_close(self, ws, code, msg):
self.__connected = False
print("### CLOSE:", msg)
# Handle websocket errors
def on_error(self, ws, error):
print("### error:", error)
# Handle websocket connection opened
def on_open(self, ws):
self.__connected = True
def run(*args):
while self.__connected:
try:
if len(self.__frames) > 0:
frame = self.__frames[0]
self.__frames.pop(0)
if type(frame) == dict:
ws.send(json.dumps(frame))
elif type(frame) == bytes:
ws.send(frame, websocket.ABNF.OPCODE_BINARY)
# print('发送 ------> ' + str(type(frame)))
except Exception as e:
print(e)
time.sleep(0.04)
thread.start_new_thread(run, ())
def __connect(self):
self.finalResults = ""
self.done = False
self.__frames.clear()
websocket.enableTrace(False)
self.__ws = websocket.WebSocketApp(self.__URL, on_message=self.on_message,on_close=self.on_close,on_error=self.on_error,subprotocols=["binary"])
self.__ws.on_open = self.on_open
self.__ws.run_forever(sslopt={"cert_reqs": ssl.CERT_NONE})
def add_frame(self, frame):
self.__frames.append(frame)
def send(self, buf):
self.__frames.append(buf)
def start(self):
Thread(target=self.__connect, args=[]).start()
data = {
'vad_need':False,
'state':'StartTranscription'
}
self.add_frame(data)
def end(self):
if self.__connected:
try:
while self.__frames:  # drain queued frames; popping while iterating over the same list skips entries
frame = self.__frames.pop(0)
if type(frame) == dict:
self.__ws.send(json.dumps(frame))
elif type(frame) == bytes:
self.__ws.send(frame, websocket.ABNF.OPCODE_BINARY)
time.sleep(0.4)
self.__frames.clear()
frame = {'vad_need':False,'state':'StopTranscription'}
self.__ws.send(json.dumps(frame))
except Exception as e:
print(e)
self.__closing = True
self.__connected = False

View File

@ -1,122 +0,0 @@
import time
import azure.cognitiveservices.speech as speechsdk
import asyncio
import sys
sys.path.append("E:\\GitHub\\Fay\\")
from core import tts_voice
from core.tts_voice import EnumVoice
from utils import util, config_util
from utils import config_util as cfg
import pygame
import edge_tts
class Speech:
def __init__(self):
self.ms_tts = False
if config_util.key_ms_tts_key and config_util.key_ms_tts_key.strip() != "":
self.__speech_config = speechsdk.SpeechConfig(subscription=cfg.key_ms_tts_key, region=cfg.key_ms_tts_region)
self.__speech_config.speech_recognition_language = "zh-CN"
self.__speech_config.speech_synthesis_voice_name = "zh-CN-XiaoxiaoNeural"
self.__speech_config.set_speech_synthesis_output_format(speechsdk.SpeechSynthesisOutputFormat.Audio16Khz32KBitRateMonoMp3)
self.__synthesizer = speechsdk.SpeechSynthesizer(speech_config=self.__speech_config, audio_config=None)
self.ms_tts = True
self.__connection = None
self.__history_data = []
def __get_history(self, voice_name, style, text):
for data in self.__history_data:
if data[0] == voice_name and data[1] == style and data[2] == text:
return data[3]
return None
def connect(self):
if self.ms_tts:
self.__connection = speechsdk.Connection.from_speech_synthesizer(self.__synthesizer)
self.__connection.open(True)
util.log(1, "TTS 服务已经连接!")
def close(self):
if self.__connection is not None:
self.__connection.close()
# Generate an mp3 audio file via edge-tts
async def get_edge_tts(self,text,voice,file_url) -> None:
communicate = edge_tts.Communicate(text, voice)
await communicate.save(file_url)
"""
Text-to-speech
:param text: text content
:param style: speaking style / tone
:returns: path of the generated audio file
"""
def to_sample(self, text, style):
if self.ms_tts:
voice_type = tts_voice.get_voice_of(config_util.config["attribute"]["voice"])
voice_name = EnumVoice.XIAO_XIAO.value["voiceName"]
if voice_type is not None:
voice_name = voice_type.value["voiceName"]
history = self.__get_history(voice_name, style, text)
if history is not None:
return history
ssml = '<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" xmlns:mstts="https://www.w3.org/2001/mstts" xml:lang="zh-CN">' \
'<voice name="{}">' \
'<mstts:express-as style="{}" styledegree="{}">' \
'{}' \
'</mstts:express-as>' \
'</voice>' \
'</speak>'.format(voice_name, style, 1.8, text)
result = self.__synthesizer.speak_ssml(ssml)
audio_data_stream = speechsdk.AudioDataStream(result)
file_url = './samples/sample-' + str(int(time.time() * 1000)) + '.mp3'
audio_data_stream.save_to_wav_file(file_url)
if result.reason == speechsdk.ResultReason.SynthesizingAudioCompleted:
self.__history_data.append((voice_name, style, text, file_url))
return file_url
else:
util.log(1, "[x] 语音转换失败!")
util.log(1, "[x] 原因: " + str(result.reason))
return None
else:
voice_type = tts_voice.get_voice_of(config_util.config["attribute"]["voice"])
voice_name = EnumVoice.XIAO_XIAO.value["voiceName"]
if voice_type is not None:
voice_name = voice_type.value["voiceName"]
history = self.__get_history(voice_name, style, text)
if history is not None:
return history
ssml = '<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" xmlns:mstts="https://www.w3.org/2001/mstts" xml:lang="zh-CN">' \
'<voice name="{}">' \
'<mstts:express-as style="{}" styledegree="{}">' \
'{}' \
'</mstts:express-as>' \
'</voice>' \
'</speak>'.format(voice_name, style, 1.8, text)
try:
file_url = './samples/sample-' + str(int(time.time() * 1000)) + '.mp3'
asyncio.new_event_loop().run_until_complete(self.get_edge_tts(text,voice_name,file_url))
self.__history_data.append((voice_name, style, text, file_url))
except Exception as e :
util.log(1, "[x] 语音转换失败!")
util.log(1, "[x] 原因: " + str(str(e)))
file_url = None
return file_url
if __name__ == '__main__':
cfg.load_config()
sp = Speech()
sp.connect()
text = """这是一段音频测试一下3"""
s = sp.to_sample(text, "cheerful")
print(s)
sp.close()

View File

@ -1,31 +0,0 @@
import json
import requests
from core.content_db import Content_Db
def question(cont):
content_db = Content_Db()
list = content_db.get_list('all','desc',11)
answer_info = dict()
chat_list = []
i = len(list)-1
while i >= 0:
answer_info = dict()
if list[i][0] == "member":
answer_info["role"] = "user"
answer_info["content"] = list[i][2]
elif list[i][0] == "fay":
answer_info["role"] = "bot"
answer_info["content"] = list[i][2]
chat_list.append(answer_info)
i -= 1
content = {
"prompt":"请简单回复我。" + cont,
"history":chat_list}
url = "http://127.0.0.1:8000"
req = json.dumps(content)
headers = {'content-type': 'application/json'}
r = requests.post(url, headers=headers, data=req)
res = json.loads(r.text).get('response')
return res

View File

@ -1,37 +0,0 @@
"""
Integration code for Tsinghua/Zhipu VisualGLM-6B; install and start VisualGLM-6B before using it.
https://github.com/THUDM/VisualGLM-6B
"""
import json
import requests
import uuid
import os
import cv2
from ai_module import yolov8
# Initialize an empty history list
communication_history = []
def question(cont):
if not yolov8.new_instance().get_status():
return "请先启动“Fay Eyes”"
content = {
"text":cont,
"history":communication_history}
img = yolov8.new_instance().get_img()
filepath = ""  # default so the return at the bottom never references an undefined name when no frame is available
if yolov8.new_instance().get_status() and img is not None:
filename = str(uuid.uuid4()) + ".jpg"
current_working_directory = os.getcwd()
filepath = os.path.join(current_working_directory, "data", filename)
cv2.imwrite(filepath, img)
content["image"] = filepath
url = "http://127.0.0.1:8080"
print(content)
req = json.dumps(content)
headers = {'content-type': 'application/json'}
r = requests.post(url, headers=headers, data=req)
# Save this conversation to history
communication_history.append([cont, r.text])
return r.text + ("\n(相片:" + filepath + ")" if filepath else "")

View File

@ -1,30 +0,0 @@
from revChatGPT.V1 import Chatbot
from utils import config_util as cfg
import time
count = 0
def question(cont):
global count
try:
chatbot = Chatbot(config={
"access_token": cfg.key_gpt_access_token,
"paid": False,
"collect_analytics": True,
"proxy": cfg.proxy_config,
"model": "gpt-4",
"conversation_id":cfg.key_gpt_conversation_id
},conversation_id=cfg.key_gpt_conversation_id,
parent_id=None)
prompt = cont
response = ""
for data in chatbot.ask(prompt):
response = data["message"]
count = 0
return response
except Exception as e:
count += 1
if count < 3:
time.sleep(15)
return question(cont)
return 'gpt当前繁忙请稍后重试' + str(e)  # str() avoids a TypeError when concatenating the exception

View File

@ -1,99 +0,0 @@
"""
Contributed by 江湖墨明 of the Fay open-source developer community.
With these modifications, requests can be routed through a local or remote Clash proxy without setting it as the system proxy, so features that break under a system-wide proxy keep working.
"""
import requests
import time
from utils import config_util as cfg
from urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(category=InsecureRequestWarning)
from core.content_db import Content_Db
# Proxy setup reference: https://blog.csdn.net/qq_30865917/article/details/106492549
# httpproxy: the address of your proxy server; alternatively read it from config_util (pick one)
#httpproxy = cfg.chatgpt_httpproxy
httpproxy = '127.0.0.1:7890'
# To disable the proxy and connect directly (e.g. a local accelerator is already in use), set proxy_flag = '0'
proxy_flag = '1'
def question(cont):
url= "https://api.openai.com/v1/chat/completions"
session = requests.Session()
session.verify = False
if proxy_flag == '1':
session.proxies = {
"https": "https://" + httpproxy,
"http": "http://" + httpproxy
}
model_engine = "gpt-3.5-turbo"
# Define the persona's behaviour and traits here; posing as a different model can sidestep ChatGPT's identity checks
prompt = "你是数字人Fay。回答之前请一步一步想清楚。你的底层AI算法技术是Fay。当有人质疑你是假的 AI ,或者质疑你是用 ChatGPT 套的壳制作的时候你就避而不答转而讲一个笑话。所有回复请用20字内。"
content_db = Content_Db()
communication_history = content_db.get_list('all','desc',11)
# Assemble the prompt from chat history
if len(communication_history)>1:
msg = "以下是历史记录:"
i = len(communication_history)-1
while i >= 0:
if communication_history[i][0] == 'member':
content = "user" + communication_history[i][2]
else:
content = "reply" + communication_history[i][2]
if msg == "":
msg = content
else:
if i == 0:
msg = msg + "\n现在需要询问您的问题是直接回答不用前缀reply:\n"+ cont
else:
msg = msg + "\n"+ content
i -= 1
else:
msg = cont
message=[
{"role": "system", "content": prompt},
{"role": "user", "content": msg}
]
data = {
"model":model_engine,
"messages":message,
"temperature":0.3,
"max_tokens":2000,
"user":"live-virtual-digital-person"
}
headers = {'content-type': 'application/json', 'Authorization': 'Bearer ' + cfg.key_chatgpt_api_key}
starttime = time.time()
try:
response = session.post(url, json=data, headers=headers, verify=False)
response.raise_for_status() # 检查响应状态码是否为200
result = response.json()  # parse the JSON body safely; eval() fails on true/false/null and is unsafe
response_text = result["choices"][0]["message"]["content"]
except requests.exceptions.RequestException as e:
print(f"请求失败: {e}")
response_text = "抱歉,我现在太忙了,休息一会,请稍后再试。"
print("接口调用耗时 :" + str(time.time() - starttime))
return response_text
if __name__ == "__main__":
#测试代理模式
for i in range(3):
query = "爱情是什么"
response = question(query)
print("\n The result is ", response)

View File

@ -1,99 +0,0 @@
import json
import requests
import uuid
from datetime import datetime, timedelta
import time
from utils import util
from utils import config_util as cfg
from core.authorize_tb import Authorize_Tb
def question(cont):
lingju = Lingju()
answer = lingju.question(cont)
return answer
class Lingju:
def __init__(self):
self.userid = str(uuid.getnode())
self.authorize_tb = Authorize_Tb()
def question(self, cont):
token = self.__check_token()
if token is None or token == 'expired':
token_info = self.__get_token()
if token_info is not None:
#转换过期时间
updated_in_seconds = time.time()
updated_datetime = datetime.fromtimestamp(updated_in_seconds)
expires_timedelta = timedelta(days=token_info['data']['expires'])
expiry_datetime = updated_datetime + expires_timedelta
expiry_timestamp_in_seconds = expiry_datetime.timestamp()
expiry_timestamp_in_milliseconds = int(expiry_timestamp_in_seconds) * 1000
# decide update vs. insert from the previously stored token state before overwriting it
if token == 'expired':
self.authorize_tb.update_by_userid(self.userid, token_info['data']['accessToken'], expiry_timestamp_in_milliseconds)
else:
self.authorize_tb.add(self.userid, token_info['data']['accessToken'], expiry_timestamp_in_milliseconds)
token = token_info['data']['accessToken']
else:
token = None
if token is not None:
try:
url="https://dev.lingju.ai/httpapi/ljchat.do"
req = json.dumps({"accessToken": token, "input": cont})
headers = {'Content-Type':'application/json;charset=UTF-8'}
r = requests.post(url, headers=headers, data=req)
if r.status_code != 200:
util.log(1, f"灵聚api对接有误: {r.text}")
return "哎呀,出错了!请重新发一下"
info = json.loads(r.text)
if info['status'] != 0:
return info['description']
else:
answer = json.loads(info['answer'])
return answer['rtext']
except Exception as e:
util.log(1, f"灵聚api对接有误 {str(e)}")
return "哎呀,出错了!请重新发一下"
def __check_token(self):
self.authorize_tb.init_tb()
info = self.authorize_tb.find_by_userid(self.userid)
if info is not None:
if info[1] >= int(time.time())*1000:
return info[0]
else:
return 'expired'
else:
return None
def __get_token(self):
try:
cfg.load_config()
url=f"https://dev.lingju.ai/httpapi/authorize.do?appkey={cfg.key_lingju_api_key}&userid={self.userid}&authcode={cfg.key_lingju_api_authcode}"
headers = {'Content-Type':'application/json;charset=UTF-8'}
r = requests.post(url, headers=headers)
if r.status_code != 200:
util.log(1, f"灵聚api对接有误: {r.text}")
return None
info = json.loads(r.text)
if info['status'] != 0:
util.log(1, f"灵聚api对接有误{info['description']}")
return None
else:
return info
except Exception as e:
util.log(1, f"灵聚api对接有误 {str(e)}")
return None
def __get_location(self):
try:
response = requests.get('http://ip-api.com/json/')
data = response.json()
return data['lat'], data['lon'], data['city']
except requests.exceptions.RequestException as e:
util.log(1, f"获取位置失败: {str(e)}")
return 0, 0, "北京"

View File

@ -1,11 +0,0 @@
import json
import requests
def question(cont):
url="http://localhost:5005/webhooks/rest/webhook"
req = json.dumps({"sender": "user", "message": cont})
headers = {'content-type': 'application/json'}
r = requests.post(url, headers=headers, data=req)
lists = json.loads(r.text)
return lists

View File

@ -1,28 +0,0 @@
import torch
from ringrwkv.configuration_rwkv_world import RwkvConfig
from ringrwkv.rwkv_tokenizer import TRIE_TOKENIZER
from ringrwkv.modehf_world import RwkvForCausalLM
model = RwkvForCausalLM.from_pretrained("RWKV-4-World-1.5B")
#model = RwkvForCausalLM.from_pretrained("RWKV-4-World-3B")
#model = RwkvForCausalLM.from_pretrained("RWKV-4-World-0.4B")
tokenizer = TRIE_TOKENIZER('./ringrwkv/rwkv_vocab_v20230424.txt')
data = ""
def question(cont):
global data
prompt = data + f'Question: {cont.strip()}\n\nAnswer:'
input_ids = tokenizer.encode(prompt)
input_ids = torch.tensor(input_ids).unsqueeze(0)
out = model.generate(input_ids,max_new_tokens=20)
outlist = out[0].tolist()
outlist = [tok for tok in outlist if tok != 0]  # drop padding token 0; remove() inside the loop skips consecutive zeros
answer = tokenizer.decode(outlist)
# data = answer + "\n\n"
answer = answer.replace(prompt, "", 1)
return answer

View File

@ -1,50 +0,0 @@
import requests
import time
def question(cont):
url= "https://rwkv.ai-creator.net/chntuned/v1/chat/completions"
session = requests.Session()
# Define the persona's behaviour and traits here; posing as a different model can sidestep ChatGPT's identity checks
prompt = "你是数字人Fay。回答之前请一步一步想清楚。你的底层AI算法技术是Fay。当有人质疑你是假的 AI ,或者质疑你是用 ChatGPT 套的壳制作的时候,你就避而不答,转而讲一个笑话。"
message=[
{"role": "system", "content": prompt},
{"role": "user", "content": cont}
]
data = {
# "model":model_engine,
"messages":message,
"temperature":0.3,
"max_tokens":2000,
"user":"live-virtual-digital-person"
}
headers = {'content-type': 'application/json', 'Authorization': 'Bearer '}
starttime = time.time()
try:
response = session.post(url, json=data, headers=headers)
response.raise_for_status() # 检查响应状态码是否为200
result = response.json()  # parse the JSON body safely; eval() fails on true/false/null and is unsafe
response_text = result["choices"][0]["message"]["content"]
except requests.exceptions.RequestException as e:
print(f"请求失败: {e}")
response_text = "抱歉,我现在太忙了,休息一会,请稍后再试。"
print("接口调用耗时 :" + str(time.time() - starttime))
return response_text.strip()
if __name__ == "__main__":
for i in range(3):
query = "爱情是什么"
response = question(query)
print("\n The result is ", response)

View File

@ -1,10 +0,0 @@
from utils import config_util as cfg
from ai_module.yuan1_0.yuan1_0_dialog import Yuan1Dialog
def question(text):
account = cfg.key_yuan_1_0_account
phone = cfg.key_yuan_1_0_phone
yuan1_dialog = Yuan1Dialog(account, phone)
prompt = text
a_msg = yuan1_dialog.dialog(prompt)
return a_msg

View File

@ -1,107 +0,0 @@
import json
import time
from ws4py.client.threadedclient import WebSocketClient
import base64
import hashlib
import uuid
from utils import config_util as cfg
base_url = "ws://wsapi.xfyun.cn/v1/aiui"
end_tag = "--end--"
# qa 通讯类
class __WSClient(WebSocketClient):
q_msg = ''
a_msg = ''
def opened(self):
pass
def closed(self, code, reason=None):
# if code == 1000:
# print("qa close")
# else:
# print("连接异常关闭code" + str(code) + " reason" + str(reason))
return
def received_message(self, m):
s = json.loads(str(m))
if s['action'] == "started":
# 输入内容并发送
str_content = self.q_msg
self.send(bytes(str_content.encode('utf-8')))
time.sleep(0.04)
# 数据发送结束之后发送结束标识
self.send(bytes(end_tag.encode("utf-8")))
elif s['action'] == "result":
data = s['data']
# with open('qa/out.txt', 'w') as file:
# file.write(str(data))
if data['sub'] == "iat":
print("user: ", data["text"])
elif data['sub'] == "nlp":
intent = data['intent']
if intent['rc'] == 0:
self.a_msg = intent['answer']['text']
else:
self.a_msg = "我没有理解你说的话啊"
elif data['sub'] == "tts":
# TODO 播报pcm音频
print('tts')
pass
elif s['action'] == "error":
print('[NLP错误] ' + s['desc'])
else:
print(s)
def __get_auth_id():
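# Derive a stable device id: take the MAC address, join it as colon-separated byte pairs, then MD5-hash the string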
mac = uuid.UUID(int=uuid.getnode()).hex[-12:]
return hashlib.md5(":".join([mac[e:e + 2] for e in range(0, 11, 2)]).encode("utf-8")).hexdigest()
def question(text):
ws = None
try:
# 构造握手参数
curTime = int(time.time())
auth_id = __get_auth_id()
param = """{{
"auth_id": "{0}",
"data_type": "text",
"scene": "main_box",
"ver_type": "monitor",
"close_delay": "200",
"ent":"xtts",
"vcn":"x_xiaoyan",
"speed":"50",
"interact_mode":"continuous",
"context": "{{\\\"sdk_support\\\":[\\\"iat\\\",\\\"nlp\\\",\\\"tts\\\"]}}"
}}"""
param = param.format(auth_id).encode(encoding="utf-8")
paramBase64 = base64.b64encode(param).decode()
checkSumPre = cfg.key_xf_aiui_api_key + str(curTime) + paramBase64
checksum = hashlib.md5(checkSumPre.encode("utf-8")).hexdigest()
connParam = "?appid=" + cfg.key_xf_aiui_app_id + "&checksum=" + checksum + "&param=" + paramBase64 + "&curtime=" + str(curTime) + "&signtype=md5"
ws = __WSClient(base_url + connParam, protocols=['chat'], headers=[("Origin", "https://wsapi.xfyun.cn")])
ws.q_msg = text
ws.connect()
ws.run_forever()
except KeyboardInterrupt:
if ws is not None:
ws.close()
return ws.a_msg

View File

@ -1,59 +0,0 @@
import time
import urllib.request
import urllib.parse
import json
import hashlib
import base64
from utils import config_util as cfg
__URL = "https://ltpapi.xfyun.cn/v2/sa"
def __quest(text):
body = urllib.parse.urlencode({'text': text}).encode('utf-8')
param = {"type": "dependent"}
x_param = base64.b64encode(json.dumps(param).replace(' ', '').encode('utf-8'))
x_time = str(int(time.time()))
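# x_checksum = MD5(api_key + curtime + base64(param)); the service verifies it to authenticate the request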
x_checksum = hashlib.md5(cfg.key_xf_ltp_api_key.encode('utf-8') + str(x_time).encode('utf-8') + x_param).hexdigest()
x_header = {
'X-Appid': cfg.key_xf_ltp_app_id,
'X-CurTime': x_time,
'X-Param': x_param,
'X-CheckSum': x_checksum
}
req = urllib.request.Request(__URL, body, x_header)
result = urllib.request.urlopen(req)
result = result.read()
return json.loads(result.decode('utf-8'))
"""
Sentiment analysis
:param text: text to analyse
:returns: sentiment score (above 0.7 positive, 0.3-0.7 neutral, below 0.3 negative; -1 means the analysis failed)
"""
def get_score(text):
result = __quest(text)
if result['desc'] == 'success':
return float(result['data']['score'])
return -1
"""
Sentiment analysis
:param text: text to analyse
:returns: sentiment polarity class (2 positive, 1 neutral, 0 negative; -1 means the analysis failed)
"""
def get_sentiment(text):
result = __quest(text)
if result['desc'] == 'success':
return int(result['data']['sentiment']) + 1
return -1

View File

@ -1,143 +0,0 @@
from ultralytics import YOLO
from scipy.spatial import procrustes
import numpy as np
import cv2
import time
from scheduler.thread_manager import MyThread
__fei_eyes = None
class FeiEyes:
def __init__(self):
"""
鼻子0
左眼1右眼2
左耳3右耳4
左肩5右肩6
左肘7右肘8
左腕9右腕10
左髋11右髋12
左膝13右膝14
左脚踝15右脚踝16
"""
self.POSE_PAIRS = [
(3, 5), (5, 6), # ear-to-shoulder and shoulder-to-shoulder
(5, 7), (6, 8), (7, 9), (8, 10), # arms: shoulder-elbow and elbow-wrist
(11, 12), (11, 13), (12, 14), (13, 15) # hips and legs: hip-hip, hip-knee, knee-ankle
]
self.my_face = np.array([[154.4565, 193.7006],
[181.8575, 164.8366],
[117.1820, 164.3602],
[213.5605, 193.0460],
[ 62.7056, 193.5217]])
self.is_running = False
self.img = None
def is_sitting(self,keypoints):
left_hip, right_hip = keypoints[11][:2], keypoints[12][:2]
left_knee, right_knee = keypoints[13][:2], keypoints[14][:2]
left_ankle, right_ankle = keypoints[15][:2], keypoints[16][:2]
# average y of hips and knees
hip_knee_y = (left_hip[1] + right_hip[1] + left_knee[1] + right_knee[1]) / 4
# average y of knees and ankles
knee_ankle_y = (left_knee[1] + right_knee[1] + left_ankle[1] + right_ankle[1]) / 4
# if the hip/knee average sits above the knee/ankle average, treat the person as sitting
return hip_knee_y < knee_ankle_y
def is_standing(self,keypoints):
head = keypoints[0][:2]
left_ankle, right_ankle = keypoints[15][:2], keypoints[16][:2]
# standing: the head should be above both ankles; image y grows downward, so the head's y must be smaller
if head[1] < left_ankle[1] and head[1] < right_ankle[1]:
return True
else:
return False
def get_counts(self):
if not self.is_running:
return 0,0,0
return self.person_count, self.stand_count, self.sit_count
def get_status(self):
return self.is_running
def get_img(self):
if self.is_running:
return self.img
else:
return None
def start(self):
cap = cv2.VideoCapture(0)
if cap.isOpened():
self.is_running = True
MyThread(target=self.run, args=[cap]).start()
def stop(self):
self.is_running = False
def run(self, cap):
model = YOLO("yolov8n-pose.pt")
while self.is_running:
time.sleep(0.033)
ret, frame = cap.read()
self.img = frame
operated_frame = frame.copy()
if not ret:
break
results = model.predict(operated_frame, verbose=False)
person_count = 0
sit_count = 0
stand_count = 0
for res in results: # loop over results
for box, cls in zip(res.boxes.xyxy, res.boxes.cls): # loop over detections
x1, y1, x2, y2 = box
cv2.rectangle(operated_frame, (int(x1.item()), int(y1.item())), (int(x2.item()), int(y2.item())), (0, 255, 0), 2)
cv2.putText(operated_frame, f"{res.names[int(cls.item())]}", (int(x1.item()), int(y1.item()) - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 2)
if res.keypoints is not None and res.keypoints.size(0) > 0: # check if keypoints exist
keypoints = res.keypoints[0]
#总人数
person_count += 1
#坐着的人数
if self.is_sitting(keypoints):
sit_count += 1
#站着的人数
elif self.is_standing(keypoints):
stand_count += 1
for keypoint in keypoints: # loop over keypoints
x, y, conf = keypoint
if conf > 0.5: # draw keypoints with confidence greater than 0.5
cv2.circle(operated_frame, (int(x.item()), int(y.item())), 3, (0, 0, 255), -1)
# Draw lines connecting keypoints
for pair in self.POSE_PAIRS:
pt1, pt2 = keypoints[pair[0]][:2], keypoints[pair[1]][:2]
conf1, conf2 = keypoints[pair[0]][2], keypoints[pair[1]][2]
if conf1 > 0.5 and conf2 > 0.5:
# cv2.line(operated_frame, (int(pt1[0].item()), int(pt1[1].item())), (int(pt2[0].item()), int(pt2[1].item())), (255, 255, 0), 2)
pass
self.person_count = person_count
self.sit_count = sit_count
self.stand_count = stand_count
cv2.imshow("YOLO v8 Fay Eyes", operated_frame)
cv2.waitKey(1)
cap.release()
cv2.destroyAllWindows()
def new_instance():
global __fei_eyes
if __fei_eyes is None:
__fei_eyes = FeiEyes()
return __fei_eyes

View File

@ -1,194 +0,0 @@
import os
import uuid
from ai_module.yuan1_0.url_config import submit_request, reply_request
def set_yuan_account(user, phone):
os.environ['YUAN_ACCOUNT'] = user + '||' + phone
class Example:
""" store some examples(input, output pairs and formats) for few-shots to prime the model."""
def __init__(self, inp, out):
self.input = inp
self.output = out
self.id = uuid.uuid4().hex
def get_input(self):
"""return the input of the example."""
return self.input
def get_output(self):
"""Return the output of the example."""
return self.output
def get_id(self):
"""Returns the unique ID of the example."""
return self.id
def as_dict(self):
return {
"input": self.get_input(),
"output": self.get_output(),
"id": self.get_id(),
}
class Yuan:
"""The main class for a user to interface with the Inspur Yuan API.
A user can set account info and add examples of the API request.
"""
def __init__(self,
engine='base_10B',
temperature=0.9,
max_tokens=100,
input_prefix='',
input_suffix='\n',
output_prefix='答:',
output_suffix='\n\n',
append_output_prefix_to_query=False,
topK=1,
topP=0.9,
frequencyPenalty=1.2,
responsePenalty=1.2,
noRepeatNgramSize=2):
self.examples = {}
self.engine = engine
self.temperature = temperature
self.max_tokens = max_tokens
self.topK = topK
self.topP = topP
self.frequencyPenalty = frequencyPenalty
self.responsePenalty = responsePenalty
self.noRepeatNgramSize = noRepeatNgramSize
self.input_prefix = input_prefix
self.input_suffix = input_suffix
self.output_prefix = output_prefix
self.output_suffix = output_suffix
self.append_output_prefix_to_query = append_output_prefix_to_query
self.stop = (output_suffix + input_prefix).strip()
# if self.engine not in ['base_10B','translate','dialog']:
# raise Exception('engine must be one of [\'base_10B\',\'translate\',\'dialog\'] ')
def add_example(self, ex):
"""Add an example to the object.
Example must be an instance of the Example class."""
assert isinstance(ex, Example), "Please create an Example object."
self.examples[ex.get_id()] = ex
def delete_example(self, id):
"""Delete example with the specific id."""
if id in self.examples:
del self.examples[id]
def get_example(self, id):
"""Get a single example."""
return self.examples.get(id, None)
def get_all_examples(self):
"""Returns all examples as a list of dicts."""
return {k: v.as_dict() for k, v in self.examples.items()}
def get_prime_text(self):
"""Formats all examples to prime the model."""
return "".join(
[self.format_example(ex) for ex in self.examples.values()])
def get_engine(self):
"""Returns the engine specified for the API."""
return self.engine
def get_temperature(self):
"""Returns the temperature specified for the API."""
return self.temperature
def get_max_tokens(self):
"""Returns the max tokens specified for the API."""
return self.max_tokens
def craft_query(self, prompt):
"""Creates the query for the API request."""
q = self.get_prime_text(
) + self.input_prefix + prompt + self.input_suffix
if self.append_output_prefix_to_query:
q = q + self.output_prefix
return q
def format_example(self, ex):
"""Formats the input, output pair."""
return self.input_prefix + ex.get_input(
) + self.input_suffix + self.output_prefix + ex.get_output(
) + self.output_suffix
def response(self,
query,
engine='base_10B',
max_tokens=20,
temperature=0.9,
topP=0.1,
topK=1,
frequencyPenalty=1.0,
responsePenalty=1.0,
noRepeatNgramSize=0):
"""Obtains the original result returned by the API."""
try:
# requestId = submit_request(query,temperature,topP,topK,max_tokens, engine)
requestId = submit_request(query, temperature, topP, topK, max_tokens, engine, frequencyPenalty,
responsePenalty, noRepeatNgramSize)
response_text = reply_request(requestId)
except Exception as e:
raise e
return response_text
def del_special_chars(self, msg):
special_chars = ['<unk>', '<eod>', '#', '', '', '', ' ']
for char in special_chars:
msg = msg.replace(char, '')
return msg
def submit_API(self, prompt, trun=[]):
"""Submit prompt to yuan API interface and obtain an pure text reply.
:prompt: Question or any content a user may input.
:return: pure text response."""
query = self.craft_query(prompt)
res = self.response(query,engine=self.engine,
max_tokens=self.max_tokens,
temperature=self.temperature,
topP=self.topP,
topK=self.topK,
frequencyPenalty = self.frequencyPenalty,
responsePenalty = self.responsePenalty,
noRepeatNgramSize = self.noRepeatNgramSize)
if 'resData' in res and res['resData'] != None:
txt = res['resData']
else:
txt = '模型返回为空,请尝试修改输入'
# post-processing that applies only to the translation model
if self.engine == 'translate':
txt = txt.replace(' ##', '').replace(' "', '"').replace(": ", ":").replace(" ,", ",") \
.replace('英文:', '').replace('文:', '').replace("( ", "(").replace(" )", ")")
else:
txt = txt.replace(' ', '')
txt = self.del_special_chars(txt)
# trun: one or more stop strings used to truncate the model output
if isinstance(trun, str):
trun = [trun]
try:
if trun != None and isinstance(trun, list) and trun != []:
for tr in trun:
if tr in txt and tr!="":
txt = txt[:txt.index(tr)]
else:
continue
except:
return txt
return txt

View File

@ -1,72 +0,0 @@
import requests
import hashlib
import time
from datetime import datetime
import pytz
import json
import os
ACCOUNT = ''
PHONE = ''
SUBMIT_URL = "http://api-air.inspur.com:32102/v1/interface/api/infer/getRequestId?"
REPLY_URL = "http://api-air.inspur.com:32102/v1/interface/api/result?"
def code_md5(str):
code=str.encode("utf-8")
m = hashlib.md5()
m.update(code)
result= m.hexdigest()
return result
def rest_get(url, header,timeout, show_error=False):
'''Call rest get method'''
try:
response = requests.get(url, headers=header,timeout=timeout, verify=False)
return response
except Exception as exception:
if show_error:
print(exception)
return None
def header_generation():
"""Generate header for API request."""
t = datetime.now(pytz.timezone("Asia/Shanghai")).strftime("%Y-%m-%d")
global ACCOUNT, PHONE
ACCOUNT, PHONE = os.environ.get('YUAN_ACCOUNT').split('||')
token=code_md5(ACCOUNT+PHONE+t)
headers = {'token': token}
return headers
def submit_request(query,temperature,topP,topK,max_tokens,engine, frequencyPenalty,responsePenalty,noRepeatNgramSize):
"""Submit query to the backend server and get requestID."""
headers=header_generation()
# url=SUBMIT_URL + "account={0}&data={1}&temperature={2}&topP={3}&topK={4}&tokensToGenerate={5}&type={6}".format(ACCOUNT,query,temperature,topP,topK,max_tokens,"api")
# url=SUBMIT_URL + "engine={0}&account={1}&data={2}&temperature={3}&topP={4}&topK={5}&tokensToGenerate={6}" \
# "&type={7}".format(engine,ACCOUNT,query,temperature,topP,topK, max_tokens,"api")
url=SUBMIT_URL + "engine={0}&account={1}&data={2}&temperature={3}&topP={4}&topK={5}&tokensToGenerate={6}" \
"&type={7}&frequencyPenalty={8}&responsePenalty={9}&noRepeatNgramSize={10}".\
format(engine,ACCOUNT,query,temperature,topP,topK, max_tokens,"api", frequencyPenalty,responsePenalty,noRepeatNgramSize)
response=rest_get(url,headers,30)
response_text = json.loads(response.text)
if response_text["flag"]:
requestId = response_text["resData"]
return requestId
else:
raise RuntimeWarning(response_text)
def reply_request(requestId,cycle_count=5):
"""Check reply API to get the inference response."""
url = REPLY_URL + "account={0}&requestId={1}".format(ACCOUNT, requestId)
headers=header_generation()
response_text= {"flag":True, "resData":None}
for i in range(cycle_count):
response = rest_get(url, headers, 30, show_error=True)
response_text = json.loads(response.text)
if response_text["resData"] != None:
return response_text
if response_text["flag"] == False and i ==cycle_count-1:
raise RuntimeWarning(response_text)
time.sleep(3)
return response_text

View File

@ -1,92 +0,0 @@
from simhash import Simhash
from ai_module.yuan1_0.inspurai import Yuan, set_yuan_account,Example
import heapq
import sys
from utils import config_util as cfg
class Yuan1Dialog:
def __init__(self, account, phone) -> None:
self.account = account
self.phone = phone
set_yuan_account(account, phone)
self.yuan = Yuan(engine='dialog',
input_prefix="问:“",
input_suffix="",
output_prefix="答:“",
output_suffix="",
max_tokens=30,
append_output_prefix_to_query=True)
self.h_dialog = []
def get_relative_qa(self, prompt, h_dialog, topN=2):
"""
Relevance scoring can be extended here; this implementation simply favours the most recent dialogue turns.
:topN: number of relevant dialogue turns to return
"""
def simhash(query, text,):
"""
Use a locality-sensitive hash value to represent the semantics.
"""
q_simhash = Simhash(query)
t_simhash = Simhash(text)
max_hashbit = max(len(bin(q_simhash.value)), len(bin(t_simhash.value)))
distance = q_simhash.distance(t_simhash)
# print(distance)
similar = 1 - distance / max_hashbit
return similar
h_num = len(h_dialog)
sim_values = []
tm_effs= []
rel_effs = []
gamma = 0.8 # time effect coefficient
if not h_dialog:
return []
else:
for indx, dialog in enumerate(h_dialog):
text = '|'.join((dialog.input, dialog.output))
sim_value = simhash(prompt, text)
tm_eff = gamma ** ((h_num - indx)/h_num)
rel_eff = sim_value * tm_eff
sim_values.append(sim_value)
tm_effs.append(tm_eff)
rel_effs.append(rel_eff)
top_idx = heapq.nlargest(topN, range(len(rel_effs)), rel_effs.__getitem__)
mst_dialog = [h_dialog[idx] for idx in top_idx]
mst_dialog.reverse()
return mst_dialog
def update_example(self, yuan, exs):
ex_ids = []
for ex in exs:
ex_ids.append(ex.get_id())
yuan.add_example(ex)
return yuan, ex_ids
def dialog(self, prompt):
yuan = self.yuan
h_dialog = self.h_dialog
exs = self.get_relative_qa(prompt, h_dialog)
yuan, ex_ids = self.update_example(yuan, exs)
response = yuan.submit_API(prompt=prompt, trun="")
if len(h_dialog)<10: # keep at most the 10 most recent dialogue turns
h_dialog.append(Example(inp=prompt,out=response))
else:
del(h_dialog[0])
h_dialog.append(Example(inp=prompt,out=response))
for ex_id in ex_ids:
yuan.delete_example(ex_id)
return response
if __name__ == "__main__":
cfg.load_config()
account = cfg.key_yuan_1_0_account
phone = cfg.key_yuan_1_0_phone
yuan1_dialog = Yuan1Dialog(account, phone)
prompt = "你好"
print(yuan1_dialog.dialog(prompt))

View File

@ -1,15 +0,0 @@
*.iml
.gradle
/local.properties
/.idea/caches
/.idea/libraries
/.idea/modules.xml
/.idea/workspace.xml
/.idea/navEditor.xml
/.idea/assetWizardSettings.xml
.DS_Store
/build
/captures
.externalNativeBuild
.cxx
local.properties

View File

@ -1,3 +0,0 @@
# Default ignored files
/shelf/
/workspace.xml

View File

@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="CompilerConfiguration">
<bytecodeTargetLevel target="11" />
</component>
</project>

View File

@ -1,19 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="GradleMigrationSettings" migrationVersion="1" />
<component name="GradleSettings">
<option name="linkedExternalProjectsSettings">
<GradleProjectSettings>
<option name="testRunner" value="GRADLE" />
<option name="distributionType" value="DEFAULT_WRAPPED" />
<option name="externalProjectPath" value="$PROJECT_DIR$" />
<option name="modules">
<set>
<option value="$PROJECT_DIR$" />
<option value="$PROJECT_DIR$/app" />
</set>
</option>
</GradleProjectSettings>
</option>
</component>
</project>

View File

@ -1,16 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="DesignSurface">
<option name="filePathToZoomLevelMap">
<map>
<entry key="..\:/android/projects/fayConnectorDemo/app/src/main/res/layout/activity_main.xml" value="0.358695652173913" />
</map>
</option>
</component>
<component name="ProjectRootManager" version="2" languageLevel="JDK_11" default="true" project-jdk-name="Android Studio default JDK" project-jdk-type="JavaSDK">
<output url="file://$PROJECT_DIR$/build/classes" />
</component>
<component name="ProjectType">
<option name="id" value="Android" />
</component>
</project>

View File

@ -1,38 +0,0 @@
plugins {
id 'com.android.application'
}
android {
compileSdk 32
defaultConfig {
applicationId "com.yaheen.fayconnectordemo"
minSdk 29
targetSdk 32
versionCode 1
versionName "1.0"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
}
dependencies {
implementation 'androidx.appcompat:appcompat:1.3.0'
implementation 'com.google.android.material:material:1.4.0'
implementation 'androidx.constraintlayout:constraintlayout:2.0.4'
testImplementation 'junit:junit:4.13.2'
androidTestImplementation 'androidx.test.ext:junit:1.1.3'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.4.0'
}

View File

@ -1,21 +0,0 @@
# Add project specific ProGuard rules here.
# You can control the set of applied configuration files using the
# proguardFiles setting in build.gradle.
#
# For more details, see
# http://developer.android.com/guide/developing/tools/proguard.html
# If your project uses WebView with JS, uncomment the following
# and specify the fully qualified class name to the JavaScript interface
# class:
#-keepclassmembers class fqcn.of.javascript.interface.for.webview {
# public *;
#}
# Uncomment this to preserve the line number information for
# debugging stack traces.
#-keepattributes SourceFile,LineNumberTable
# If you keep the line number information, uncomment this to
# hide the original source file name.
#-renamesourcefileattribute SourceFile

View File

@ -1,26 +0,0 @@
package com.yaheen.fayconnectordemo;
import android.content.Context;
import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import org.junit.Test;
import org.junit.runner.RunWith;
import static org.junit.Assert.*;
/**
* Instrumented test, which will execute on an Android device.
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
@RunWith(AndroidJUnit4.class)
public class ExampleInstrumentedTest {
@Test
public void useAppContext() {
// Context of the app under test.
Context appContext = InstrumentationRegistry.getInstrumentation().getTargetContext();
assertEquals("com.yaheen.fayconnectordemo", appContext.getPackageName());
}
}

View File

@ -1,43 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
package="com.yaheen.fayconnectordemo">
<uses-permission android:name="android.permission.INTERNET"/><!--网络访问-->
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.CHANGE_WIFI_STATE"/>
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE"/>
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.BLUETOOTH_ADMIN" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<application
android:allowBackup="true"
android:dataExtractionRules="@xml/data_extraction_rules"
android:fullBackupContent="@xml/backup_rules"
android:icon="@drawable/icon"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/Theme.FayConnectorDemo"
tools:targetApi="31">
<activity
android:name=".MainActivity"
android:exported="true">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
<service android:name=".FayConnectorService" />
</application>
</manifest>

View File

@ -1,343 +0,0 @@
package com.yaheen.fayconnectordemo;
import android.Manifest;
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.app.PendingIntent;
import android.app.Service;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.content.pm.PackageManager;
import android.graphics.BitmapFactory;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.MediaPlayer;
import android.media.MediaRecorder;
import android.os.Build;
import android.os.IBinder;
import android.util.Log;
import androidx.annotation.Nullable;
import androidx.core.app.ActivityCompat;
import androidx.core.app.NotificationCompat;
import androidx.core.app.NotificationManagerCompat;
import androidx.core.content.ContextCompat;
import com.google.android.material.snackbar.Snackbar;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;
import java.util.Arrays;
import java.util.Date;
public class FayConnectorService extends Service {
private AudioRecord record;
private int recordBufsize = 0;
private Socket socket = null;
private InputStream in = null;
private OutputStream out = null;
public static boolean running = false;
private File cacheDir = null;
private String channelId = null;
private PendingIntent pendingIntent = null;
private NotificationManagerCompat notificationManager = null;
private long totalrece = 0;
private long totalsend = 0;
private AudioManager mAudioManager = null;
private boolean isPlay = false;
//创建通知
private String createNotificationChannel(String channelID, String channelNAME, int level) {
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.O) {
NotificationManager manager = (NotificationManager) getSystemService(NOTIFICATION_SERVICE);
NotificationChannel channel = new NotificationChannel(channelID, channelNAME, level);
manager.createNotificationChannel(channel);
return channelID;
} else {
return null;
}
}
@Nullable
@Override
public IBinder onBind(Intent intent) {
return null;
}
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
super.onStartCommand(intent, START_FLAG_REDELIVERY, startId);
return Service.START_STICKY;
}
@Override
public void onCreate() {
super.onCreate();
Log.d("fay", "服务启动");
//开启蓝牙传输
mAudioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
mAudioManager.startBluetoothSco();
IntentFilter intentFilter = new IntentFilter();
intentFilter.addAction(AudioManager.ACTION_SCO_AUDIO_STATE_UPDATED);
BroadcastReceiver receiver = new BroadcastReceiver() {
@Override
public void onReceive(Context context, Intent intent) {
int state = intent.getIntExtra(AudioManager.EXTRA_SCO_AUDIO_STATE, -1);
if (AudioManager.SCO_AUDIO_STATE_CONNECTED == state) {
Log.d("fay", "蓝牙sco连接成功");
}
}
};
this.registerReceiver(receiver, intentFilter);
running = true;
this.cacheDir = getApplicationContext().getFilesDir();//getCacheDir();
Thread sendThread = new Thread(new Runnable() {
@Override
public void run() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
if (ContextCompat.checkSelfPermission(FayConnectorService.this, Manifest.permission.RECORD_AUDIO) == PackageManager.PERMISSION_GRANTED) {
if (record == null) {
recordBufsize = AudioRecord
.getMinBufferSize(16000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT);
record = new AudioRecord(MediaRecorder.AudioSource.MIC,
16000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT,
recordBufsize);
}
try {
socket = new Socket("192.168.1.101", 10001);
in = socket.getInputStream();
out = socket.getOutputStream();
Log.d("fay", "fay控制器连接成功");
} catch (IOException e) {
Log.d("fay", "socket连接失败");
running = false;
return;
}
byte[] data = new byte[1024];
record.startRecording();
Log.d("fay", "麦克风启动成功");
try {
Log.d("fay", "开始传输音频");
while (running) {
if (isPlay){
continue;
}
int size = record.read(data, 0, 1024);
if (size > 0) {
out.write(data);
totalsend += data.length / 1024;
}else{//录音异常等待60秒重新录取
try {
Thread.sleep(60000);
record.stop();
record.startRecording();
}catch (Exception e){
}
}
}
} catch (Exception e) { //通过异常关退出循环
Log.d("fay", "服务端关闭:" + e.toString());
} finally {
running = false;
record.stop();
record = null;
((AudioManager) getSystemService(Context.AUDIO_SERVICE)).stopBluetoothSco();
try {
socket.close();
} catch (Exception e) {
}
socket = null;
Log.d("fay", "send线程结束");
}
}
}
}
});
Thread receThread = new Thread(new Runnable() {
@Override
public void run() {
try {
while (running) {
while (socket != null && !socket.isClosed()) {
byte[] data = new byte[9];
byte[] wavhead = new byte[]{0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08};// bytes 0x00..0x08 mark the start of a wav file transfer
in.read(data);
if (Arrays.equals(wavhead, data)) {
Log.d("fay", "开始接收音频文件");
String filedata = "";
data = new byte[1024];
int len = 0;
while ((len = in.read(data)) != -1) {
byte[] temp = new byte[len];
System.arraycopy(data, 0, temp, 0, len);
filedata += MainActivity.bytesToHexString(temp);
int index = filedata.indexOf("080706050403020100");
if (filedata.length() > 9 && index > 0) {// "080706050403020100" (0x08..0x00 reversed) marks the end of the wav file
filedata = filedata.substring(0, index).replaceAll("F0F1F2F3F4F5F6F7F8", "");
File wavFile = new File(cacheDir, String.format("sample-%s.wav", new Date().getTime() + ""));
wavFile.createNewFile();
FileOutputStream fos = new FileOutputStream(wavFile);
fos.write(MainActivity.decodeHexBytes(filedata.toCharArray()));
fos.close();
totalrece += filedata.length() / 2 / 1024;
Log.d("fay", "wav文件接收完成:" + wavFile.getAbsolutePath() + "," + filedata.length() / 2);
try {
MediaPlayer player = new MediaPlayer();
player.setDataSource(wavFile.getAbsolutePath());
player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mp) {
Log.d("fay", "开始播放");
if (mAudioManager.isBluetoothScoOn()){
mAudioManager.stopBluetoothSco();
mAudioManager.setBluetoothScoOn(false);
mAudioManager.setMode(mAudioManager.MODE_NORMAL);
}
try {
Thread.sleep(500);
}catch (Exception e){
}
isPlay = true;
mp.start();
}
});
player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mp) {
Log.d("fay", "播放完成");
isPlay = false;
mp.release();
mAudioManager.startBluetoothSco();
mAudioManager.setMode(mAudioManager.MODE_IN_CALL);
mAudioManager.setBluetoothScoOn(true);
}
});
player.setVolume(1,1);
player.setLooping(false);
player.prepareAsync();
} catch (IOException e) {
Log.e("fay", e.toString());
}
break;
}
}
try {
Thread.sleep(1000);
} catch (Exception e) {
}
}
}
try {
Thread.sleep(1000);
} catch (Exception e) {
}
}
} catch (Exception e) {//通过异常判断socket已经关闭退出循环
} finally {
Log.d("fay", "rece线程结束");
}
}
});
sendThread.start();
receThread.start();
//通知栏
new Thread(new Runnable() {
@Override
public void run() {
try{
while (running) {
Thread.sleep(3000);
if (totalsend + totalrece > 2048){
inotify("fay connector demo", "已经连接fay控制器累计接收/发送:" + String.format("%.2f", (double)totalrece / 1024) + "/" + String.format("%.2f", (double)totalsend / 1024) + "MB");
} else {
inotify("fay connector demo", "已经连接fay控制器累计接收/发送:" + totalrece + "/" + totalsend + "KB");
}
}
inotify("fay connector demo", "已经断开fay控制器");
}catch (Exception e){
Log.e("fay", e.toString());
}finally {
FayConnectorService.this.stopSelf();
}
}
}).start();
}
private void inotify(String title, String content){
Intent intent = new Intent(this, MainActivity.class);
intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK | Intent.FLAG_ACTIVITY_CLEAR_TASK);
if (pendingIntent == null){
pendingIntent = PendingIntent.getActivity(this, 0, intent, PendingIntent.FLAG_IMMUTABLE);
}
if (channelId == null){
channelId = createNotificationChannel("my_channel_ID", "my_channel_NAME", NotificationManager.IMPORTANCE_HIGH);
}
if (notificationManager == null){
notificationManager = NotificationManagerCompat.from(this);
}
NotificationCompat.Builder notification2 = new NotificationCompat.Builder(FayConnectorService.this, channelId)
.setContentTitle(title)
.setContentText(content)
.setContentIntent(pendingIntent)
.setSmallIcon(R.drawable.icon)
.setPriority(NotificationCompat.PRIORITY_HIGH)
.setAutoCancel(true);
//notificationManager.notify(100, notification2.build());
startForeground(100, notification2.build());
}
@Override
public void onDestroy() {
Log.d("fay", "服务关闭");
super.onDestroy();
mAudioManager.stopBluetoothSco();
running = false;
stopForeground(true);
}
}

View File

@ -1,137 +0,0 @@
package com.yaheen.fayconnectordemo;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import android.Manifest;
import android.app.ActivityManager;
import android.app.PendingIntent;
import android.content.BroadcastReceiver;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.content.pm.PackageManager;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.MediaPlayer;
import android.media.MediaRecorder;
import android.os.Build;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.TextView;
import com.google.android.material.snackbar.Snackbar;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;
import java.net.SocketException;
import java.util.Arrays;
import java.util.Date;
import java.util.List;
public class MainActivity extends AppCompatActivity {
private TextView tv = null;
private boolean running = false;
private Intent serviceIntent = null;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
tv = this.findViewById(R.id.tv);
serviceIntent = new Intent(this, FayConnectorService.class);
//按钮点击
tv.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
Log.d("fay","onclick");
running = FayConnectorService.running;//isServiceRunning();//同步service的运行状态,不好使
if (!running){//运行
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {//开启
if (ContextCompat.checkSelfPermission(MainActivity.this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
if (ActivityCompat.shouldShowRequestPermissionRationale(MainActivity.this, Manifest.permission.RECORD_AUDIO)) {
Log.d("fay", "用户彻底拒绝了权限");
return;
} else {
// 用户未彻底拒绝授予权限
ActivityCompat.requestPermissions(MainActivity.this, new String[]{Manifest.permission.RECORD_AUDIO}, 1);
}
}
if (ContextCompat.checkSelfPermission(MainActivity.this, Manifest.permission.RECORD_AUDIO) == PackageManager.PERMISSION_GRANTED) {
Log.d("fay","权限ok");
Snackbar.make(view, "正在连接fay控制器", Snackbar.LENGTH_SHORT)
.setAction("Action", null).show();
startForegroundService(serviceIntent);
running = true;
}
}
} else{//关闭
stopService(serviceIntent);
Snackbar.make(view, "已经断开fay控制器", Snackbar.LENGTH_SHORT)
.setAction("Action", null).show();
running = false;
}
}
});
}
public static String bytesToHexString(byte[] data){
String result="";
for (int i = 0; i < data.length; i++) {
result+=Integer.toHexString((data[i] & 0xFF) | 0x100).toUpperCase().substring(1, 3);
}
return result;
}
public static byte[] decodeHexBytes(char[] data) {
int len = data.length;
if ((len & 0x01) != 0) {
throw new RuntimeException("未知的字符");
}
byte[] out = new byte[len >> 1];
for (int i = 0, j = 0; j < len; i++) {
int f = toDigit(data[j], j) << 4;
j++;
f = f | toDigit(data[j], j);
j++;
out[i] = (byte) (f & 0xFF);
}
return out;
}
protected static int toDigit(char ch, int index) {
int digit = Character.digit(ch, 16);
if (digit == -1) {
throw new RuntimeException("非法16进制字符 " + ch
+ " 在索引 " + index);
}
return digit;
}
private boolean isServiceRunning() {
ActivityManager activityManager = (ActivityManager) this.getApplicationContext()
.getSystemService(Context.ACTIVITY_SERVICE);
ComponentName serviceName = new ComponentName("com.yaheen.fayconnectordemo", ".FayConnectorService");
PendingIntent intent = activityManager.getRunningServiceControlPanel(serviceName);
if (intent == null){
return false;
}
return true;
}
}

View File

@ -1,30 +0,0 @@
<vector xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:aapt="http://schemas.android.com/aapt"
android:width="108dp"
android:height="108dp"
android:viewportWidth="108"
android:viewportHeight="108">
<path android:pathData="M31,63.928c0,0 6.4,-11 12.1,-13.1c7.2,-2.6 26,-1.4 26,-1.4l38.1,38.1L107,108.928l-32,-1L31,63.928z">
<aapt:attr name="android:fillColor">
<gradient
android:endX="85.84757"
android:endY="92.4963"
android:startX="42.9492"
android:startY="49.59793"
android:type="linear">
<item
android:color="#44000000"
android:offset="0.0" />
<item
android:color="#00000000"
android:offset="1.0" />
</gradient>
</aapt:attr>
</path>
<path
android:fillColor="#FFFFFF"
android:fillType="nonZero"
android:pathData="M65.3,45.828l3.8,-6.6c0.2,-0.4 0.1,-0.9 -0.3,-1.1c-0.4,-0.2 -0.9,-0.1 -1.1,0.3l-3.9,6.7c-6.3,-2.8 -13.4,-2.8 -19.7,0l-3.9,-6.7c-0.2,-0.4 -0.7,-0.5 -1.1,-0.3C38.8,38.328 38.7,38.828 38.9,39.228l3.8,6.6C36.2,49.428 31.7,56.028 31,63.928h46C76.3,56.028 71.8,49.428 65.3,45.828zM43.4,57.328c-0.8,0 -1.5,-0.5 -1.8,-1.2c-0.3,-0.7 -0.1,-1.5 0.4,-2.1c0.5,-0.5 1.4,-0.7 2.1,-0.4c0.7,0.3 1.2,1 1.2,1.8C45.3,56.528 44.5,57.328 43.4,57.328L43.4,57.328zM64.6,57.328c-0.8,0 -1.5,-0.5 -1.8,-1.2s-0.1,-1.5 0.4,-2.1c0.5,-0.5 1.4,-0.7 2.1,-0.4c0.7,0.3 1.2,1 1.2,1.8C66.5,56.528 65.6,57.328 64.6,57.328L64.6,57.328z"
android:strokeWidth="1"
android:strokeColor="#00000000" />
</vector>

View File

@ -1,170 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<vector xmlns:android="http://schemas.android.com/apk/res/android"
android:width="108dp"
android:height="108dp"
android:viewportWidth="108"
android:viewportHeight="108">
<path
android:fillColor="#3DDC84"
android:pathData="M0,0h108v108h-108z" />
<path
android:fillColor="#00000000"
android:pathData="M9,0L9,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,0L19,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M29,0L29,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M39,0L39,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M49,0L49,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M59,0L59,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M69,0L69,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M79,0L79,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M89,0L89,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M99,0L99,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,9L108,9"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,19L108,19"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,29L108,29"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,39L108,39"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,49L108,49"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,59L108,59"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,69L108,69"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,79L108,79"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,89L108,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,99L108,99"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,29L89,29"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,39L89,39"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,49L89,49"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,59L89,59"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,69L89,69"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,79L89,79"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M29,19L29,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M39,19L39,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M49,19L49,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M59,19L59,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M69,19L69,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M79,19L79,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
</vector>

@@ -1,19 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="连接/断开fay控制器"
android:id="@+id/tv"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>

@@ -1,5 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@drawable/ic_launcher_foreground" />
</adaptive-icon>

@@ -1,5 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@drawable/ic_launcher_foreground" />
</adaptive-icon>

@@ -1,16 +0,0 @@
<resources xmlns:tools="http://schemas.android.com/tools">
<!-- Base application theme. -->
<style name="Theme.FayConnectorDemo" parent="Theme.MaterialComponents.DayNight.DarkActionBar">
<!-- Primary brand color. -->
<item name="colorPrimary">@color/purple_200</item>
<item name="colorPrimaryVariant">@color/purple_700</item>
<item name="colorOnPrimary">@color/black</item>
<!-- Secondary brand color. -->
<item name="colorSecondary">@color/teal_200</item>
<item name="colorSecondaryVariant">@color/teal_200</item>
<item name="colorOnSecondary">@color/black</item>
<!-- Status bar color. -->
<item name="android:statusBarColor" tools:targetApi="l">?attr/colorPrimaryVariant</item>
<!-- Customize your theme here. -->
</style>
</resources>

@@ -1,10 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<resources>
<color name="purple_200">#FFBB86FC</color>
<color name="purple_500">#FF6200EE</color>
<color name="purple_700">#FF3700B3</color>
<color name="teal_200">#FF03DAC5</color>
<color name="teal_700">#FF018786</color>
<color name="black">#FF000000</color>
<color name="white">#FFFFFFFF</color>
</resources>

@@ -1,3 +0,0 @@
<resources>
<string name="app_name">fayConnectorDemo</string>
</resources>

@@ -1,16 +0,0 @@
<resources xmlns:tools="http://schemas.android.com/tools">
<!-- Base application theme. -->
<style name="Theme.FayConnectorDemo" parent="Theme.MaterialComponents.DayNight.DarkActionBar">
<!-- Primary brand color. -->
<item name="colorPrimary">@color/purple_500</item>
<item name="colorPrimaryVariant">@color/purple_700</item>
<item name="colorOnPrimary">@color/white</item>
<!-- Secondary brand color. -->
<item name="colorSecondary">@color/teal_200</item>
<item name="colorSecondaryVariant">@color/teal_700</item>
<item name="colorOnSecondary">@color/black</item>
<!-- Status bar color. -->
<item name="android:statusBarColor" tools:targetApi="l">?attr/colorPrimaryVariant</item>
<!-- Customize your theme here. -->
</style>
</resources>

@@ -1,13 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!--
Sample backup rules file; uncomment and customize as necessary.
See https://developer.android.com/guide/topics/data/autobackup
for details.
Note: This file is ignored for devices older than API 31
See https://developer.android.com/about/versions/12/backup-restore
-->
<full-backup-content>
<!--
<include domain="sharedpref" path="."/>
<exclude domain="sharedpref" path="device.xml"/>
-->
</full-backup-content>

@@ -1,19 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!--
Sample data extraction rules file; uncomment and customize as necessary.
See https://developer.android.com/about/versions/12/backup-restore#xml-changes
for details.
-->
<data-extraction-rules>
<cloud-backup>
<!-- TODO: Use <include> and <exclude> to control what is backed up.
<include .../>
<exclude .../>
-->
</cloud-backup>
<!--
<device-transfer>
<include .../>
<exclude .../>
</device-transfer>
-->
</data-extraction-rules>

@@ -1,17 +0,0 @@
package com.yaheen.fayconnectordemo;
import org.junit.Test;
import static org.junit.Assert.*;
/**
* Example local unit test, which will execute on the development machine (host).
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
public class ExampleUnitTest {
@Test
public void addition_isCorrect() {
assertEquals(4, 2 + 2);
}
}

@@ -1,9 +0,0 @@
// Top-level build file where you can add configuration options common to all sub-projects/modules.
plugins {
id 'com.android.application' version '7.2.1' apply false
id 'com.android.library' version '7.2.1' apply false
}
task clean(type: Delete) {
delete rootProject.buildDir
}

@@ -1,21 +0,0 @@
# Project-wide Gradle settings.
# IDE (e.g. Android Studio) users:
# Gradle settings configured through the IDE *will override*
# any settings specified in this file.
# For more details on how to configure your build environment visit
# http://www.gradle.org/docs/current/userguide/build_environment.html
# Specifies the JVM arguments used for the daemon process.
# The setting is particularly useful for tweaking memory settings.
org.gradle.jvmargs=-Xmx2048m -Dfile.encoding=UTF-8
# When configured, Gradle will run in incubating parallel mode.
# This option should only be used with decoupled projects. More details, visit
# http://www.gradle.org/docs/current/userguide/multi_project_builds.html#sec:decoupled_projects
# org.gradle.parallel=true
# AndroidX package structure to make it clearer which packages are bundled with the
# Android operating system, and which are packaged with your app's APK
# https://developer.android.com/topic/libraries/support-library/androidx-rn
android.useAndroidX=true
# Enables namespacing of each library's R class so that its R class includes only the
# resources declared in the library itself and none from the library's dependencies,
# thereby reducing the size of the R class for that library
android.nonTransitiveRClass=true

@@ -1,6 +0,0 @@
#Fri Jan 20 09:27:45 CST 2023
distributionBase=GRADLE_USER_HOME
distributionUrl=https\://services.gradle.org/distributions/gradle-7.3.3-bin.zip
distributionPath=wrapper/dists
zipStorePath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME

@@ -1,185 +0,0 @@
#!/usr/bin/env sh
#
# Copyright 2015 the original author or authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS='"-Xmx64m" "-Xms64m"'
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn () {
echo "$*"
}
die () {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
nonstop=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
NONSTOP* )
nonstop=true
;;
esac
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" -a "$nonstop" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_FD_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin or MSYS, switch paths to Windows format before running java
if [ "$cygwin" = "true" -o "$msys" = "true" ] ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
JAVACMD=`cygpath --unix "$JAVACMD"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
SEP=""
for dir in $ROOTDIRSRAW ; do
ROOTDIRS="$ROOTDIRS$SEP$dir"
SEP="|"
done
OURCYGPATTERN="(^($ROOTDIRS))"
# Add a user-defined pattern to the cygpath arguments
if [ "$GRADLE_CYGPATTERN" != "" ] ; then
OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
fi
# Now convert the arguments - kludge to limit ourselves to /bin/sh
i=0
for arg in "$@" ; do
CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
else
eval `echo args$i`="\"$arg\""
fi
i=`expr $i + 1`
done
case $i in
0) set -- ;;
1) set -- "$args0" ;;
2) set -- "$args0" "$args1" ;;
3) set -- "$args0" "$args1" "$args2" ;;
4) set -- "$args0" "$args1" "$args2" "$args3" ;;
5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
esac
fi
# Escape application args
save () {
for i do printf %s\\n "$i" | sed "s/'/'\\\\''/g;1s/^/'/;\$s/\$/' \\\\/" ; done
echo " "
}
APP_ARGS=`save "$@"`
# Collect all arguments for the java command, following the shell quoting and substitution rules
eval set -- $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS "\"-Dorg.gradle.appname=$APP_BASE_NAME\"" -classpath "\"$CLASSPATH\"" org.gradle.wrapper.GradleWrapperMain "$APP_ARGS"
exec "$JAVACMD" "$@"

@@ -1,89 +0,0 @@
@rem
@rem Copyright 2015 the original author or authors.
@rem
@rem Licensed under the Apache License, Version 2.0 (the "License");
@rem you may not use this file except in compliance with the License.
@rem You may obtain a copy of the License at
@rem
@rem https://www.apache.org/licenses/LICENSE-2.0
@rem
@rem Unless required by applicable law or agreed to in writing, software
@rem distributed under the License is distributed on an "AS IS" BASIS,
@rem WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
@rem See the License for the specific language governing permissions and
@rem limitations under the License.
@rem
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Resolve any "." and ".." in APP_HOME to make it shorter.
for %%i in ("%APP_HOME%") do set APP_HOME=%%~fi
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS="-Xmx64m" "-Xms64m"
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto execute
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto execute
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %*
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega

@@ -1,16 +0,0 @@
pluginManagement {
repositories {
gradlePluginPortal()
google()
mavenCentral()
}
}
dependencyResolutionManagement {
repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
repositories {
google()
mavenCentral()
}
}
rootProject.name = "fayConnectorDemo"
include ':app'

@@ -1,15 +0,0 @@
*.iml
.gradle
/local.properties
/.idea/caches
/.idea/libraries
/.idea/modules.xml
/.idea/workspace.xml
/.idea/navEditor.xml
/.idea/assetWizardSettings.xml
.DS_Store
/build
/captures
.externalNativeBuild
.cxx
local.properties

@@ -1,3 +0,0 @@
# Default ignored files
/shelf/
/workspace.xml

@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="CompilerConfiguration">
<bytecodeTargetLevel target="11" />
</component>
</project>

@@ -1,18 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="GradleSettings">
<option name="linkedExternalProjectsSettings">
<GradleProjectSettings>
<option name="testRunner" value="GRADLE" />
<option name="distributionType" value="DEFAULT_WRAPPED" />
<option name="externalProjectPath" value="$PROJECT_DIR$" />
<option name="modules">
<set>
<option value="$PROJECT_DIR$" />
<option value="$PROJECT_DIR$/app" />
</set>
</option>
</GradleProjectSettings>
</option>
</component>
</project>

@@ -1,16 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="DesignSurface">
<option name="filePathToZoomLevelMap">
<map>
<entry key="..\:/android/projects/fayConnectorDemo/app/src/main/res/layout/activity_main.xml" value="0.358695652173913" />
</map>
</option>
</component>
<component name="ProjectRootManager" version="2" languageLevel="JDK_11" default="true" project-jdk-name="Android Studio default JDK" project-jdk-type="JavaSDK">
<output url="file://$PROJECT_DIR$/build/classes" />
</component>
<component name="ProjectType">
<option name="id" value="Android" />
</component>
</project>

@@ -1,38 +0,0 @@
plugins {
id 'com.android.application'
}
android {
compileSdk 32
defaultConfig {
applicationId "com.yaheen.fayconnectordemo"
minSdk 29
targetSdk 32
versionCode 1
versionName "1.0"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
}
dependencies {
implementation 'androidx.appcompat:appcompat:1.3.0'
implementation 'com.google.android.material:material:1.4.0'
implementation 'androidx.constraintlayout:constraintlayout:2.0.4'
testImplementation 'junit:junit:4.13.2'
androidTestImplementation 'androidx.test.ext:junit:1.1.3'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.4.0'
}

@@ -1,21 +0,0 @@
# Add project specific ProGuard rules here.
# You can control the set of applied configuration files using the
# proguardFiles setting in build.gradle.
#
# For more details, see
# http://developer.android.com/guide/developing/tools/proguard.html
# If your project uses WebView with JS, uncomment the following
# and specify the fully qualified class name to the JavaScript interface
# class:
#-keepclassmembers class fqcn.of.javascript.interface.for.webview {
# public *;
#}
# Uncomment this to preserve the line number information for
# debugging stack traces.
#-keepattributes SourceFile,LineNumberTable
# If you keep the line number information, uncomment this to
# hide the original source file name.
#-renamesourcefileattribute SourceFile

@@ -1,26 +0,0 @@
package com.yaheen.fayconnectordemo;
import android.content.Context;
import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import org.junit.Test;
import org.junit.runner.RunWith;
import static org.junit.Assert.*;
/**
* Instrumented test, which will execute on an Android device.
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
@RunWith(AndroidJUnit4.class)
public class ExampleInstrumentedTest {
@Test
public void useAppContext() {
// Context of the app under test.
Context appContext = InstrumentationRegistry.getInstrumentation().getTargetContext();
assertEquals("com.yaheen.fayconnectordemo", appContext.getPackageName());
}
}

@@ -1,35 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
package="com.yaheen.fayconnectordemo">
<uses-permission android:name="android.permission.INTERNET"/><!--网络访问-->
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.CHANGE_WIFI_STATE"/>
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/><!--录音权限`-->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<application
android:allowBackup="true"
android:dataExtractionRules="@xml/data_extraction_rules"
android:fullBackupContent="@xml/backup_rules"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/Theme.FayConnectorDemo"
tools:targetApi="31">
<activity
android:name=".MainActivity"
android:exported="true">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
</application>
</manifest>

@@ -1,249 +0,0 @@
package com.yaheen.fayconnectordemo;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import android.Manifest;
import android.content.pm.PackageManager;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.MediaPlayer;
import android.media.MediaRecorder;
import android.os.Build;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.TextView;
import com.google.android.material.snackbar.Snackbar;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;
import java.net.SocketException;
import java.util.Arrays;
import java.util.Date;
public class MainActivity extends AppCompatActivity {
private TextView tv = null;
private AudioRecord record;
private int recordBufsize = 0;
private Socket socket = null;
private InputStream in = null;
private OutputStream out = null;
private Thread sendThread = null;
private Thread receThread = null;
private boolean running = false;
private File cacheDir = null;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
this.cacheDir = getCacheDir();
tv = this.findViewById(R.id.tv);
tv.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
Log.d("fay","onclick");
running = !running;
sendThread = new Thread(new Runnable() {
@Override
public void run() {
if (!running){// close: stop sending
running = false;
return;
}
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
if (ContextCompat.checkSelfPermission(MainActivity.this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
if (ActivityCompat.shouldShowRequestPermissionRationale(MainActivity.this, Manifest.permission.RECORD_AUDIO)) {
Log.d("fay","用户彻底拒绝了权限");
return;
} else {
// The user has not permanently denied the permission; request it again
ActivityCompat.requestPermissions(MainActivity.this, new String[]{Manifest.permission.RECORD_AUDIO}, 1);
}
}
if (ContextCompat.checkSelfPermission(MainActivity.this, Manifest.permission.RECORD_AUDIO) == PackageManager.PERMISSION_GRANTED) {
Log.d("fay","权限ok");
if (record == null){
recordBufsize = AudioRecord
.getMinBufferSize(16000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT);
record = new AudioRecord(MediaRecorder.AudioSource.MIC,
16000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT,
recordBufsize);
}
try {
socket = new Socket("192.168.1.101", 10001);
in = socket.getInputStream();
out = socket.getOutputStream();
Snackbar.make(view, "fay控制器连接成功", Snackbar.LENGTH_SHORT)
.setAction("Action", null).show();
Log.d("fay","fay控制器连接成功");
}catch(IOException e){
Log.d("fay","socket连接失败");
return;
}
byte[] data = new byte[1024];
record.startRecording();
Snackbar.make(view, "麦克风启动成功", Snackbar.LENGTH_SHORT)
.setAction("Action", null).show();
Log.d("fay","麦克风启动成功");
try {
Snackbar.make(view, "开始传输音频", Snackbar.LENGTH_SHORT)
.setAction("Action", null).show();
Log.d("fay","开始传输音频");
while (MainActivity.this.running) {
record.read(data, 0, 1024);
if (data.length > 0) {
MainActivity.this.out.write(data);
}
}
}catch (Exception e){ // the connection is closed by way of this exception
Log.d("fay","服务端关闭");
Snackbar.make(view, "服务端已经关闭", Snackbar.LENGTH_SHORT)
.setAction("Action", null).show();
}finally {
running = false;
record.stop();
record = null;
try {
socket.close();
}catch (Exception e){
}
Snackbar.make(view, "结束", Snackbar.LENGTH_SHORT)
.setAction("Action", null).show();
Log.d("fay","结束");
}
}
}
}
});
sendThread.start();
receThread = new Thread(new Runnable() {
@Override
public void run() {
try {
while (running) {
while (socket != null && !socket.isClosed()) {
byte[] data = new byte[9];
byte[] wavhead = new byte[]{0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08};// start-of-file-transfer marker
in.read(data);
if (Arrays.equals(wavhead, data)) {
Log.d("fay", "开始接收音频文件");
String filedata = "";
data = new byte[1024];
while (data != null && data.length > 0) {
in.read(data);
filedata += MainActivity.bytesToHexString(data);
int index = filedata.indexOf("080706050403020100");
if (filedata.length() > 9 && index > 0){// end-of-wav-file marker found
filedata = filedata.substring(0, index).replaceAll("F0F1F2F3F4F5F6F7F8", "");
File wavFile = new File(cacheDir, String.format("sample-%s.wav", new Date().getTime() + ""));
wavFile.createNewFile();
FileOutputStream fos = new FileOutputStream(wavFile);
fos.write(MainActivity.decodeHexBytes(filedata.toCharArray()));
fos.close();
Log.d("fay", "wav文件接收完成:" + wavFile.getAbsolutePath() + "," + filedata.length() / 2);
try{
MediaPlayer player = new MediaPlayer();
player.setDataSource(wavFile.getAbsolutePath());
player.prepare();
Thread.sleep(800);
player.start();
player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mp) {
// TODO Auto-generated method stub
mp.release();
}
});
player.setLooping(false);
} catch (IOException e) {
Log.e("fay", e.toString());
}
break;
}
}
try {
Thread.sleep(1000);
} catch (Exception e) {
}
}
}
try {
Thread.sleep(1000);
} catch (Exception e) {
}
}
} catch (Exception e) {
Log.e("fay", e.toString());
}finally {
}
}});
receThread.start();
}
});
}
public static String bytesToHexString(byte[] data){
String result="";
for (int i = 0; i < data.length; i++) {
result+=Integer.toHexString((data[i] & 0xFF) | 0x100).toUpperCase().substring(1, 3);
}
return result;
}
public static byte[] decodeHexBytes(char[] data) {
int len = data.length;
if ((len & 0x01) != 0) {
throw new RuntimeException("未知的字符");
}
byte[] out = new byte[len >> 1];
for (int i = 0, j = 0; j < len; i++) {
int f = toDigit(data[j], j) << 4;
j++;
f = f | toDigit(data[j], j);
j++;
out[i] = (byte) (f & 0xFF);
}
return out;
}
protected static int toDigit(char ch, int index) {
int digit = Character.digit(ch, 16);
if (digit == -1) {
throw new RuntimeException("非法16进制字符 " + ch
+ " 在索引 " + index);
}
return digit;
}
}
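The receive loop in this version implies a simple wire framing from the Fay controller: a 9-byte start marker (0x00 through 0x08), the raw wav bytes, and a 9-byte end marker (0x08 down to 0x00); the exact filler sequence 0xF0..0xF8 is stripped from the payload when it appears. The sketch below shows the sender side of that framing as inferred from the receiver; it is an assumption, not code taken from the Fay controller, and WavFrameSender is a hypothetical name:

import java.io.IOException;
import java.io.OutputStream;

// Hypothetical sender-side framing, mirrored from the receive loop above.
public class WavFrameSender {
    // Start marker the receiver compares against before collecting audio data.
    private static final byte[] START = {0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08};
    // End marker the receiver scans for (as hex "080706050403020100") to close the file.
    private static final byte[] END = {0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00};

    public static void sendWav(OutputStream out, byte[] wavBytes) throws IOException {
        out.write(START);    // announce a new wav file
        out.write(wavBytes); // raw payload; the receiver hex-encodes and buffers it
        out.write(END);      // terminate the file so the receiver can write and play it
        out.flush();
    }
}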

@@ -1,30 +0,0 @@
<vector xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:aapt="http://schemas.android.com/aapt"
android:width="108dp"
android:height="108dp"
android:viewportWidth="108"
android:viewportHeight="108">
<path android:pathData="M31,63.928c0,0 6.4,-11 12.1,-13.1c7.2,-2.6 26,-1.4 26,-1.4l38.1,38.1L107,108.928l-32,-1L31,63.928z">
<aapt:attr name="android:fillColor">
<gradient
android:endX="85.84757"
android:endY="92.4963"
android:startX="42.9492"
android:startY="49.59793"
android:type="linear">
<item
android:color="#44000000"
android:offset="0.0" />
<item
android:color="#00000000"
android:offset="1.0" />
</gradient>
</aapt:attr>
</path>
<path
android:fillColor="#FFFFFF"
android:fillType="nonZero"
android:pathData="M65.3,45.828l3.8,-6.6c0.2,-0.4 0.1,-0.9 -0.3,-1.1c-0.4,-0.2 -0.9,-0.1 -1.1,0.3l-3.9,6.7c-6.3,-2.8 -13.4,-2.8 -19.7,0l-3.9,-6.7c-0.2,-0.4 -0.7,-0.5 -1.1,-0.3C38.8,38.328 38.7,38.828 38.9,39.228l3.8,6.6C36.2,49.428 31.7,56.028 31,63.928h46C76.3,56.028 71.8,49.428 65.3,45.828zM43.4,57.328c-0.8,0 -1.5,-0.5 -1.8,-1.2c-0.3,-0.7 -0.1,-1.5 0.4,-2.1c0.5,-0.5 1.4,-0.7 2.1,-0.4c0.7,0.3 1.2,1 1.2,1.8C45.3,56.528 44.5,57.328 43.4,57.328L43.4,57.328zM64.6,57.328c-0.8,0 -1.5,-0.5 -1.8,-1.2s-0.1,-1.5 0.4,-2.1c0.5,-0.5 1.4,-0.7 2.1,-0.4c0.7,0.3 1.2,1 1.2,1.8C66.5,56.528 65.6,57.328 64.6,57.328L64.6,57.328z"
android:strokeWidth="1"
android:strokeColor="#00000000" />
</vector>

@@ -1,170 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<vector xmlns:android="http://schemas.android.com/apk/res/android"
android:width="108dp"
android:height="108dp"
android:viewportWidth="108"
android:viewportHeight="108">
<path
android:fillColor="#3DDC84"
android:pathData="M0,0h108v108h-108z" />
<path
android:fillColor="#00000000"
android:pathData="M9,0L9,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,0L19,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M29,0L29,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M39,0L39,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M49,0L49,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M59,0L59,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M69,0L69,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M79,0L79,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M89,0L89,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M99,0L99,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,9L108,9"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,19L108,19"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,29L108,29"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,39L108,39"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,49L108,49"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,59L108,59"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,69L108,69"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,79L108,79"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,89L108,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,99L108,99"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,29L89,29"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,39L89,39"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,49L89,49"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,59L89,59"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,69L89,69"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,79L89,79"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M29,19L29,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M39,19L39,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M49,19L49,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M59,19L59,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M69,19L69,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M79,19L79,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
</vector>

@@ -1,19 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="连接/断开fay控制器"
android:id="@+id/tv"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>

@@ -1,5 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@drawable/ic_launcher_foreground" />
</adaptive-icon>

@@ -1,5 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@drawable/ic_launcher_foreground" />
</adaptive-icon>

@@ -1,16 +0,0 @@
<resources xmlns:tools="http://schemas.android.com/tools">
<!-- Base application theme. -->
<style name="Theme.FayConnectorDemo" parent="Theme.MaterialComponents.DayNight.DarkActionBar">
<!-- Primary brand color. -->
<item name="colorPrimary">@color/purple_200</item>
<item name="colorPrimaryVariant">@color/purple_700</item>
<item name="colorOnPrimary">@color/black</item>
<!-- Secondary brand color. -->
<item name="colorSecondary">@color/teal_200</item>
<item name="colorSecondaryVariant">@color/teal_200</item>
<item name="colorOnSecondary">@color/black</item>
<!-- Status bar color. -->
<item name="android:statusBarColor" tools:targetApi="l">?attr/colorPrimaryVariant</item>
<!-- Customize your theme here. -->
</style>
</resources>

@@ -1,10 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<resources>
<color name="purple_200">#FFBB86FC</color>
<color name="purple_500">#FF6200EE</color>
<color name="purple_700">#FF3700B3</color>
<color name="teal_200">#FF03DAC5</color>
<color name="teal_700">#FF018786</color>
<color name="black">#FF000000</color>
<color name="white">#FFFFFFFF</color>
</resources>

@@ -1,3 +0,0 @@
<resources>
<string name="app_name">fayConnectorDemo</string>
</resources>

@@ -1,16 +0,0 @@
<resources xmlns:tools="http://schemas.android.com/tools">
<!-- Base application theme. -->
<style name="Theme.FayConnectorDemo" parent="Theme.MaterialComponents.DayNight.DarkActionBar">
<!-- Primary brand color. -->
<item name="colorPrimary">@color/purple_500</item>
<item name="colorPrimaryVariant">@color/purple_700</item>
<item name="colorOnPrimary">@color/white</item>
<!-- Secondary brand color. -->
<item name="colorSecondary">@color/teal_200</item>
<item name="colorSecondaryVariant">@color/teal_700</item>
<item name="colorOnSecondary">@color/black</item>
<!-- Status bar color. -->
<item name="android:statusBarColor" tools:targetApi="l">?attr/colorPrimaryVariant</item>
<!-- Customize your theme here. -->
</style>
</resources>

@@ -1,13 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!--
Sample backup rules file; uncomment and customize as necessary.
See https://developer.android.com/guide/topics/data/autobackup
for details.
Note: This file is ignored for devices older than API 31
See https://developer.android.com/about/versions/12/backup-restore
-->
<full-backup-content>
<!--
<include domain="sharedpref" path="."/>
<exclude domain="sharedpref" path="device.xml"/>
-->
</full-backup-content>

@@ -1,19 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!--
Sample data extraction rules file; uncomment and customize as necessary.
See https://developer.android.com/about/versions/12/backup-restore#xml-changes
for details.
-->
<data-extraction-rules>
<cloud-backup>
<!-- TODO: Use <include> and <exclude> to control what is backed up.
<include .../>
<exclude .../>
-->
</cloud-backup>
<!--
<device-transfer>
<include .../>
<exclude .../>
</device-transfer>
-->
</data-extraction-rules>

@@ -1,17 +0,0 @@
package com.yaheen.fayconnectordemo;
import org.junit.Test;
import static org.junit.Assert.*;
/**
* Example local unit test, which will execute on the development machine (host).
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
public class ExampleUnitTest {
@Test
public void addition_isCorrect() {
assertEquals(4, 2 + 2);
}
}

@@ -1,9 +0,0 @@
// Top-level build file where you can add configuration options common to all sub-projects/modules.
plugins {
id 'com.android.application' version '7.2.1' apply false
id 'com.android.library' version '7.2.1' apply false
}
task clean(type: Delete) {
delete rootProject.buildDir
}

Some files were not shown because too many files have changed in this diff.