Add coreseek Dockerfile

FengJunSun 2016-12-17 19:39:01 +08:00
parent f41ae09edf
commit df74e47110
2626 changed files with 987645 additions and 1 deletions

.gitignore (vendored, Normal file, +2)

@@ -0,0 +1,2 @@
.DS_Store
sphinx.conf

Dockerfile (Normal file, +51)

@@ -0,0 +1,51 @@
FROM ubuntu:12.04
MAINTAINER Mark mark@douwantech.com
#ADD ./sources.list /etc/apt/sources.list
RUN apt-get update
#RUN apt-get upgrade -y
RUN DEBIAN_FRONTEND=noninteractive apt-get install -y \
build-essential \
make \
gcc \
g++ \
automake \
libtool \
mysql-client \
libmysqlclient15-dev \
libxml2-dev \
libexpat1-dev \
cron
RUN mkdir -p /usr/local/src/coreseek
ADD ./coreseek /usr/local/src/coreseek
RUN chmod 755 -R /usr/local/src/coreseek
WORKDIR /usr/local/src/coreseek/mmseg-3.2.14
RUN ./bootstrap
RUN ./configure
RUN make && make install
WORKDIR /usr/local/src/coreseek/csft-4.1
RUN ./buildconf.sh
RUN ./configure --without-unixodbc --with-mmseg --with-mysql
RUN make && make install
ADD ./cron/sphinx /etc/cron.hourly/sphinx
VOLUME ["/usr/local/etc/sphinx", "/var/log/sphinx"]
RUN ln -s /usr/local/etc/sphinx/sphinx.conf /usr/local/etc/csft.conf
RUN mkdir -p /var/sphinx/log/
RUN mkdir -p /var/sphinx/data/
WORKDIR /
EXPOSE 9312
ADD ./entrypoint.sh /
RUN chmod +x /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
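The `VOLUME` and `ln -s` steps above work together: the host mounts a directory at `/usr/local/etc/sphinx`, and the symlink exposes its `sphinx.conf` at the path the coreseek tools actually read (`/usr/local/etc/csft.conf`). A minimal sketch of that indirection against a scratch directory (paths mirror the Dockerfile; the config content is a placeholder):

```python
import os
import tempfile

# Scratch root standing in for the container filesystem.
root = tempfile.mkdtemp()
mounted = os.path.join(root, "usr/local/etc/sphinx")
os.makedirs(mounted)

# Stand-in for the sphinx.conf supplied via the host-mounted volume.
conf_src = os.path.join(mounted, "sphinx.conf")
with open(conf_src, "w") as f:
    f.write("searchd { listen = 0.0.0.0:9312 }\n")

# Same link the Dockerfile creates: csft.conf -> mounted sphinx.conf.
conf_link = os.path.join(root, "usr/local/etc/csft.conf")
os.symlink(conf_src, conf_link)

# Reading csft.conf resolves through the symlink to the mounted file,
# so swapping the host directory swaps the effective config.
print(open(conf_link).read())
```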


@@ -1,2 +1,78 @@
# Dockerfile-Coreseek

Sphinx-based Chinese full-text search.

Coreseek is a Chinese full-text retrieval/search application, released as open source under the GPLv2 license. Built on Sphinx and released independently, it focuses on Chinese search and information processing, and suits industry/vertical search, forum/site search, database search, document retrieval, information retrieval, data mining, and similar scenarios. It is free to download and use.

Docker provides an envelope, or container, in which you can run your application. It began as a side project at dotCloud and was open-sourced some time ago; it attracted so much attention and discussion that dotCloud renamed itself Docker Inc. Written in Go, Docker is essentially plumbing on top of LXC (LinuX Containers) that lets developers work at a higher level of abstraction.

# Create a configuration file `/path/sphinx/sphinx.conf`
```
################# sphinx config #######################
source search
{
type = mysql
sql_host = dbhost
sql_user = dbuser
sql_pass = dbpass
sql_db = db
sql_port = 3306
sql_query_pre = SET NAMES utf8
sql_query_pre = SET SESSION query_cache_type=OFF
sql_query = SELECT *,CASE WHEN `kind` = 'News' THEN 2 WHEN `kind` = 'Activity' THEN 1 WHEN `kind` = 'Service' THEN 0 END AS kind2 FROM searches
sql_attr_uint = item_id
sql_attr_uint = kind2
sql_attr_timestamp = updated_at
sql_attr_timestamp = created_at
sql_ranged_throttle = 0
}
index search
{
source = search
path = /var/sphinx/data/search
docinfo = extern
mlock = 0
preopen = 1
min_word_len = 1
charset_type = zh_cn.utf-8
charset_dictpath = /usr/local/etc/
min_prefix_len = 0
min_infix_len = 1
ngram_len = 0
}
indexer
{
mem_limit = 1024M
write_buffer = 4M
}
searchd
{
listen = 0.0.0.0:9312
log = /var/sphinx/log/sphinx.log
query_log = /var/sphinx/log/query.log
read_timeout = 2
max_children = 0
pid_file = /var/run/sphinx.pid
max_matches = 100000
seamless_rotate = 1
preopen_indexes = 0
unlink_old = 1
read_buffer = 8M
compat_sphinxql_magics = 0
}
# EOF
```
# Start the container with `docker`
```
docker run --name sphinx -v /root/sphinx/sphinx:/usr/local/etc/sphinx -p 9312:9312 -i registry.cn-hangzhou.aliyuncs.com/ror/coreseek:latest
```
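Once the container is up, a quick stdlib check can confirm that searchd is accepting TCP connections on the mapped port (a minimal sketch; the host and port follow the `listen` setting and the `-p 9312:9312` mapping above):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# With the container running, the searchd port mapped by `-p 9312:9312`
# should be reachable:
# port_open("127.0.0.1", 9312)
```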

coreseek/README.txt (Executable file, +67)

@@ -0,0 +1,67 @@
For the latest documentation, see: http://www.coreseek.cn/products/products-install/
Directory layout:
csft-x.y.z: coreseek source code
mmseg-i.j.k: mmseg source code
testpack: test configuration and data package
testpack notes
Directory layout:
api: API interfaces and test scripts
etc: configuration files
etc/pysource: python data-source scripts
var: runtime data
var/data: index files
var/log: search logs
var/test: test source data
Configuration 1:
Tests: xml data source, Chinese word segmentation and search
Config file: etc/csft.conf
Test data: var/test/test.xml
PHP program: api/test_coreseek.php
Online docs: http://www.coreseek.cn/products/products-install/install_on_bsd_linux/
Configuration 2:
Tests: xml data source, single-character (unigram) segmentation and search
Config file: etc/csft_cjk.conf
Test data: var/test/test.xml
PHP program: api/test_coreseek.php
Online docs: http://www.coreseek.cn/products-install/ngram_len_cjk/
Configuration 3:
Tests: mysql data source, Chinese word segmentation and search
Config file: etc/csft_mysql.conf
Test data: var/test/documents.sql
PHP program: api/test_coreseek.php
Note: import the test data into the database first, and set the MySQL user, password, and database in the config file.
Online docs: http://www.coreseek.cn/products-install/mysql/
Configuration 4:
Tests: python data source, Chinese word segmentation and search
Config file: etc/csft_demo_python.conf
Data script: etc/pysource/csft_demo/__init__.py
PHP program: api/test_coreseek.php
Note: install Python 2.6 (x86) first.
Online docs: http://www.coreseek.cn/products-install/python/
Configuration 5:
Tests: python+mssql data source, Chinese word segmentation and search
Config file: etc/csft_demo_python_pymssql.conf
Data script: etc/pysource/csft_demo_pymssql/__init__.py
PHP program: api/test_coreseek.php
Note: install Python 2.6 (x86) and pymssql (py2.6) first.
Online docs: http://www.coreseek.cn/products-install/python/
coreseek-4.y.z tests
Configuration 6:
Tests: RT (real-time) index, Chinese word segmentation and search
Config file: etc/csft_rtindex.conf
PHP program: api/test_coreseek_rtindex.php
Online docs: http://www.coreseek.cn/products-install/rt-indexes/
Configuration 7:
Tests: RT (real-time) index, single-character (unigram) segmentation and search
Config file: etc/csft_rtindex_cjk.conf
PHP program: api/test_coreseek_rtindex.php
Online docs: http://www.coreseek.cn/products-install/rt-indexes/

coreseek/csft-4.1/COPYING (Executable file, +344)

@@ -0,0 +1,344 @@
GNU GENERAL PUBLIC LICENSE
Version 2, June 1991
Copyright (C) 1989, 1991 Free Software Foundation, Inc.
59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Library General Public License instead.) You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.
We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.
Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.
Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.
The precise terms and conditions for copying, distribution and
modification follow.
GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.
1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.
You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.
2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.
b) You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.
c) If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.
In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:
a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,
b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,
c) Accompany it with the information you received as to the offer
to distribute corresponding source code. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)
The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.
If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.
4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.
5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.
6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.
7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.
If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.
9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.
10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.
NO WARRANTY
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) 19yy <name of author>
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
Also add information on how to contact you by electronic and paper mail.
If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:
Gnomovision version 69, Copyright (C) 19yy name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the program
`Gnomovision' (which makes passes at compilers) written by James Hacker.
<signature of Ty Coon>, 1 April 1989
Ty Coon, President of Vice
This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library. If this is what you want to do, use the GNU Library General
Public License instead of this License.

coreseek/csft-4.1/INSTALL (Executable file, +1)

@@ -0,0 +1 @@
Please refer to <<Installation>> section in doc/sphinx.txt or doc/sphinx.html.

coreseek/csft-4.1/Makefile (Normal file, +788)

@@ -0,0 +1,788 @@
# Makefile.in generated by automake 1.11.3 from Makefile.am.
# Makefile. Generated from Makefile.in by configure.
# Copyright (C) 1994, 1995, 1996, 1997, 1998, 1999, 2000, 2001, 2002,
# 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011 Free Software
# Foundation, Inc.
# This Makefile.in is free software; the Free Software Foundation
# gives unlimited permission to copy and/or distribute it,
# with or without modifications, as long as this notice is preserved.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY, to the extent permitted by law; without
# even the implied warranty of MERCHANTABILITY or FITNESS FOR A
# PARTICULAR PURPOSE.
pkgdatadir = $(datadir)/sphinx
pkgincludedir = $(includedir)/sphinx
pkglibdir = $(libdir)/sphinx
pkglibexecdir = $(libexecdir)/sphinx
am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd
install_sh_DATA = $(install_sh) -c -m 644
install_sh_PROGRAM = $(install_sh) -c
install_sh_SCRIPT = $(install_sh) -c
INSTALL_HEADER = $(INSTALL_DATA)
transform = $(program_transform_name)
NORMAL_INSTALL = :
PRE_INSTALL = :
POST_INSTALL = :
NORMAL_UNINSTALL = :
PRE_UNINSTALL = :
POST_UNINSTALL = :
subdir = .
DIST_COMMON = $(am__configure_deps) $(srcdir)/Makefile.am \
$(srcdir)/Makefile.in $(srcdir)/sphinx-min.conf.in \
$(srcdir)/sphinx.conf.in $(top_srcdir)/config/config.h.in \
$(top_srcdir)/configure COPYING INSTALL config/depcomp \
config/install-sh config/missing
ACLOCAL_M4 = $(top_srcdir)/aclocal.m4
am__aclocal_m4_deps = $(top_srcdir)/acinclude.m4 \
$(top_srcdir)/python.m4 $(top_srcdir)/configure.ac
am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \
$(ACLOCAL_M4)
am__CONFIG_DISTCLEAN_FILES = config.status config.cache config.log \
configure.lineno config.status.lineno
mkinstalldirs = $(install_sh) -d
CONFIG_HEADER = $(top_builddir)/config/config.h
CONFIG_CLEAN_FILES = sphinx.conf.dist sphinx-min.conf.dist
CONFIG_CLEAN_VPATH_FILES =
SOURCES =
DIST_SOURCES =
RECURSIVE_TARGETS = all-recursive check-recursive dvi-recursive \
html-recursive info-recursive install-data-recursive \
install-dvi-recursive install-exec-recursive \
install-html-recursive install-info-recursive \
install-pdf-recursive install-ps-recursive install-recursive \
installcheck-recursive installdirs-recursive pdf-recursive \
ps-recursive uninstall-recursive
am__vpath_adj_setup = srcdirstrip=`echo "$(srcdir)" | sed 's|.|.|g'`;
am__vpath_adj = case $$p in \
$(srcdir)/*) f=`echo "$$p" | sed "s|^$$srcdirstrip/||"`;; \
*) f=$$p;; \
esac;
am__strip_dir = f=`echo $$p | sed -e 's|^.*/||'`;
am__install_max = 40
am__nobase_strip_setup = \
srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*|]/\\\\&/g'`
am__nobase_strip = \
for p in $$list; do echo "$$p"; done | sed -e "s|$$srcdirstrip/||"
am__nobase_list = $(am__nobase_strip_setup); \
for p in $$list; do echo "$$p $$p"; done | \
sed "s| $$srcdirstrip/| |;"' / .*\//!s/ .*/ ./; s,\( .*\)/[^/]*$$,\1,' | \
$(AWK) 'BEGIN { files["."] = "" } { files[$$2] = files[$$2] " " $$1; \
if (++n[$$2] == $(am__install_max)) \
{ print $$2, files[$$2]; n[$$2] = 0; files[$$2] = "" } } \
END { for (dir in files) print dir, files[dir] }'
am__base_list = \
sed '$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;s/\n/ /g' | \
sed '$$!N;$$!N;$$!N;$$!N;s/\n/ /g'
am__uninstall_files_from_dir = { \
test -z "$$files" \
|| { test ! -d "$$dir" && test ! -f "$$dir" && test ! -r "$$dir"; } \
|| { echo " ( cd '$$dir' && rm -f" $$files ")"; \
$(am__cd) "$$dir" && rm -f $$files; }; \
}
am__installdirs = "$(DESTDIR)$(sysconfdir)"
DATA = $(sysconf_DATA)
RECURSIVE_CLEAN_TARGETS = mostlyclean-recursive clean-recursive \
distclean-recursive maintainer-clean-recursive
AM_RECURSIVE_TARGETS = $(RECURSIVE_TARGETS:-recursive=) \
$(RECURSIVE_CLEAN_TARGETS:-recursive=) tags TAGS ctags CTAGS \
distdir dist dist-all distcheck
ETAGS = etags
CTAGS = ctags
DIST_SUBDIRS = src test doc libstemmer_c
DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST)
distdir = $(PACKAGE)-$(VERSION)
top_distdir = $(distdir)
am__remove_distdir = \
if test -d "$(distdir)"; then \
find "$(distdir)" -type d ! -perm -200 -exec chmod u+w {} ';' \
&& rm -rf "$(distdir)" \
|| { sleep 5 && rm -rf "$(distdir)"; }; \
else :; fi
am__relativize = \
dir0=`pwd`; \
sed_first='s,^\([^/]*\)/.*$$,\1,'; \
sed_rest='s,^[^/]*/*,,'; \
sed_last='s,^.*/\([^/]*\)$$,\1,'; \
sed_butlast='s,/*[^/]*$$,,'; \
while test -n "$$dir1"; do \
first=`echo "$$dir1" | sed -e "$$sed_first"`; \
if test "$$first" != "."; then \
if test "$$first" = ".."; then \
dir2=`echo "$$dir0" | sed -e "$$sed_last"`/"$$dir2"; \
dir0=`echo "$$dir0" | sed -e "$$sed_butlast"`; \
else \
first2=`echo "$$dir2" | sed -e "$$sed_first"`; \
if test "$$first2" = "$$first"; then \
dir2=`echo "$$dir2" | sed -e "$$sed_rest"`; \
else \
dir2="../$$dir2"; \
fi; \
dir0="$$dir0"/"$$first"; \
fi; \
fi; \
dir1=`echo "$$dir1" | sed -e "$$sed_rest"`; \
done; \
reldir="$$dir2"
DIST_ARCHIVES = $(distdir).tar.gz
GZIP_ENV = --best
distuninstallcheck_listfiles = find . -type f -print
am__distuninstallcheck_listfiles = $(distuninstallcheck_listfiles) \
| sed 's|^\./|$(prefix)/|' | grep -v '$(infodir)/dir$$'
distcleancheck_listfiles = find . -type f -print
ACLOCAL = ${SHELL} /home/coreseek/coreseek-4.1-beta/csft-4.1/config/missing --run aclocal-1.11
AMTAR = $${TAR-tar}
AUTOCONF = ${SHELL} /home/coreseek/coreseek-4.1-beta/csft-4.1/config/missing --run autoconf
AUTOHEADER = ${SHELL} /home/coreseek/coreseek-4.1-beta/csft-4.1/config/missing --run autoheader
AUTOMAKE = ${SHELL} /home/coreseek/coreseek-4.1-beta/csft-4.1/config/missing --run automake-1.11
AWK = gawk
CC = gcc
CCDEPMODE = depmode=gcc3
CFLAGS = -Wall -g -D_FILE_OFFSET_BITS=64 -O3 -DNDEBUG
CONFDIR = /usr/local/coreseek/var
CPP = gcc -E
CPPFLAGS = -I/usr/local/include -pthread -I/usr/include/mysql -DBIG_JOINS=1 -fno-strict-aliasing -g -I/usr/local/mmseg3/include/mmseg/
CXX = g++
CXXDEPMODE = depmode=gcc3
CXXFLAGS = -Wall -g -D_FILE_OFFSET_BITS=64 -O3 -DNDEBUG
CYGPATH_W = echo
DEFS = -DHAVE_CONFIG_H
DEPDIR = .deps
ECHO_C =
ECHO_N = -n
ECHO_T =
EGREP = /bin/grep -E
EXEEXT =
GREP = /bin/grep
INSTALL = /usr/bin/install -c
INSTALL_DATA = ${INSTALL} -m 644
INSTALL_PROGRAM = ${INSTALL}
INSTALL_SCRIPT = ${INSTALL}
INSTALL_STRIP_PROGRAM = $(install_sh) -c -s
LDFLAGS =
LIBOBJS =
LIBRT = -lrt
LIBS = -ldl -lm -lz -lexpat -L/usr/local/lib -lrt -lpthread
LTLIBOBJS =
MAINT = #
MAKEINFO = ${SHELL} /home/coreseek/coreseek-4.1-beta/csft-4.1/config/missing --run makeinfo
MKDIR_P = /bin/mkdir -p
MMSEG_CFLAGS = -I/usr/local/mmseg3/include/mmseg/
MMSEG_LIBS = -L/usr/local/mmseg3/lib/ -lmmseg
MYSQL_CFLAGS = -I/usr/include/mysql -DBIG_JOINS=1 -fno-strict-aliasing -g
MYSQL_LIBS = -L/usr/lib/x86_64-linux-gnu -lmysqlclient -lpthread -lz -lm -lrt -ldl
OBJEXT = o
PACKAGE = sphinx
PACKAGE_BUGREPORT = shodan(at)shodan.ru
PACKAGE_NAME = sphinx
PACKAGE_STRING = sphinx 1.11
PACKAGE_TARNAME = sphinx
PACKAGE_URL =
PACKAGE_VERSION = 1.11
PATH_SEPARATOR = :
PGSQL_CFLAGS =
PGSQL_LIBS =
PYTHON = /usr/bin/python
PYTHON_CPPFLAGS =
PYTHON_EXEC_PREFIX = ${exec_prefix}
PYTHON_LIBS =
PYTHON_PLATFORM = linux2
PYTHON_PREFIX = ${prefix}
PYTHON_VERSION = 2.7
RANLIB = ranlib
SET_MAKE =
SHELL = /bin/bash
STRIP =
VERSION = 1.11
abs_builddir = /home/coreseek/coreseek-4.1-beta/csft-4.1
abs_srcdir = /home/coreseek/coreseek-4.1-beta/csft-4.1
abs_top_builddir = /home/coreseek/coreseek-4.1-beta/csft-4.1
abs_top_srcdir = /home/coreseek/coreseek-4.1-beta/csft-4.1
ac_ct_CC = gcc
ac_ct_CXX = g++
am__include = include
am__leading_dot = .
am__quote =
am__tar = $${TAR-tar} chof - "$$tardir"
am__untar = $${TAR-tar} xf -
bindir = ${exec_prefix}/bin
build_alias =
builddir = .
datadir = ${datarootdir}
datarootdir = ${prefix}/share
docdir = ${datarootdir}/doc/${PACKAGE_TARNAME}
dvidir = ${docdir}
exec_prefix = ${prefix}
host_alias =
htmldir = ${docdir}
includedir = ${prefix}/include
infodir = ${datarootdir}/info
install_sh = ${SHELL} /home/coreseek/coreseek-4.1-beta/csft-4.1/config/install-sh
libdir = ${exec_prefix}/lib
libexecdir = ${exec_prefix}/libexec
localedir = ${datarootdir}/locale
localstatedir = ${prefix}/var
mandir = ${datarootdir}/man
mkdir_p = /bin/mkdir -p
oldincludedir = /usr/include
pdfdir = ${docdir}
pgconfig =
pkgpyexecdir = ${pyexecdir}/sphinx
pkgpythondir = ${pythondir}/sphinx
prefix = /usr/local/coreseek
program_transform_name = s,x,x,
psdir = ${docdir}
pyexecdir = ${exec_prefix}/lib/python2.7/site-packages
pythondir = ${prefix}/lib/python2.7/site-packages
sbindir = ${exec_prefix}/sbin
sharedstatedir = ${prefix}/com
srcdir = .
sysconfdir = ${prefix}/etc
target_alias =
top_build_prefix =
top_builddir = .
top_srcdir = .
SUBDIRS = src test doc
#SUBDIRS = libstemmer_c src test doc
EXTRA_DIST = api storage sphinx.conf.in sphinx-min.conf.in example.sql
sysconf_DATA = sphinx.conf.dist sphinx-min.conf.dist example.sql
all: all-recursive
.SUFFIXES:
am--refresh: Makefile
@:
$(srcdir)/Makefile.in: # $(srcdir)/Makefile.am $(am__configure_deps)
@for dep in $?; do \
case '$(am__configure_deps)' in \
*$$dep*) \
echo ' cd $(srcdir) && $(AUTOMAKE) --foreign'; \
$(am__cd) $(srcdir) && $(AUTOMAKE) --foreign \
&& exit 0; \
exit 1;; \
esac; \
done; \
echo ' cd $(top_srcdir) && $(AUTOMAKE) --foreign Makefile'; \
$(am__cd) $(top_srcdir) && \
$(AUTOMAKE) --foreign Makefile
.PRECIOUS: Makefile
Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status
@case '$?' in \
*config.status*) \
echo ' $(SHELL) ./config.status'; \
$(SHELL) ./config.status;; \
*) \
echo ' cd $(top_builddir) && $(SHELL) ./config.status $@ $(am__depfiles_maybe)'; \
cd $(top_builddir) && $(SHELL) ./config.status $@ $(am__depfiles_maybe);; \
esac;
$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES)
$(SHELL) ./config.status --recheck
$(top_srcdir)/configure: # $(am__configure_deps)
$(am__cd) $(srcdir) && $(AUTOCONF)
$(ACLOCAL_M4): # $(am__aclocal_m4_deps)
$(am__cd) $(srcdir) && $(ACLOCAL) $(ACLOCAL_AMFLAGS)
$(am__aclocal_m4_deps):
config/config.h: config/stamp-h1
@if test ! -f $@; then rm -f config/stamp-h1; else :; fi
@if test ! -f $@; then $(MAKE) $(AM_MAKEFLAGS) config/stamp-h1; else :; fi
config/stamp-h1: $(top_srcdir)/config/config.h.in $(top_builddir)/config.status
@rm -f config/stamp-h1
cd $(top_builddir) && $(SHELL) ./config.status config/config.h
$(top_srcdir)/config/config.h.in: # $(am__configure_deps)
($(am__cd) $(top_srcdir) && $(AUTOHEADER))
rm -f config/stamp-h1
touch $@
distclean-hdr:
-rm -f config/config.h config/stamp-h1
sphinx.conf.dist: $(top_builddir)/config.status $(srcdir)/sphinx.conf.in
cd $(top_builddir) && $(SHELL) ./config.status $@
sphinx-min.conf.dist: $(top_builddir)/config.status $(srcdir)/sphinx-min.conf.in
cd $(top_builddir) && $(SHELL) ./config.status $@
install-sysconfDATA: $(sysconf_DATA)
@$(NORMAL_INSTALL)
test -z "$(sysconfdir)" || $(MKDIR_P) "$(DESTDIR)$(sysconfdir)"
@list='$(sysconf_DATA)'; test -n "$(sysconfdir)" || list=; \
for p in $$list; do \
if test -f "$$p"; then d=; else d="$(srcdir)/"; fi; \
echo "$$d$$p"; \
done | $(am__base_list) | \
while read files; do \
echo " $(INSTALL_DATA) $$files '$(DESTDIR)$(sysconfdir)'"; \
$(INSTALL_DATA) $$files "$(DESTDIR)$(sysconfdir)" || exit $$?; \
done
uninstall-sysconfDATA:
@$(NORMAL_UNINSTALL)
@list='$(sysconf_DATA)'; test -n "$(sysconfdir)" || list=; \
files=`for p in $$list; do echo $$p; done | sed -e 's|^.*/||'`; \
dir='$(DESTDIR)$(sysconfdir)'; $(am__uninstall_files_from_dir)
# This directory's subdirectories are mostly independent; you can cd
# into them and run `make' without going through this Makefile.
# To change the values of `make' variables: instead of editing Makefiles,
# (1) if the variable is set in `config.status', edit `config.status'
# (which will cause the Makefiles to be regenerated when you run `make');
# (2) otherwise, pass the desired values on the `make' command line.
$(RECURSIVE_TARGETS):
@fail= failcom='exit 1'; \
for f in x $$MAKEFLAGS; do \
case $$f in \
*=* | --[!k]*);; \
*k*) failcom='fail=yes';; \
esac; \
done; \
dot_seen=no; \
target=`echo $@ | sed s/-recursive//`; \
list='$(SUBDIRS)'; for subdir in $$list; do \
echo "Making $$target in $$subdir"; \
if test "$$subdir" = "."; then \
dot_seen=yes; \
local_target="$$target-am"; \
else \
local_target="$$target"; \
fi; \
($(am__cd) $$subdir && $(MAKE) $(AM_MAKEFLAGS) $$local_target) \
|| eval $$failcom; \
done; \
if test "$$dot_seen" = "no"; then \
$(MAKE) $(AM_MAKEFLAGS) "$$target-am" || exit 1; \
fi; test -z "$$fail"
$(RECURSIVE_CLEAN_TARGETS):
@fail= failcom='exit 1'; \
for f in x $$MAKEFLAGS; do \
case $$f in \
*=* | --[!k]*);; \
*k*) failcom='fail=yes';; \
esac; \
done; \
dot_seen=no; \
case "$@" in \
distclean-* | maintainer-clean-*) list='$(DIST_SUBDIRS)' ;; \
*) list='$(SUBDIRS)' ;; \
esac; \
rev=''; for subdir in $$list; do \
if test "$$subdir" = "."; then :; else \
rev="$$subdir $$rev"; \
fi; \
done; \
rev="$$rev ."; \
target=`echo $@ | sed s/-recursive//`; \
for subdir in $$rev; do \
echo "Making $$target in $$subdir"; \
if test "$$subdir" = "."; then \
local_target="$$target-am"; \
else \
local_target="$$target"; \
fi; \
($(am__cd) $$subdir && $(MAKE) $(AM_MAKEFLAGS) $$local_target) \
|| eval $$failcom; \
done && test -z "$$fail"
tags-recursive:
list='$(SUBDIRS)'; for subdir in $$list; do \
test "$$subdir" = . || ($(am__cd) $$subdir && $(MAKE) $(AM_MAKEFLAGS) tags); \
done
ctags-recursive:
list='$(SUBDIRS)'; for subdir in $$list; do \
test "$$subdir" = . || ($(am__cd) $$subdir && $(MAKE) $(AM_MAKEFLAGS) ctags); \
done
ID: $(HEADERS) $(SOURCES) $(LISP) $(TAGS_FILES)
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) '{ files[$$0] = 1; nonempty = 1; } \
END { if (nonempty) { for (i in files) print i; }; }'`; \
mkid -fID $$unique
tags: TAGS
TAGS: tags-recursive $(HEADERS) $(SOURCES) $(TAGS_DEPENDENCIES) \
$(TAGS_FILES) $(LISP)
set x; \
here=`pwd`; \
if ($(ETAGS) --etags-include --version) >/dev/null 2>&1; then \
include_option=--etags-include; \
empty_fix=.; \
else \
include_option=--include; \
empty_fix=; \
fi; \
list='$(SUBDIRS)'; for subdir in $$list; do \
if test "$$subdir" = .; then :; else \
test ! -f $$subdir/TAGS || \
set "$$@" "$$include_option=$$here/$$subdir/TAGS"; \
fi; \
done; \
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) '{ files[$$0] = 1; nonempty = 1; } \
END { if (nonempty) { for (i in files) print i; }; }'`; \
shift; \
if test -z "$(ETAGS_ARGS)$$*$$unique"; then :; else \
test -n "$$unique" || unique=$$empty_fix; \
if test $$# -gt 0; then \
$(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \
"$$@" $$unique; \
else \
$(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \
$$unique; \
fi; \
fi
ctags: CTAGS
CTAGS: ctags-recursive $(HEADERS) $(SOURCES) $(TAGS_DEPENDENCIES) \
$(TAGS_FILES) $(LISP)
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) '{ files[$$0] = 1; nonempty = 1; } \
END { if (nonempty) { for (i in files) print i; }; }'`; \
test -z "$(CTAGS_ARGS)$$unique" \
|| $(CTAGS) $(CTAGSFLAGS) $(AM_CTAGSFLAGS) $(CTAGS_ARGS) \
$$unique
GTAGS:
here=`$(am__cd) $(top_builddir) && pwd` \
&& $(am__cd) $(top_srcdir) \
&& gtags -i $(GTAGS_ARGS) "$$here"
distclean-tags:
-rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags
distdir: $(DISTFILES)
$(am__remove_distdir)
test -d "$(distdir)" || mkdir "$(distdir)"
@srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \
topsrcdirstrip=`echo "$(top_srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \
list='$(DISTFILES)'; \
dist_files=`for file in $$list; do echo $$file; done | \
sed -e "s|^$$srcdirstrip/||;t" \
-e "s|^$$topsrcdirstrip/|$(top_builddir)/|;t"`; \
case $$dist_files in \
*/*) $(MKDIR_P) `echo "$$dist_files" | \
sed '/\//!d;s|^|$(distdir)/|;s,/[^/]*$$,,' | \
sort -u` ;; \
esac; \
for file in $$dist_files; do \
if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \
if test -d $$d/$$file; then \
dir=`echo "/$$file" | sed -e 's,/[^/]*$$,,'`; \
if test -d "$(distdir)/$$file"; then \
find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \
fi; \
if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \
cp -fpR $(srcdir)/$$file "$(distdir)$$dir" || exit 1; \
find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \
fi; \
cp -fpR $$d/$$file "$(distdir)$$dir" || exit 1; \
else \
test -f "$(distdir)/$$file" \
|| cp -p $$d/$$file "$(distdir)/$$file" \
|| exit 1; \
fi; \
done
@list='$(DIST_SUBDIRS)'; for subdir in $$list; do \
if test "$$subdir" = .; then :; else \
test -d "$(distdir)/$$subdir" \
|| $(MKDIR_P) "$(distdir)/$$subdir" \
|| exit 1; \
fi; \
done
@list='$(DIST_SUBDIRS)'; for subdir in $$list; do \
if test "$$subdir" = .; then :; else \
dir1=$$subdir; dir2="$(distdir)/$$subdir"; \
$(am__relativize); \
new_distdir=$$reldir; \
dir1=$$subdir; dir2="$(top_distdir)"; \
$(am__relativize); \
new_top_distdir=$$reldir; \
echo " (cd $$subdir && $(MAKE) $(AM_MAKEFLAGS) top_distdir="$$new_top_distdir" distdir="$$new_distdir" \\"; \
echo " am__remove_distdir=: am__skip_length_check=: am__skip_mode_fix=: distdir)"; \
($(am__cd) $$subdir && \
$(MAKE) $(AM_MAKEFLAGS) \
top_distdir="$$new_top_distdir" \
distdir="$$new_distdir" \
am__remove_distdir=: \
am__skip_length_check=: \
am__skip_mode_fix=: \
distdir) \
|| exit 1; \
fi; \
done
-test -n "$(am__skip_mode_fix)" \
|| find "$(distdir)" -type d ! -perm -755 \
-exec chmod u+rwx,go+rx {} \; -o \
! -type d ! -perm -444 -links 1 -exec chmod a+r {} \; -o \
! -type d ! -perm -400 -exec chmod a+r {} \; -o \
! -type d ! -perm -444 -exec $(install_sh) -c -m a+r {} {} \; \
|| chmod -R a+r "$(distdir)"
dist-gzip: distdir
tardir=$(distdir) && $(am__tar) | GZIP=$(GZIP_ENV) gzip -c >$(distdir).tar.gz
$(am__remove_distdir)
dist-bzip2: distdir
tardir=$(distdir) && $(am__tar) | BZIP2=$${BZIP2--9} bzip2 -c >$(distdir).tar.bz2
$(am__remove_distdir)
dist-lzip: distdir
tardir=$(distdir) && $(am__tar) | lzip -c $${LZIP_OPT--9} >$(distdir).tar.lz
$(am__remove_distdir)
dist-lzma: distdir
tardir=$(distdir) && $(am__tar) | lzma -9 -c >$(distdir).tar.lzma
$(am__remove_distdir)
dist-xz: distdir
tardir=$(distdir) && $(am__tar) | XZ_OPT=$${XZ_OPT--e} xz -c >$(distdir).tar.xz
$(am__remove_distdir)
dist-tarZ: distdir
tardir=$(distdir) && $(am__tar) | compress -c >$(distdir).tar.Z
$(am__remove_distdir)
dist-shar: distdir
shar $(distdir) | GZIP=$(GZIP_ENV) gzip -c >$(distdir).shar.gz
$(am__remove_distdir)
dist-zip: distdir
-rm -f $(distdir).zip
zip -rq $(distdir).zip $(distdir)
$(am__remove_distdir)
dist dist-all: distdir
tardir=$(distdir) && $(am__tar) | GZIP=$(GZIP_ENV) gzip -c >$(distdir).tar.gz
$(am__remove_distdir)
# This target untars the dist file and tries a VPATH configuration. Then
# it guarantees that the distribution is self-contained by making another
# tarfile.
distcheck: dist
case '$(DIST_ARCHIVES)' in \
*.tar.gz*) \
GZIP=$(GZIP_ENV) gzip -dc $(distdir).tar.gz | $(am__untar) ;;\
*.tar.bz2*) \
bzip2 -dc $(distdir).tar.bz2 | $(am__untar) ;;\
*.tar.lzma*) \
lzma -dc $(distdir).tar.lzma | $(am__untar) ;;\
*.tar.lz*) \
lzip -dc $(distdir).tar.lz | $(am__untar) ;;\
*.tar.xz*) \
xz -dc $(distdir).tar.xz | $(am__untar) ;;\
*.tar.Z*) \
uncompress -c $(distdir).tar.Z | $(am__untar) ;;\
*.shar.gz*) \
GZIP=$(GZIP_ENV) gzip -dc $(distdir).shar.gz | unshar ;;\
*.zip*) \
unzip $(distdir).zip ;;\
esac
chmod -R a-w $(distdir); chmod a+w $(distdir)
mkdir $(distdir)/_build
mkdir $(distdir)/_inst
chmod a-w $(distdir)
test -d $(distdir)/_build || exit 0; \
dc_install_base=`$(am__cd) $(distdir)/_inst && pwd | sed -e 's,^[^:\\/]:[\\/],/,'` \
&& dc_destdir="$${TMPDIR-/tmp}/am-dc-$$$$/" \
&& am__cwd=`pwd` \
&& $(am__cd) $(distdir)/_build \
&& ../configure --srcdir=.. --prefix="$$dc_install_base" \
$(AM_DISTCHECK_CONFIGURE_FLAGS) \
$(DISTCHECK_CONFIGURE_FLAGS) \
&& $(MAKE) $(AM_MAKEFLAGS) \
&& $(MAKE) $(AM_MAKEFLAGS) dvi \
&& $(MAKE) $(AM_MAKEFLAGS) check \
&& $(MAKE) $(AM_MAKEFLAGS) install \
&& $(MAKE) $(AM_MAKEFLAGS) installcheck \
&& $(MAKE) $(AM_MAKEFLAGS) uninstall \
&& $(MAKE) $(AM_MAKEFLAGS) distuninstallcheck_dir="$$dc_install_base" \
distuninstallcheck \
&& chmod -R a-w "$$dc_install_base" \
&& ({ \
(cd ../.. && umask 077 && mkdir "$$dc_destdir") \
&& $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" install \
&& $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" uninstall \
&& $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" \
distuninstallcheck_dir="$$dc_destdir" distuninstallcheck; \
} || { rm -rf "$$dc_destdir"; exit 1; }) \
&& rm -rf "$$dc_destdir" \
&& $(MAKE) $(AM_MAKEFLAGS) dist \
&& rm -rf $(DIST_ARCHIVES) \
&& $(MAKE) $(AM_MAKEFLAGS) distcleancheck \
&& cd "$$am__cwd" \
|| exit 1
$(am__remove_distdir)
@(echo "$(distdir) archives ready for distribution: "; \
list='$(DIST_ARCHIVES)'; for i in $$list; do echo $$i; done) | \
sed -e 1h -e 1s/./=/g -e 1p -e 1x -e '$$p' -e '$$x'
distuninstallcheck:
@test -n '$(distuninstallcheck_dir)' || { \
echo 'ERROR: trying to run $@ with an empty' \
'$$(distuninstallcheck_dir)' >&2; \
exit 1; \
}; \
$(am__cd) '$(distuninstallcheck_dir)' || { \
echo 'ERROR: cannot chdir into $(distuninstallcheck_dir)' >&2; \
exit 1; \
}; \
test `$(am__distuninstallcheck_listfiles) | wc -l` -eq 0 \
|| { echo "ERROR: files left after uninstall:" ; \
if test -n "$(DESTDIR)"; then \
echo " (check DESTDIR support)"; \
fi ; \
$(distuninstallcheck_listfiles) ; \
exit 1; } >&2
distcleancheck: distclean
@if test '$(srcdir)' = . ; then \
echo "ERROR: distcleancheck can only run from a VPATH build" ; \
exit 1 ; \
fi
@test `$(distcleancheck_listfiles) | wc -l` -eq 0 \
|| { echo "ERROR: files left in build directory after distclean:" ; \
$(distcleancheck_listfiles) ; \
exit 1; } >&2
check-am: all-am
check: check-recursive
all-am: Makefile $(DATA)
installdirs: installdirs-recursive
installdirs-am:
for dir in "$(DESTDIR)$(sysconfdir)"; do \
test -z "$$dir" || $(MKDIR_P) "$$dir"; \
done
install: install-recursive
install-exec: install-exec-recursive
install-data: install-data-recursive
uninstall: uninstall-recursive
install-am: all-am
@$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am
installcheck: installcheck-recursive
install-strip:
if test -z '$(STRIP)'; then \
$(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \
install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \
install; \
else \
$(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \
install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \
"INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'" install; \
fi
mostlyclean-generic:
clean-generic:
distclean-generic:
-test -z "$(CONFIG_CLEAN_FILES)" || rm -f $(CONFIG_CLEAN_FILES)
-test . = "$(srcdir)" || test -z "$(CONFIG_CLEAN_VPATH_FILES)" || rm -f $(CONFIG_CLEAN_VPATH_FILES)
maintainer-clean-generic:
@echo "This command is intended for maintainers to use"
@echo "it deletes files that may require special tools to rebuild."
clean: clean-recursive
clean-am: clean-generic mostlyclean-am
distclean: distclean-recursive
-rm -f $(am__CONFIG_DISTCLEAN_FILES)
-rm -f Makefile
distclean-am: clean-am distclean-generic distclean-hdr distclean-tags
dvi: dvi-recursive
dvi-am:
html: html-recursive
html-am:
info: info-recursive
info-am:
install-data-am:
@$(NORMAL_INSTALL)
$(MAKE) $(AM_MAKEFLAGS) install-data-hook
install-dvi: install-dvi-recursive
install-dvi-am:
install-exec-am: install-sysconfDATA
install-html: install-html-recursive
install-html-am:
install-info: install-info-recursive
install-info-am:
install-man:
install-pdf: install-pdf-recursive
install-pdf-am:
install-ps: install-ps-recursive
install-ps-am:
installcheck-am:
maintainer-clean: maintainer-clean-recursive
-rm -f $(am__CONFIG_DISTCLEAN_FILES)
-rm -rf $(top_srcdir)/autom4te.cache
-rm -f Makefile
maintainer-clean-am: distclean-am maintainer-clean-generic
mostlyclean: mostlyclean-recursive
mostlyclean-am: mostlyclean-generic
pdf: pdf-recursive
pdf-am:
ps: ps-recursive
ps-am:
uninstall-am: uninstall-sysconfDATA
.MAKE: $(RECURSIVE_CLEAN_TARGETS) $(RECURSIVE_TARGETS) ctags-recursive \
install-am install-data-am install-strip tags-recursive
.PHONY: $(RECURSIVE_CLEAN_TARGETS) $(RECURSIVE_TARGETS) CTAGS GTAGS \
all all-am am--refresh check check-am clean clean-generic \
ctags ctags-recursive dist dist-all dist-bzip2 dist-gzip \
dist-lzip dist-lzma dist-shar dist-tarZ dist-xz dist-zip \
distcheck distclean distclean-generic distclean-hdr \
distclean-tags distcleancheck distdir distuninstallcheck dvi \
dvi-am html html-am info info-am install install-am \
install-data install-data-am install-data-hook install-dvi \
install-dvi-am install-exec install-exec-am install-html \
install-html-am install-info install-info-am install-man \
install-pdf install-pdf-am install-ps install-ps-am \
install-strip install-sysconfDATA installcheck installcheck-am \
installdirs installdirs-am maintainer-clean \
maintainer-clean-generic mostlyclean mostlyclean-generic pdf \
pdf-am ps ps-am tags tags-recursive uninstall uninstall-am \
uninstall-sysconfDATA
install-data-hook:
mkdir -p $(DESTDIR)$(localstatedir)/data && mkdir -p $(DESTDIR)$(localstatedir)/log
# Tell versions [3.59,3.63) of GNU make to not export all variables.
# Otherwise a system limit (for SysV at least) may be exceeded.
.NOEXPORT:

coreseek/csft-4.1/Makefile.am Executable file
@ -0,0 +1,11 @@
if USE_LIBSTEMMER
SUBDIRS = libstemmer_c src test doc
else
SUBDIRS = src test doc
endif
EXTRA_DIST = api storage sphinx.conf.in sphinx-min.conf.in example.sql
sysconf_DATA = sphinx.conf.dist sphinx-min.conf.dist example.sql
install-data-hook:
mkdir -p $(DESTDIR)$(localstatedir)/data && mkdir -p $(DESTDIR)$(localstatedir)/log

coreseek/csft-4.1/Makefile.in
@ -0,0 +1,788 @@
# Makefile.in generated by automake 1.11.3 from Makefile.am.
# @configure_input@
# Copyright (C) 1994, 1995, 1996, 1997, 1998, 1999, 2000, 2001, 2002,
# 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011 Free Software
# Foundation, Inc.
# This Makefile.in is free software; the Free Software Foundation
# gives unlimited permission to copy and/or distribute it,
# with or without modifications, as long as this notice is preserved.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY, to the extent permitted by law; without
# even the implied warranty of MERCHANTABILITY or FITNESS FOR A
# PARTICULAR PURPOSE.
@SET_MAKE@
VPATH = @srcdir@
pkgdatadir = $(datadir)/@PACKAGE@
pkgincludedir = $(includedir)/@PACKAGE@
pkglibdir = $(libdir)/@PACKAGE@
pkglibexecdir = $(libexecdir)/@PACKAGE@
am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd
install_sh_DATA = $(install_sh) -c -m 644
install_sh_PROGRAM = $(install_sh) -c
install_sh_SCRIPT = $(install_sh) -c
INSTALL_HEADER = $(INSTALL_DATA)
transform = $(program_transform_name)
NORMAL_INSTALL = :
PRE_INSTALL = :
POST_INSTALL = :
NORMAL_UNINSTALL = :
PRE_UNINSTALL = :
POST_UNINSTALL = :
subdir = .
DIST_COMMON = $(am__configure_deps) $(srcdir)/Makefile.am \
$(srcdir)/Makefile.in $(srcdir)/sphinx-min.conf.in \
$(srcdir)/sphinx.conf.in $(top_srcdir)/config/config.h.in \
$(top_srcdir)/configure COPYING INSTALL config/depcomp \
config/install-sh config/missing
ACLOCAL_M4 = $(top_srcdir)/aclocal.m4
am__aclocal_m4_deps = $(top_srcdir)/acinclude.m4 \
$(top_srcdir)/python.m4 $(top_srcdir)/configure.ac
am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \
$(ACLOCAL_M4)
am__CONFIG_DISTCLEAN_FILES = config.status config.cache config.log \
configure.lineno config.status.lineno
mkinstalldirs = $(install_sh) -d
CONFIG_HEADER = $(top_builddir)/config/config.h
CONFIG_CLEAN_FILES = sphinx.conf.dist sphinx-min.conf.dist
CONFIG_CLEAN_VPATH_FILES =
SOURCES =
DIST_SOURCES =
RECURSIVE_TARGETS = all-recursive check-recursive dvi-recursive \
html-recursive info-recursive install-data-recursive \
install-dvi-recursive install-exec-recursive \
install-html-recursive install-info-recursive \
install-pdf-recursive install-ps-recursive install-recursive \
installcheck-recursive installdirs-recursive pdf-recursive \
ps-recursive uninstall-recursive
am__vpath_adj_setup = srcdirstrip=`echo "$(srcdir)" | sed 's|.|.|g'`;
am__vpath_adj = case $$p in \
$(srcdir)/*) f=`echo "$$p" | sed "s|^$$srcdirstrip/||"`;; \
*) f=$$p;; \
esac;
am__strip_dir = f=`echo $$p | sed -e 's|^.*/||'`;
am__install_max = 40
am__nobase_strip_setup = \
srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*|]/\\\\&/g'`
am__nobase_strip = \
for p in $$list; do echo "$$p"; done | sed -e "s|$$srcdirstrip/||"
am__nobase_list = $(am__nobase_strip_setup); \
for p in $$list; do echo "$$p $$p"; done | \
sed "s| $$srcdirstrip/| |;"' / .*\//!s/ .*/ ./; s,\( .*\)/[^/]*$$,\1,' | \
$(AWK) 'BEGIN { files["."] = "" } { files[$$2] = files[$$2] " " $$1; \
if (++n[$$2] == $(am__install_max)) \
{ print $$2, files[$$2]; n[$$2] = 0; files[$$2] = "" } } \
END { for (dir in files) print dir, files[dir] }'
am__base_list = \
sed '$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;s/\n/ /g' | \
sed '$$!N;$$!N;$$!N;$$!N;s/\n/ /g'
am__uninstall_files_from_dir = { \
test -z "$$files" \
|| { test ! -d "$$dir" && test ! -f "$$dir" && test ! -r "$$dir"; } \
|| { echo " ( cd '$$dir' && rm -f" $$files ")"; \
$(am__cd) "$$dir" && rm -f $$files; }; \
}
am__installdirs = "$(DESTDIR)$(sysconfdir)"
DATA = $(sysconf_DATA)
RECURSIVE_CLEAN_TARGETS = mostlyclean-recursive clean-recursive \
distclean-recursive maintainer-clean-recursive
AM_RECURSIVE_TARGETS = $(RECURSIVE_TARGETS:-recursive=) \
$(RECURSIVE_CLEAN_TARGETS:-recursive=) tags TAGS ctags CTAGS \
distdir dist dist-all distcheck
ETAGS = etags
CTAGS = ctags
DIST_SUBDIRS = src test doc libstemmer_c
DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST)
distdir = $(PACKAGE)-$(VERSION)
top_distdir = $(distdir)
am__remove_distdir = \
if test -d "$(distdir)"; then \
find "$(distdir)" -type d ! -perm -200 -exec chmod u+w {} ';' \
&& rm -rf "$(distdir)" \
|| { sleep 5 && rm -rf "$(distdir)"; }; \
else :; fi
am__relativize = \
dir0=`pwd`; \
sed_first='s,^\([^/]*\)/.*$$,\1,'; \
sed_rest='s,^[^/]*/*,,'; \
sed_last='s,^.*/\([^/]*\)$$,\1,'; \
sed_butlast='s,/*[^/]*$$,,'; \
while test -n "$$dir1"; do \
first=`echo "$$dir1" | sed -e "$$sed_first"`; \
if test "$$first" != "."; then \
if test "$$first" = ".."; then \
dir2=`echo "$$dir0" | sed -e "$$sed_last"`/"$$dir2"; \
dir0=`echo "$$dir0" | sed -e "$$sed_butlast"`; \
else \
first2=`echo "$$dir2" | sed -e "$$sed_first"`; \
if test "$$first2" = "$$first"; then \
dir2=`echo "$$dir2" | sed -e "$$sed_rest"`; \
else \
dir2="../$$dir2"; \
fi; \
dir0="$$dir0"/"$$first"; \
fi; \
fi; \
dir1=`echo "$$dir1" | sed -e "$$sed_rest"`; \
done; \
reldir="$$dir2"
DIST_ARCHIVES = $(distdir).tar.gz
GZIP_ENV = --best
distuninstallcheck_listfiles = find . -type f -print
am__distuninstallcheck_listfiles = $(distuninstallcheck_listfiles) \
| sed 's|^\./|$(prefix)/|' | grep -v '$(infodir)/dir$$'
distcleancheck_listfiles = find . -type f -print
ACLOCAL = @ACLOCAL@
AMTAR = @AMTAR@
AUTOCONF = @AUTOCONF@
AUTOHEADER = @AUTOHEADER@
AUTOMAKE = @AUTOMAKE@
AWK = @AWK@
CC = @CC@
CCDEPMODE = @CCDEPMODE@
CFLAGS = @CFLAGS@
CONFDIR = @CONFDIR@
CPP = @CPP@
CPPFLAGS = @CPPFLAGS@
CXX = @CXX@
CXXDEPMODE = @CXXDEPMODE@
CXXFLAGS = @CXXFLAGS@
CYGPATH_W = @CYGPATH_W@
DEFS = @DEFS@
DEPDIR = @DEPDIR@
ECHO_C = @ECHO_C@
ECHO_N = @ECHO_N@
ECHO_T = @ECHO_T@
EGREP = @EGREP@
EXEEXT = @EXEEXT@
GREP = @GREP@
INSTALL = @INSTALL@
INSTALL_DATA = @INSTALL_DATA@
INSTALL_PROGRAM = @INSTALL_PROGRAM@
INSTALL_SCRIPT = @INSTALL_SCRIPT@
INSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@
LDFLAGS = @LDFLAGS@
LIBOBJS = @LIBOBJS@
LIBRT = @LIBRT@
LIBS = @LIBS@
LTLIBOBJS = @LTLIBOBJS@
MAINT = @MAINT@
MAKEINFO = @MAKEINFO@
MKDIR_P = @MKDIR_P@
MMSEG_CFLAGS = @MMSEG_CFLAGS@
MMSEG_LIBS = @MMSEG_LIBS@
MYSQL_CFLAGS = @MYSQL_CFLAGS@
MYSQL_LIBS = @MYSQL_LIBS@
OBJEXT = @OBJEXT@
PACKAGE = @PACKAGE@
PACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@
PACKAGE_NAME = @PACKAGE_NAME@
PACKAGE_STRING = @PACKAGE_STRING@
PACKAGE_TARNAME = @PACKAGE_TARNAME@
PACKAGE_URL = @PACKAGE_URL@
PACKAGE_VERSION = @PACKAGE_VERSION@
PATH_SEPARATOR = @PATH_SEPARATOR@
PGSQL_CFLAGS = @PGSQL_CFLAGS@
PGSQL_LIBS = @PGSQL_LIBS@
PYTHON = @PYTHON@
PYTHON_CPPFLAGS = @PYTHON_CPPFLAGS@
PYTHON_EXEC_PREFIX = @PYTHON_EXEC_PREFIX@
PYTHON_LIBS = @PYTHON_LIBS@
PYTHON_PLATFORM = @PYTHON_PLATFORM@
PYTHON_PREFIX = @PYTHON_PREFIX@
PYTHON_VERSION = @PYTHON_VERSION@
RANLIB = @RANLIB@
SET_MAKE = @SET_MAKE@
SHELL = @SHELL@
STRIP = @STRIP@
VERSION = @VERSION@
abs_builddir = @abs_builddir@
abs_srcdir = @abs_srcdir@
abs_top_builddir = @abs_top_builddir@
abs_top_srcdir = @abs_top_srcdir@
ac_ct_CC = @ac_ct_CC@
ac_ct_CXX = @ac_ct_CXX@
am__include = @am__include@
am__leading_dot = @am__leading_dot@
am__quote = @am__quote@
am__tar = @am__tar@
am__untar = @am__untar@
bindir = @bindir@
build_alias = @build_alias@
builddir = @builddir@
datadir = @datadir@
datarootdir = @datarootdir@
docdir = @docdir@
dvidir = @dvidir@
exec_prefix = @exec_prefix@
host_alias = @host_alias@
htmldir = @htmldir@
includedir = @includedir@
infodir = @infodir@
install_sh = @install_sh@
libdir = @libdir@
libexecdir = @libexecdir@
localedir = @localedir@
localstatedir = @localstatedir@
mandir = @mandir@
mkdir_p = @mkdir_p@
oldincludedir = @oldincludedir@
pdfdir = @pdfdir@
pgconfig = @pgconfig@
pkgpyexecdir = @pkgpyexecdir@
pkgpythondir = @pkgpythondir@
prefix = @prefix@
program_transform_name = @program_transform_name@
psdir = @psdir@
pyexecdir = @pyexecdir@
pythondir = @pythondir@
sbindir = @sbindir@
sharedstatedir = @sharedstatedir@
srcdir = @srcdir@
sysconfdir = @sysconfdir@
target_alias = @target_alias@
top_build_prefix = @top_build_prefix@
top_builddir = @top_builddir@
top_srcdir = @top_srcdir@
@USE_LIBSTEMMER_FALSE@SUBDIRS = src test doc
@USE_LIBSTEMMER_TRUE@SUBDIRS = libstemmer_c src test doc
EXTRA_DIST = api storage sphinx.conf.in sphinx-min.conf.in example.sql
sysconf_DATA = sphinx.conf.dist sphinx-min.conf.dist example.sql
all: all-recursive
.SUFFIXES:
am--refresh: Makefile
@:
$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps)
@for dep in $?; do \
case '$(am__configure_deps)' in \
*$$dep*) \
echo ' cd $(srcdir) && $(AUTOMAKE) --foreign'; \
$(am__cd) $(srcdir) && $(AUTOMAKE) --foreign \
&& exit 0; \
exit 1;; \
esac; \
done; \
echo ' cd $(top_srcdir) && $(AUTOMAKE) --foreign Makefile'; \
$(am__cd) $(top_srcdir) && \
$(AUTOMAKE) --foreign Makefile
.PRECIOUS: Makefile
Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status
@case '$?' in \
*config.status*) \
echo ' $(SHELL) ./config.status'; \
$(SHELL) ./config.status;; \
*) \
echo ' cd $(top_builddir) && $(SHELL) ./config.status $@ $(am__depfiles_maybe)'; \
cd $(top_builddir) && $(SHELL) ./config.status $@ $(am__depfiles_maybe);; \
esac;
$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES)
$(SHELL) ./config.status --recheck
$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps)
$(am__cd) $(srcdir) && $(AUTOCONF)
$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps)
$(am__cd) $(srcdir) && $(ACLOCAL) $(ACLOCAL_AMFLAGS)
$(am__aclocal_m4_deps):
config/config.h: config/stamp-h1
@if test ! -f $@; then rm -f config/stamp-h1; else :; fi
@if test ! -f $@; then $(MAKE) $(AM_MAKEFLAGS) config/stamp-h1; else :; fi
config/stamp-h1: $(top_srcdir)/config/config.h.in $(top_builddir)/config.status
@rm -f config/stamp-h1
cd $(top_builddir) && $(SHELL) ./config.status config/config.h
$(top_srcdir)/config/config.h.in: @MAINTAINER_MODE_TRUE@ $(am__configure_deps)
($(am__cd) $(top_srcdir) && $(AUTOHEADER))
rm -f config/stamp-h1
touch $@
distclean-hdr:
-rm -f config/config.h config/stamp-h1
sphinx.conf.dist: $(top_builddir)/config.status $(srcdir)/sphinx.conf.in
cd $(top_builddir) && $(SHELL) ./config.status $@
sphinx-min.conf.dist: $(top_builddir)/config.status $(srcdir)/sphinx-min.conf.in
cd $(top_builddir) && $(SHELL) ./config.status $@
install-sysconfDATA: $(sysconf_DATA)
@$(NORMAL_INSTALL)
test -z "$(sysconfdir)" || $(MKDIR_P) "$(DESTDIR)$(sysconfdir)"
@list='$(sysconf_DATA)'; test -n "$(sysconfdir)" || list=; \
for p in $$list; do \
if test -f "$$p"; then d=; else d="$(srcdir)/"; fi; \
echo "$$d$$p"; \
done | $(am__base_list) | \
while read files; do \
echo " $(INSTALL_DATA) $$files '$(DESTDIR)$(sysconfdir)'"; \
$(INSTALL_DATA) $$files "$(DESTDIR)$(sysconfdir)" || exit $$?; \
done
uninstall-sysconfDATA:
@$(NORMAL_UNINSTALL)
@list='$(sysconf_DATA)'; test -n "$(sysconfdir)" || list=; \
files=`for p in $$list; do echo $$p; done | sed -e 's|^.*/||'`; \
dir='$(DESTDIR)$(sysconfdir)'; $(am__uninstall_files_from_dir)
# This directory's subdirectories are mostly independent; you can cd
# into them and run `make' without going through this Makefile.
# To change the values of `make' variables: instead of editing Makefiles,
# (1) if the variable is set in `config.status', edit `config.status'
# (which will cause the Makefiles to be regenerated when you run `make');
# (2) otherwise, pass the desired values on the `make' command line.
$(RECURSIVE_TARGETS):
@fail= failcom='exit 1'; \
for f in x $$MAKEFLAGS; do \
case $$f in \
*=* | --[!k]*);; \
*k*) failcom='fail=yes';; \
esac; \
done; \
dot_seen=no; \
target=`echo $@ | sed s/-recursive//`; \
list='$(SUBDIRS)'; for subdir in $$list; do \
echo "Making $$target in $$subdir"; \
if test "$$subdir" = "."; then \
dot_seen=yes; \
local_target="$$target-am"; \
else \
local_target="$$target"; \
fi; \
($(am__cd) $$subdir && $(MAKE) $(AM_MAKEFLAGS) $$local_target) \
|| eval $$failcom; \
done; \
if test "$$dot_seen" = "no"; then \
$(MAKE) $(AM_MAKEFLAGS) "$$target-am" || exit 1; \
fi; test -z "$$fail"
$(RECURSIVE_CLEAN_TARGETS):
@fail= failcom='exit 1'; \
for f in x $$MAKEFLAGS; do \
case $$f in \
*=* | --[!k]*);; \
*k*) failcom='fail=yes';; \
esac; \
done; \
dot_seen=no; \
case "$@" in \
distclean-* | maintainer-clean-*) list='$(DIST_SUBDIRS)' ;; \
*) list='$(SUBDIRS)' ;; \
esac; \
rev=''; for subdir in $$list; do \
if test "$$subdir" = "."; then :; else \
rev="$$subdir $$rev"; \
fi; \
done; \
rev="$$rev ."; \
target=`echo $@ | sed s/-recursive//`; \
for subdir in $$rev; do \
echo "Making $$target in $$subdir"; \
if test "$$subdir" = "."; then \
local_target="$$target-am"; \
else \
local_target="$$target"; \
fi; \
($(am__cd) $$subdir && $(MAKE) $(AM_MAKEFLAGS) $$local_target) \
|| eval $$failcom; \
done && test -z "$$fail"
tags-recursive:
list='$(SUBDIRS)'; for subdir in $$list; do \
test "$$subdir" = . || ($(am__cd) $$subdir && $(MAKE) $(AM_MAKEFLAGS) tags); \
done
ctags-recursive:
list='$(SUBDIRS)'; for subdir in $$list; do \
test "$$subdir" = . || ($(am__cd) $$subdir && $(MAKE) $(AM_MAKEFLAGS) ctags); \
done
ID: $(HEADERS) $(SOURCES) $(LISP) $(TAGS_FILES)
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) '{ files[$$0] = 1; nonempty = 1; } \
END { if (nonempty) { for (i in files) print i; }; }'`; \
mkid -fID $$unique
tags: TAGS
TAGS: tags-recursive $(HEADERS) $(SOURCES) $(TAGS_DEPENDENCIES) \
$(TAGS_FILES) $(LISP)
set x; \
here=`pwd`; \
if ($(ETAGS) --etags-include --version) >/dev/null 2>&1; then \
include_option=--etags-include; \
empty_fix=.; \
else \
include_option=--include; \
empty_fix=; \
fi; \
list='$(SUBDIRS)'; for subdir in $$list; do \
if test "$$subdir" = .; then :; else \
test ! -f $$subdir/TAGS || \
set "$$@" "$$include_option=$$here/$$subdir/TAGS"; \
fi; \
done; \
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) '{ files[$$0] = 1; nonempty = 1; } \
END { if (nonempty) { for (i in files) print i; }; }'`; \
shift; \
if test -z "$(ETAGS_ARGS)$$*$$unique"; then :; else \
test -n "$$unique" || unique=$$empty_fix; \
if test $$# -gt 0; then \
$(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \
"$$@" $$unique; \
else \
$(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \
$$unique; \
fi; \
fi
ctags: CTAGS
CTAGS: ctags-recursive $(HEADERS) $(SOURCES) $(TAGS_DEPENDENCIES) \
$(TAGS_FILES) $(LISP)
list='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \
unique=`for i in $$list; do \
if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
done | \
$(AWK) '{ files[$$0] = 1; nonempty = 1; } \
END { if (nonempty) { for (i in files) print i; }; }'`; \
test -z "$(CTAGS_ARGS)$$unique" \
|| $(CTAGS) $(CTAGSFLAGS) $(AM_CTAGSFLAGS) $(CTAGS_ARGS) \
$$unique
GTAGS:
here=`$(am__cd) $(top_builddir) && pwd` \
&& $(am__cd) $(top_srcdir) \
&& gtags -i $(GTAGS_ARGS) "$$here"
distclean-tags:
-rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags
distdir: $(DISTFILES)
$(am__remove_distdir)
test -d "$(distdir)" || mkdir "$(distdir)"
@srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \
topsrcdirstrip=`echo "$(top_srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \
list='$(DISTFILES)'; \
dist_files=`for file in $$list; do echo $$file; done | \
sed -e "s|^$$srcdirstrip/||;t" \
-e "s|^$$topsrcdirstrip/|$(top_builddir)/|;t"`; \
case $$dist_files in \
*/*) $(MKDIR_P) `echo "$$dist_files" | \
sed '/\//!d;s|^|$(distdir)/|;s,/[^/]*$$,,' | \
sort -u` ;; \
esac; \
for file in $$dist_files; do \
if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \
if test -d $$d/$$file; then \
dir=`echo "/$$file" | sed -e 's,/[^/]*$$,,'`; \
if test -d "$(distdir)/$$file"; then \
find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \
fi; \
if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \
cp -fpR $(srcdir)/$$file "$(distdir)$$dir" || exit 1; \
find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \
fi; \
cp -fpR $$d/$$file "$(distdir)$$dir" || exit 1; \
else \
test -f "$(distdir)/$$file" \
|| cp -p $$d/$$file "$(distdir)/$$file" \
|| exit 1; \
fi; \
done
@list='$(DIST_SUBDIRS)'; for subdir in $$list; do \
if test "$$subdir" = .; then :; else \
test -d "$(distdir)/$$subdir" \
|| $(MKDIR_P) "$(distdir)/$$subdir" \
|| exit 1; \
fi; \
done
@list='$(DIST_SUBDIRS)'; for subdir in $$list; do \
if test "$$subdir" = .; then :; else \
dir1=$$subdir; dir2="$(distdir)/$$subdir"; \
$(am__relativize); \
new_distdir=$$reldir; \
dir1=$$subdir; dir2="$(top_distdir)"; \
$(am__relativize); \
new_top_distdir=$$reldir; \
echo " (cd $$subdir && $(MAKE) $(AM_MAKEFLAGS) top_distdir="$$new_top_distdir" distdir="$$new_distdir" \\"; \
echo " am__remove_distdir=: am__skip_length_check=: am__skip_mode_fix=: distdir)"; \
($(am__cd) $$subdir && \
$(MAKE) $(AM_MAKEFLAGS) \
top_distdir="$$new_top_distdir" \
distdir="$$new_distdir" \
am__remove_distdir=: \
am__skip_length_check=: \
am__skip_mode_fix=: \
distdir) \
|| exit 1; \
fi; \
done
-test -n "$(am__skip_mode_fix)" \
|| find "$(distdir)" -type d ! -perm -755 \
-exec chmod u+rwx,go+rx {} \; -o \
! -type d ! -perm -444 -links 1 -exec chmod a+r {} \; -o \
! -type d ! -perm -400 -exec chmod a+r {} \; -o \
! -type d ! -perm -444 -exec $(install_sh) -c -m a+r {} {} \; \
|| chmod -R a+r "$(distdir)"
dist-gzip: distdir
tardir=$(distdir) && $(am__tar) | GZIP=$(GZIP_ENV) gzip -c >$(distdir).tar.gz
$(am__remove_distdir)
dist-bzip2: distdir
tardir=$(distdir) && $(am__tar) | BZIP2=$${BZIP2--9} bzip2 -c >$(distdir).tar.bz2
$(am__remove_distdir)
dist-lzip: distdir
tardir=$(distdir) && $(am__tar) | lzip -c $${LZIP_OPT--9} >$(distdir).tar.lz
$(am__remove_distdir)
dist-lzma: distdir
tardir=$(distdir) && $(am__tar) | lzma -9 -c >$(distdir).tar.lzma
$(am__remove_distdir)
dist-xz: distdir
tardir=$(distdir) && $(am__tar) | XZ_OPT=$${XZ_OPT--e} xz -c >$(distdir).tar.xz
$(am__remove_distdir)
dist-tarZ: distdir
tardir=$(distdir) && $(am__tar) | compress -c >$(distdir).tar.Z
$(am__remove_distdir)
dist-shar: distdir
shar $(distdir) | GZIP=$(GZIP_ENV) gzip -c >$(distdir).shar.gz
$(am__remove_distdir)
dist-zip: distdir
-rm -f $(distdir).zip
zip -rq $(distdir).zip $(distdir)
$(am__remove_distdir)
dist dist-all: distdir
tardir=$(distdir) && $(am__tar) | GZIP=$(GZIP_ENV) gzip -c >$(distdir).tar.gz
$(am__remove_distdir)
# This target untars the dist file and tries a VPATH configuration. Then
# it guarantees that the distribution is self-contained by making another
# tarfile.
distcheck: dist
case '$(DIST_ARCHIVES)' in \
*.tar.gz*) \
GZIP=$(GZIP_ENV) gzip -dc $(distdir).tar.gz | $(am__untar) ;;\
*.tar.bz2*) \
bzip2 -dc $(distdir).tar.bz2 | $(am__untar) ;;\
*.tar.lzma*) \
lzma -dc $(distdir).tar.lzma | $(am__untar) ;;\
*.tar.lz*) \
lzip -dc $(distdir).tar.lz | $(am__untar) ;;\
*.tar.xz*) \
xz -dc $(distdir).tar.xz | $(am__untar) ;;\
*.tar.Z*) \
uncompress -c $(distdir).tar.Z | $(am__untar) ;;\
*.shar.gz*) \
GZIP=$(GZIP_ENV) gzip -dc $(distdir).shar.gz | unshar ;;\
*.zip*) \
unzip $(distdir).zip ;;\
esac
chmod -R a-w $(distdir); chmod a+w $(distdir)
mkdir $(distdir)/_build
mkdir $(distdir)/_inst
chmod a-w $(distdir)
test -d $(distdir)/_build || exit 0; \
dc_install_base=`$(am__cd) $(distdir)/_inst && pwd | sed -e 's,^[^:\\/]:[\\/],/,'` \
&& dc_destdir="$${TMPDIR-/tmp}/am-dc-$$$$/" \
&& am__cwd=`pwd` \
&& $(am__cd) $(distdir)/_build \
&& ../configure --srcdir=.. --prefix="$$dc_install_base" \
$(AM_DISTCHECK_CONFIGURE_FLAGS) \
$(DISTCHECK_CONFIGURE_FLAGS) \
&& $(MAKE) $(AM_MAKEFLAGS) \
&& $(MAKE) $(AM_MAKEFLAGS) dvi \
&& $(MAKE) $(AM_MAKEFLAGS) check \
&& $(MAKE) $(AM_MAKEFLAGS) install \
&& $(MAKE) $(AM_MAKEFLAGS) installcheck \
&& $(MAKE) $(AM_MAKEFLAGS) uninstall \
&& $(MAKE) $(AM_MAKEFLAGS) distuninstallcheck_dir="$$dc_install_base" \
distuninstallcheck \
&& chmod -R a-w "$$dc_install_base" \
&& ({ \
(cd ../.. && umask 077 && mkdir "$$dc_destdir") \
&& $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" install \
&& $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" uninstall \
&& $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" \
distuninstallcheck_dir="$$dc_destdir" distuninstallcheck; \
} || { rm -rf "$$dc_destdir"; exit 1; }) \
&& rm -rf "$$dc_destdir" \
&& $(MAKE) $(AM_MAKEFLAGS) dist \
&& rm -rf $(DIST_ARCHIVES) \
&& $(MAKE) $(AM_MAKEFLAGS) distcleancheck \
&& cd "$$am__cwd" \
|| exit 1
$(am__remove_distdir)
@(echo "$(distdir) archives ready for distribution: "; \
list='$(DIST_ARCHIVES)'; for i in $$list; do echo $$i; done) | \
sed -e 1h -e 1s/./=/g -e 1p -e 1x -e '$$p' -e '$$x'
distuninstallcheck:
@test -n '$(distuninstallcheck_dir)' || { \
echo 'ERROR: trying to run $@ with an empty' \
'$$(distuninstallcheck_dir)' >&2; \
exit 1; \
}; \
$(am__cd) '$(distuninstallcheck_dir)' || { \
echo 'ERROR: cannot chdir into $(distuninstallcheck_dir)' >&2; \
exit 1; \
}; \
test `$(am__distuninstallcheck_listfiles) | wc -l` -eq 0 \
|| { echo "ERROR: files left after uninstall:" ; \
if test -n "$(DESTDIR)"; then \
echo " (check DESTDIR support)"; \
fi ; \
$(distuninstallcheck_listfiles) ; \
exit 1; } >&2
distcleancheck: distclean
@if test '$(srcdir)' = . ; then \
echo "ERROR: distcleancheck can only run from a VPATH build" ; \
exit 1 ; \
fi
@test `$(distcleancheck_listfiles) | wc -l` -eq 0 \
|| { echo "ERROR: files left in build directory after distclean:" ; \
$(distcleancheck_listfiles) ; \
exit 1; } >&2
check-am: all-am
check: check-recursive
all-am: Makefile $(DATA)
installdirs: installdirs-recursive
installdirs-am:
for dir in "$(DESTDIR)$(sysconfdir)"; do \
test -z "$$dir" || $(MKDIR_P) "$$dir"; \
done
install: install-recursive
install-exec: install-exec-recursive
install-data: install-data-recursive
uninstall: uninstall-recursive
install-am: all-am
@$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am
installcheck: installcheck-recursive
install-strip:
if test -z '$(STRIP)'; then \
$(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \
install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \
install; \
else \
$(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \
install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \
"INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'" install; \
fi
mostlyclean-generic:
clean-generic:
distclean-generic:
-test -z "$(CONFIG_CLEAN_FILES)" || rm -f $(CONFIG_CLEAN_FILES)
-test . = "$(srcdir)" || test -z "$(CONFIG_CLEAN_VPATH_FILES)" || rm -f $(CONFIG_CLEAN_VPATH_FILES)
maintainer-clean-generic:
@echo "This command is intended for maintainers to use"
@echo "it deletes files that may require special tools to rebuild."
clean: clean-recursive
clean-am: clean-generic mostlyclean-am
distclean: distclean-recursive
-rm -f $(am__CONFIG_DISTCLEAN_FILES)
-rm -f Makefile
distclean-am: clean-am distclean-generic distclean-hdr distclean-tags
dvi: dvi-recursive
dvi-am:
html: html-recursive
html-am:
info: info-recursive
info-am:
install-data-am:
@$(NORMAL_INSTALL)
$(MAKE) $(AM_MAKEFLAGS) install-data-hook
install-dvi: install-dvi-recursive
install-dvi-am:
install-exec-am: install-sysconfDATA
install-html: install-html-recursive
install-html-am:
install-info: install-info-recursive
install-info-am:
install-man:
install-pdf: install-pdf-recursive
install-pdf-am:
install-ps: install-ps-recursive
install-ps-am:
installcheck-am:
maintainer-clean: maintainer-clean-recursive
-rm -f $(am__CONFIG_DISTCLEAN_FILES)
-rm -rf $(top_srcdir)/autom4te.cache
-rm -f Makefile
maintainer-clean-am: distclean-am maintainer-clean-generic
mostlyclean: mostlyclean-recursive
mostlyclean-am: mostlyclean-generic
pdf: pdf-recursive
pdf-am:
ps: ps-recursive
ps-am:
uninstall-am: uninstall-sysconfDATA
.MAKE: $(RECURSIVE_CLEAN_TARGETS) $(RECURSIVE_TARGETS) ctags-recursive \
install-am install-data-am install-strip tags-recursive
.PHONY: $(RECURSIVE_CLEAN_TARGETS) $(RECURSIVE_TARGETS) CTAGS GTAGS \
all all-am am--refresh check check-am clean clean-generic \
ctags ctags-recursive dist dist-all dist-bzip2 dist-gzip \
dist-lzip dist-lzma dist-shar dist-tarZ dist-xz dist-zip \
distcheck distclean distclean-generic distclean-hdr \
distclean-tags distcleancheck distdir distuninstallcheck dvi \
dvi-am html html-am info info-am install install-am \
install-data install-data-am install-data-hook install-dvi \
install-dvi-am install-exec install-exec-am install-html \
install-html-am install-info install-info-am install-man \
install-pdf install-pdf-am install-ps install-ps-am \
install-strip install-sysconfDATA installcheck installcheck-am \
installdirs installdirs-am maintainer-clean \
maintainer-clean-generic mostlyclean mostlyclean-generic pdf \
pdf-am ps ps-am tags tags-recursive uninstall uninstall-am \
uninstall-sysconfDATA
install-data-hook:
mkdir -p $(DESTDIR)$(localstatedir)/data && mkdir -p $(DESTDIR)$(localstatedir)/log
# Tell versions [3.59,3.63) of GNU make to not export all variables.
# Otherwise a system limit (for SysV at least) may be exceeded.
.NOEXPORT:

coreseek/csft-4.1/acinclude.m4 Executable file

@ -0,0 +1,423 @@
dnl ---------------------------------------------------------------------------
dnl Macro: AC_CHECK_MYSQL
dnl Check for custom MySQL paths in --with-mysql-* options.
dnl If some paths are missing, check if mysql_config exists.
dnl ---------------------------------------------------------------------------
AC_DEFUN([AC_CHECK_MYSQL],[
mysqlconfig_locations="mysql_config /usr/bin/mysql_config /usr/local/bin/mysql_config /usr/local/mysql/bin/mysql_config /opt/mysql/bin/mysql_config /usr/pkg/bin/mysql_config"
user_mysql_includes=
user_mysql_libs=
# check explicit MySQL root for mysql_config, include, lib
if test [ x$1 != xyes -a x$1 != xno ]
then
mysqlroot=`echo $1 | sed -e 's+/$++'`
if test [ -x "$mysqlroot/bin/mysql_config" ]
then
# if there's mysql_config, that's the best route
mysqlconfig_locations="$mysqlroot/bin/mysql_config"
elif test [ -d "$mysqlroot/include" -a -d "$mysqlroot/lib" ]
then
# explicit root; do not check well-known paths
mysqlconfig_locations=
# includes
if test [ -d "$mysqlroot/include/mysql" ]
then
user_mysql_includes="$mysqlroot/include/mysql"
else
user_mysql_includes="$mysqlroot/include"
fi
# libs
if test [ -d "$mysqlroot/lib/mysql" ]
then
user_mysql_libs="$mysqlroot/lib/mysql"
else
user_mysql_libs="$mysqlroot/lib"
fi
else
AC_MSG_ERROR([invalid MySQL root directory '$mysqlroot'; neither bin/mysql_config, nor include/ and lib/ were found there])
fi
fi
# try running mysql_config
AC_MSG_CHECKING([for mysql_config])
for mysqlconfig in $mysqlconfig_locations
do
if test [ -n "$mysqlconfig" ]
then
MYSQL_CFLAGS=`${mysqlconfig} --cflags 2>/dev/null`
MYSQL_LIBS=`${mysqlconfig} --libs 2>/dev/null`
if test [ $? -eq 0 ]
then
AC_MSG_RESULT([$mysqlconfig])
mysqlconfig=
break
else
MYSQL_CFLAGS=
MYSQL_LIBS=
fi
fi
done
if test [ -n "$mysqlconfig" ]
then
mysqlconfig_used=
AC_MSG_RESULT([not found])
else
mysqlconfig_used=yes
fi
# if there's nothing from mysql_config, check well-known include paths
# explicit overrides will be applied later
if test [ -z "$MYSQL_CFLAGS" ]
then
for CANDIDATE in "$user_mysql_includes" "/usr/local/mysql/include" "/usr/local/mysql/include/mysql" \
"/usr/include/mysql"
do
if test [ -n "$CANDIDATE" -a -r "$CANDIDATE/mysql.h" ]
then
MYSQL_CFLAGS="-I$CANDIDATE"
break
fi
done
fi
# if there's nothing from mysql_config, check well-known library paths
# explicit overrides will be applied later
if test [ -z "$MYSQL_LIBS" ]
then
for CANDIDATE in "$user_mysql_libs" "/usr/lib64/mysql" \
"/usr/local/mysql/lib/mysql" "/usr/local/mysql/lib" \
"/usr/local/lib/mysql" "/usr/lib/mysql" \
"/opt/mysql/lib/mysql" "/usr/pkg/lib/mysql"
do
if test [ -n "$CANDIDATE" -a -d "$CANDIDATE" ]
then
MYSQL_LIBS="-L$CANDIDATE -lmysqlclient -lz"
break
fi
done
fi
# apply explicit include path overrides
AC_ARG_WITH([mysql-includes],
AC_HELP_STRING([--with-mysql-includes], [path to MySQL header files]),
[ac_cv_mysql_includes=$withval])
if test [ -n "$ac_cv_mysql_includes" ]
then
MYSQL_CFLAGS="-I$ac_cv_mysql_includes"
fi
# apply explicit lib path overrides
AC_ARG_WITH([mysql-libs],
AC_HELP_STRING([--with-mysql-libs], [path to MySQL libraries]),
[ac_cv_mysql_libs=$withval])
if test [ -n "$ac_cv_mysql_libs" ]
then
# Trim trailing '.libs' if user passed it in --with-mysql-libs option
ac_cv_mysql_libs=`echo ${ac_cv_mysql_libs} | sed -e 's/.libs$//' \
-e 's+.libs/$++'`
MYSQL_LIBS="-L$ac_cv_mysql_libs -lmysqlclient -lz"
fi
# if we got options from mysqlconfig try to actually use them
if test [ -n "$mysqlconfig_used" -a -n "$MYSQL_CFLAGS" -a -n "$MYSQL_LIBS" ]
then
_CFLAGS=$CFLAGS
_LIBS=$LIBS
CFLAGS="$CFLAGS $MYSQL_CFLAGS"
LIBS="$LIBS $MYSQL_LIBS"
AC_CHECK_FUNC(mysql_real_connect,[],
[
# if mysql binary was built using a different compiler and we
# got options from mysql_config some of them might not work
# with compiler we will be using
# throw away everything that isn't one of -D -L -I -l and retry
MYSQL_CFLAGS=`echo $MYSQL_CFLAGS | sed -e 's/-[[^DLIl]][[^ ]]*//g'`
MYSQL_LIBS=`echo $MYSQL_LIBS | sed -e 's/-[[^DLIl]][[^ ]]*//g'`
CFLAGS="$_CFLAGS $MYSQL_CFLAGS"
LIBS="$_LIBS $MYSQL_LIBS"
unset ac_cv_func_mysql_real_connect
AC_CHECK_FUNC(mysql_real_connect,[],
[
# ... that didn't help
# clear flags, the code below will complain
MYSQL_CFLAGS=
MYSQL_LIBS=
])
])
CFLAGS=$_CFLAGS
LIBS=$_LIBS
fi
# now that we did all we could, perform final checks
AC_MSG_CHECKING([MySQL include files])
if test [ -z "$MYSQL_CFLAGS" ]
then
AC_MSG_ERROR([missing include files.
******************************************************************************
ERROR: cannot find MySQL include files.
Check that you do have MySQL include files installed.
The package name is typically 'mysql-devel'.
If include files are installed on your system, but you are still getting
this message, you should do one of the following:
1) either specify includes location explicitly, using --with-mysql-includes;
2) or specify MySQL installation root location explicitly, using --with-mysql;
3) or make sure that the path to 'mysql_config' program is listed in
your PATH environment variable.
To disable MySQL support, use --without-mysql option.
******************************************************************************
])
else
AC_MSG_RESULT([$MYSQL_CFLAGS])
fi
AC_MSG_CHECKING([MySQL libraries])
if test [ -z "$MYSQL_LIBS" ]
then
AC_MSG_ERROR([missing libraries.
******************************************************************************
ERROR: cannot find MySQL libraries.
Check that you do have MySQL libraries installed.
The package name is typically 'mysql-devel'.
If libraries are installed on your system, but you are still getting
this message, you should do one of the following:
1) either specify libraries location explicitly, using --with-mysql-libs;
2) or specify MySQL installation root location explicitly, using --with-mysql;
3) or make sure that the path to 'mysql_config' program is listed in
your PATH environment variable.
To disable MySQL support, use --without-mysql option.
******************************************************************************
])
else
AC_MSG_RESULT([$MYSQL_LIBS])
fi
])
dnl ---------------------------------------------------------------------------
dnl Macro: AC_CHECK_PGSQL
dnl First check for custom PostgreSQL paths in --with-pgsql-* options.
dnl If some paths are missing, check if pg_config exists.
dnl ---------------------------------------------------------------------------
AC_DEFUN([AC_CHECK_PGSQL],[
# Check for custom includes path
if test [ -z "$ac_cv_pgsql_includes" ]
then
AC_ARG_WITH([pgsql-includes],
AC_HELP_STRING([--with-pgsql-includes], [path to PostgreSQL header files]),
[ac_cv_pgsql_includes=$withval])
fi
if test [ -n "$ac_cv_pgsql_includes" ]
then
AC_CACHE_CHECK([PostgreSQL includes], [ac_cv_pgsql_includes], [ac_cv_pgsql_includes=""])
PGSQL_CFLAGS="-I$ac_cv_pgsql_includes"
fi
# Check for custom library path
if test [ -z "$ac_cv_pgsql_libs" ]
then
AC_ARG_WITH([pgsql-libs],
AC_HELP_STRING([--with-pgsql-libs], [path to PostgreSQL libraries]),
[ac_cv_pgsql_libs=$withval])
fi
if test [ -n "$ac_cv_pgsql_libs" ]
then
AC_CACHE_CHECK([PostgreSQL libraries], [ac_cv_pgsql_libs], [ac_cv_pgsql_libs=""])
PGSQL_LIBS="-L$ac_cv_pgsql_libs -lpq"
fi
# If some path is missing, try to autodetermine with pgsql_config
if test [ -z "$ac_cv_pgsql_includes" -o -z "$ac_cv_pgsql_libs" ]
then
if test [ -z "$pgconfig" ]
then
AC_PATH_PROG(pgconfig,pg_config)
fi
if test [ -z "$pgconfig" ]
then
AC_MSG_ERROR([pg_config executable not found
********************************************************************************
ERROR: cannot find PostgreSQL libraries. If you want to compile with PostgreSQL support,
you must either specify file locations explicitly using
--with-pgsql-includes and --with-pgsql-libs options, or make sure path to
pg_config is listed in your PATH environment variable. If you want to
disable PostgreSQL support, use --without-pgsql option.
********************************************************************************
])
else
if test [ -z "$ac_cv_pgsql_includes" ]
then
AC_MSG_CHECKING(PostgreSQL C flags)
PGSQL_CFLAGS="-I`${pgconfig} --includedir`"
AC_MSG_RESULT($PGSQL_CFLAGS)
fi
if test [ -z "$ac_cv_pgsql_libs" ]
then
AC_MSG_CHECKING(PostgreSQL linker flags)
PGSQL_LIBS="-L`${pgconfig} --libdir` -lpq"
AC_MSG_RESULT($PGSQL_LIBS)
fi
fi
fi
])
dnl ---------------------------------------------------------------------------
dnl Macro: AC_CHECK_MMSEG
dnl First check for custom libmmseg paths in --with-mmseg-* options.
dnl If some paths are missing, check well-known locations.
dnl ---------------------------------------------------------------------------
AC_DEFUN([AC_CHECK_MMSEG],[
# if nothing is set yet, check well-known include paths
# explicit overrides will be applied later
if test [ -z "$MMSEG_CFLAGS" ]
then
for CANDIDATE in "$user_mmseg_includes" "/usr/local/include/mmseg" "/usr/include/mmseg"
do
if test [ -n "$CANDIDATE" -a -r "$CANDIDATE/Segmenter.h" ]
then
MMSEG_CFLAGS="-I$CANDIDATE"
break
fi
done
fi
# if nothing is set yet, check well-known library paths
# explicit overrides will be applied later
if test [ -z "$MMSEG_LIBS" ]
then
for CANDIDATE in "$user_mmseg_libs" "/usr/lib64" \
"/usr/local/lib" "/usr/local/mmseg/lib" \
"/usr/local/lib/mmseg" "/usr/lib" \
"/opt/mmseg/lib"
do
if test [ -n "$CANDIDATE" -a -d "$CANDIDATE" ]
then
MMSEG_LIBS="-L$CANDIDATE -lmmseg"
break
fi
done
fi
# apply explicit include path overrides
AC_ARG_WITH([mmseg-includes],
AC_HELP_STRING([--with-mmseg-includes], [path to libmmseg header files]),
[ac_cv_mmseg_includes=$withval])
if test [ -n "$ac_cv_mmseg_includes" ]
then
MMSEG_CFLAGS="-I$ac_cv_mmseg_includes"
fi
# apply explicit lib path overrides
AC_ARG_WITH([mmseg-libs],
AC_HELP_STRING([--with-mmseg-libs], [path to libmmseg libraries]),
[ac_cv_mmseg_libs=$withval])
if test [ -n "$ac_cv_mmseg_libs" ]
then
# Trim trailing '.libs' if user passed it in --with-mmseg-libs option
ac_cv_mmseg_libs=`echo ${ac_cv_mmseg_libs} | sed -e 's/.libs$//' \
-e 's+.libs/$++'`
MMSEG_LIBS="-L$ac_cv_mmseg_libs -lmmseg"
fi
# now that we did all we could, perform final checks
AC_MSG_CHECKING([libmmseg include files])
if test [ -z "$MMSEG_CFLAGS" ]
then
AC_MSG_ERROR([missing include files.
******************************************************************************
ERROR: cannot find libmmseg include files.
To disable libmmseg support, use --without-mmseg option.
******************************************************************************
])
else
AC_MSG_RESULT([$MMSEG_CFLAGS])
fi
AC_MSG_CHECKING([libmmseg libraries])
if test [ -z "$MMSEG_LIBS" ]
then
AC_MSG_ERROR([missing libraries.
******************************************************************************
ERROR: cannot find libmmseg libraries.
To disable libmmseg support, use --without-mmseg option.
******************************************************************************
])
else
AC_MSG_RESULT([$MMSEG_LIBS])
fi
])
dnl ---------------------------------------------------------------------------
dnl Macro: SPHINX_CONFIGURE_PART
dnl
dnl Tells what stage is ./configure running now, nicely formatted
dnl ---------------------------------------------------------------------------
dnl SPHINX_CONFIGURE_PART(MESSAGE)
AC_DEFUN([SPHINX_CONFIGURE_PART],[
AC_MSG_RESULT()
AC_MSG_RESULT([$1])
TMP=`echo $1 | sed -e sX.X-Xg`
AC_MSG_RESULT([$TMP])
AC_MSG_RESULT()
])
dnl ---------------------------------------------------------------------------
dnl Macro: SPHINX_CHECK_DEFINE
dnl
dnl Checks if this symbol is defined in that header file
dnl ---------------------------------------------------------------------------
AC_DEFUN([SPHINX_CHECK_DEFINE],[
AC_CACHE_CHECK([for $1 in $2],ac_cv_define_$1,[
AC_EGREP_CPP(YES_IS_DEFINED, [
#include <$2>
#ifdef $1
YES_IS_DEFINED
#endif
], ac_cv_define_$1=yes, ac_cv_define_$1=no)
])
if test "$ac_cv_define_$1" = "yes"; then
AC_DEFINE(HAVE_$1, 1, [Define if $1 is defined in $2])
fi
])

coreseek/csft-4.1/aclocal.m4 vendored Normal file

File diff suppressed because it is too large


@ -0,0 +1,2 @@
Manifest-Version: 1.0
Main-Class: org.sphx.api.test


@ -0,0 +1,33 @@
#
# $Id$
#
# Makefile to automate sphinxapi.jar source builds
#
# order matters; full rebuild is always performed; but it somehow works
SOURCES = \
SphinxMatch.java \
SphinxException.java \
SphinxWordInfo.java \
SphinxResult.java \
SphinxClient.java \
test.java
CLASSES = $(SOURCES:.java=.class)
all : sphinxapi.jar
clean:
rm -fr org
rm -f sphinxapi.jar
sphinxapi.jar: $(CLASSES)
jar cfm sphinxapi.jar MANIFEST.MF org/sphx/api
.SUFFIXES: .java .class
vpath %.class org/sphx/api
.java.class:
javac -classpath . -d . $<
#
# $Id$
#


@ -0,0 +1,26 @@
Sphinx Java API notes
----------------------
0) THIS IS A WORK IN PROGRESS. COMPATIBILITY-BREAKING CLASS INTERFACE
CHANGES STILL MIGHT BE PERFORMED. SUGGESTIONS ARE WELCOME.
1) Officially supported JDKs are 1.5 and above.
2) The code would probably build with prior JDK versions as well,
but since JDK 1.4 is already in End-Of-Life transition period,
this could be gradually dropped.
3) To build `sphinxapi.jar':
- make sure that `javac' and `jar' are in PATH
- make sure JAVA_HOME is properly set
- issue `make'
4) To run sample client program:
- issue `java -jar sphinxapi.jar'
5) Warnings about "unchecked" mode on 1.5+ are caused by keeping
the code compatible with 1.4. Fix suggestions are welcome.
--eof--

File diff suppressed because it is too large


@ -0,0 +1,24 @@
/*
* $Id$
*/
package org.sphx.api;
/** Exception thrown on attempts to pass invalid arguments to Sphinx API methods. */
public class SphinxException extends Exception
{
/** Trivial constructor. */
public SphinxException()
{
}
/** Constructor from error message string. */
public SphinxException ( String message )
{
super ( message );
}
}
/*
* $Id$
*/


@ -0,0 +1,35 @@
/*
* $Id$
*/
package org.sphx.api;
import java.util.*;
/**
* Matched document information, as in search result.
*/
public class SphinxMatch
{
/** Matched document ID. */
public long docId;
/** Matched document weight. */
public int weight;
/** Matched document attribute values. */
public ArrayList attrValues;
/** Trivial constructor. */
public SphinxMatch ( long docId, int weight )
{
this.docId = docId;
this.weight = weight;
this.attrValues = new ArrayList();
}
}
/*
* $Id$
*/


@ -0,0 +1,75 @@
/*
* $Id$
*/
package org.sphx.api;
/**
* Search result set.
*
* Includes retrieved matches array, status code and error/warning messages,
* query stats, and per-word stats.
*/
public class SphinxResult
{
/** Full-text field names. */
public String[] fields;
/** Attribute names. */
public String[] attrNames;
/** Attribute types (refer to SPH_ATTR_xxx constants in SphinxClient). */
public int[] attrTypes;
/** Retrieved matches. */
public SphinxMatch[] matches;
/** Total matches in this result set. */
public int total;
/** Total matches found in the index(es). */
public int totalFound;
/** Elapsed time (as reported by searchd), in seconds. */
public float time;
/** Per-word statistics. */
public SphinxWordInfo[] words;
/** Warning message, if any. */
public String warning = null;
/** Error message, if any. */
public String error = null;
/** Query status (refer to SEARCHD_xxx constants in SphinxClient). */
private int status = -1;
/** Trivial constructor, initializes an empty result set. */
public SphinxResult()
{
this.attrNames = new String[0];
this.matches = new SphinxMatch[0];
this.words = new SphinxWordInfo[0];
this.fields = new String[0];
this.attrTypes = new int[0];
}
/** Get query status. */
public int getStatus()
{
return status;
}
/** Set query status (accessible from API package only). */
void setStatus ( int status )
{
this.status = status;
}
}
/*
* $Id$
*/


@ -0,0 +1,30 @@
/*
* $Id$
*/
package org.sphx.api;
/** Per-word statistics class. */
public class SphinxWordInfo
{
/** Word form as returned from search daemon, stemmed or otherwise postprocessed. */
public String word;
/** Total amount of matching documents in collection. */
public long docs;
/** Total amount of hits (occurrences) in collection. */
public long hits;
/** Trivial constructor. */
public SphinxWordInfo ( String word, long docs, long hits )
{
this.word = word;
this.docs = docs;
this.hits = hits;
}
}
/*
* $Id$
*/


@ -0,0 +1,3 @@
@echo off
javac -cp . -d . *.java
jar cfm sphinxapi.jar MANIFEST.MF org/sphx/api


@ -0,0 +1 @@
@javadoc *.java -d doc


@ -0,0 +1,164 @@
/*
* $Id$
*/
package org.sphx.api;
import java.util.*;
/**
* Test class for sphinx API
*/
public class test
{
public static void main ( String[] argv ) throws SphinxException
{
if ( argv==null || argv.length<1 )
{
System.out.print ( "Usage: java -jar sphinxapi.jar [OPTIONS] query words\n\n" );
System.out.print ( "Options are:\n" );
System.out.print ( "-h, --host <HOST>\tconnect to searchd at host HOST\n" );
System.out.print ( "-p, --port\t\tconnect to searchd at port PORT\n" );
System.out.print ( "-i, --index <IDX>\tsearch through index(es) specified by IDX\n" );
System.out.print ( "-s, --sortby <CLAUSE>\tsort matches by 'CLAUSE' in sort_extended mode\n" );
System.out.print ( "-S, --sortexpr <EXPR>\tsort matches by 'EXPR' DESC in sort_expr mode\n" );
System.out.print ( "-a, --any\t\tuse 'match any word' matching mode\n" );
System.out.print ( "-b, --boolean\t\tuse 'boolean query' matching mode\n" );
System.out.print ( "-e, --extended\t\tuse 'extended query' matching mode\n" );
System.out.print ( "-ph,--phrase\t\tuse 'exact phrase' matching mode\n" );
// System.out.print ( "-f, --filter <ATTR>\tfilter by attribute 'ATTR' (default is 'group_id')\n" );
// System.out.print ( "-v, --value <VAL>\tadd VAL to allowed 'group_id' values list\n" );
System.out.print ( "-g, --groupby <EXPR>\tgroup matches by 'EXPR'\n" );
System.out.print ( "-gs,--groupsort <EXPR>\tsort groups by 'EXPR'\n" );
// System.out.print ( "-d, --distinct <ATTR>\tcount distinct values of 'ATTR''\n" );
System.out.print ( "-l, --limit <COUNT>\tretrieve COUNT matches (default: 20)\n" );
System.out.print ( "-ga, --geoanchor <LATATTR> <LONGATTR> <LAT> <LONG>\n" );
System.out.print ( "\t\t\tset anchor for geodistance\n" );
System.out.print ( "--select <EXPRS>\tselect the listed expressions only\n" );
System.exit ( 0 );
}
StringBuffer q = new StringBuffer();
String host = "localhost";
int port = 9312;
int mode = SphinxClient.SPH_MATCH_ALL;
String index = "*";
int offset = 0;
int limit = 20;
int sortMode = SphinxClient.SPH_SORT_RELEVANCE;
String sortClause = "";
String groupBy = "";
String groupSort = "";
SphinxClient cl = new SphinxClient();
/* parse arguments */
if ( argv!=null)
for ( int i=0; i<argv.length; i++ )
{
String arg = argv[i];
if ( "-h".equals(arg) || "--host".equals(arg) ) host = argv[++i];
else if ( "-p".equals(arg) || "--port".equals(arg) ) port = Integer.parseInt ( argv[++i] );
else if ( "-i".equals(arg) || "--index".equals(arg) ) index = argv[++i];
else if ( "-s".equals(arg) || "--sortby".equals(arg) ) { sortMode = SphinxClient.SPH_SORT_EXTENDED; sortClause = argv[++i]; }
else if ( "-S".equals(arg) || "--sortexpr".equals(arg) ) { sortMode = SphinxClient.SPH_SORT_EXPR; sortClause = argv[++i]; }
else if ( "-a".equals(arg) || "--any".equals(arg) ) mode = SphinxClient.SPH_MATCH_ANY;
else if ( "-b".equals(arg) || "--boolean".equals(arg) ) mode = SphinxClient.SPH_MATCH_BOOLEAN;
else if ( "-e".equals(arg) || "--extended".equals(arg) ) mode = SphinxClient.SPH_MATCH_EXTENDED;
else if ( "-ph".equals(arg)|| "--phrase".equals(arg) ) mode = SphinxClient.SPH_MATCH_PHRASE;
else if ( "-e2".equals(arg) ) mode = SphinxClient.SPH_MATCH_EXTENDED2;
else if ( "-g".equals(arg) || "--group".equals(arg) ) groupBy = argv[++i];
else if ( "-gs".equals(arg)|| "--groupsort".equals(arg) ) groupSort = argv[++i];
else if ( "-o".equals(arg) || "--offset".equals(arg) ) offset = Integer.parseInt(argv[++i]);
else if ( "-l".equals(arg) || "--limit".equals(arg) ) limit = Integer.parseInt(argv[++i]);
else if ( "-ga".equals(arg)|| "--geoanchor".equals(arg) ) cl.SetGeoAnchor ( argv[++i], argv[++i], Float.parseFloat(argv[++i]), Float.parseFloat(argv[++i]) );
else if ( "--select".equals(arg) ) cl.SetSelect ( argv[++i] );
else q.append ( argv[i] ).append ( " " );
}
cl.SetServer ( host, port );
cl.SetWeights ( new int[] { 100, 1 } );
cl.SetMatchMode ( mode );
cl.SetLimits ( offset, limit );
cl.SetSortMode ( sortMode, sortClause );
if ( groupBy.length()>0 )
cl.SetGroupBy ( groupBy, SphinxClient.SPH_GROUPBY_ATTR, groupSort );
SphinxResult res = cl.Query(q.toString(), index);
if ( res==null )
{
System.err.println ( "Error: " + cl.GetLastError() );
System.exit ( 1 );
}
if ( cl.GetLastWarning()!=null && cl.GetLastWarning().length()>0 )
System.out.println ( "WARNING: " + cl.GetLastWarning() + "\n" );
/* print me out */
System.out.println ( "Query '" + q + "' retrieved " + res.total + " of " + res.totalFound + " matches in " + res.time + " sec." );
System.out.println ( "Query stats:" );
for ( int i=0; i<res.words.length; i++ )
{
SphinxWordInfo wordInfo = res.words[i];
System.out.println ( "\t'" + wordInfo.word + "' found " + wordInfo.hits + " times in " + wordInfo.docs + " documents" );
}
System.out.println ( "\nMatches:" );
for ( int i=0; i<res.matches.length; i++ )
{
SphinxMatch info = res.matches[i];
System.out.print ( (i+1) + ". id=" + info.docId + ", weight=" + info.weight );
if ( res.attrNames==null || res.attrTypes==null )
continue;
for ( int a=0; a<res.attrNames.length; a++ )
{
System.out.print ( ", " + res.attrNames[a] + "=" );
if ( res.attrTypes[a]==SphinxClient.SPH_ATTR_MULTI || res.attrTypes[a]==SphinxClient.SPH_ATTR_MULTI64 )
{
System.out.print ( "(" );
long[] attrM = (long[]) info.attrValues.get(a);
if ( attrM!=null )
for ( int j=0; j<attrM.length; j++ )
{
if ( j!=0 )
System.out.print ( "," );
System.out.print ( attrM[j] );
}
System.out.print ( ")" );
} else
{
switch ( res.attrTypes[a] )
{
case SphinxClient.SPH_ATTR_INTEGER:
case SphinxClient.SPH_ATTR_ORDINAL:
case SphinxClient.SPH_ATTR_FLOAT:
case SphinxClient.SPH_ATTR_BIGINT:
case SphinxClient.SPH_ATTR_STRING:
/* ints, longs, floats, strings.. print as is */
System.out.print ( info.attrValues.get(a) );
break;
case SphinxClient.SPH_ATTR_TIMESTAMP:
Long iStamp = (Long) info.attrValues.get(a);
Date date = new Date ( iStamp.longValue()*1000 );
System.out.print ( date.toString() );
break;
default:
System.out.print ( "(unknown-attr-type=" + res.attrTypes[a] + ")" );
}
}
}
System.out.println();
}
}
}
/*
* $Id$
*/


@ -0,0 +1,481 @@
GNU LIBRARY GENERAL PUBLIC LICENSE
Version 2, June 1991
Copyright (C) 1991 Free Software Foundation, Inc.
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
[This is the first released version of the library GPL. It is
numbered 2 because it goes with version 2 of the ordinary GPL.]
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
Licenses are intended to guarantee your freedom to share and change
free software--to make sure the software is free for all its users.
This license, the Library General Public License, applies to some
specially designated Free Software Foundation software, and to any
other libraries whose authors decide to use it. You can use it for
your libraries, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if
you distribute copies of the library, or if you modify it.
For example, if you distribute copies of the library, whether gratis
or for a fee, you must give the recipients all the rights that we gave
you. You must make sure that they, too, receive or can get the source
code. If you link a program with the library, you must provide
complete object files to the recipients so that they can relink them
with the library, after making changes to the library and recompiling
it. And you must show them these terms so they know their rights.
Our method of protecting your rights has two steps: (1) copyright
the library, and (2) offer you this license which gives you legal
permission to copy, distribute and/or modify the library.
Also, for each distributor's protection, we want to make certain
that everyone understands that there is no warranty for this free
library. If the library is modified by someone else and passed on, we
want its recipients to know that what they have is not the original
version, so that any problems introduced by others will not reflect on
the original authors' reputations.
Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that companies distributing free
software will individually obtain patent licenses, thus in effect
transforming the program into proprietary software. To prevent this,
we have made it clear that any patent must be licensed for everyone's
free use or not licensed at all.
Most GNU software, including some libraries, is covered by the ordinary
GNU General Public License, which was designed for utility programs. This
license, the GNU Library General Public License, applies to certain
designated libraries. This license is quite different from the ordinary
one; be sure to read it in full, and don't assume that anything in it is
the same as in the ordinary license.
The reason we have a separate public license for some libraries is that
they blur the distinction we usually make between modifying or adding to a
program and simply using it. Linking a program with a library, without
changing the library, is in some sense simply using the library, and is
analogous to running a utility program or application program. However, in
a textual and legal sense, the linked executable is a combined work, a
derivative of the original library, and the ordinary General Public License
treats it as such.
Because of this blurred distinction, using the ordinary General
Public License for libraries did not effectively promote software
sharing, because most developers did not use the libraries. We
concluded that weaker conditions might promote sharing better.
However, unrestricted linking of non-free programs would deprive the
users of those programs of all benefit from the free status of the
libraries themselves. This Library General Public License is intended to
permit developers of non-free programs to use free libraries, while
preserving your freedom as a user of such programs to change the free
libraries that are incorporated in them. (We have not seen how to achieve
this as regards changes in header files, but we have achieved it as regards
changes in the actual functions of the Library.) The hope is that this
will lead to faster development of free libraries.
The precise terms and conditions for copying, distribution and
modification follow. Pay close attention to the difference between a
"work based on the library" and a "work that uses the library". The
former contains code derived from the library, while the latter only
works together with the library.
Note that it is possible for a library to be covered by the ordinary
General Public License rather than by this special one.
GNU LIBRARY GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License Agreement applies to any software library which
contains a notice placed by the copyright holder or other authorized
party saying it may be distributed under the terms of this Library
General Public License (also called "this License"). Each licensee is
addressed as "you".
A "library" means a collection of software functions and/or data
prepared so as to be conveniently linked with application programs
(which use some of those functions and data) to form executables.
The "Library", below, refers to any such software library or work
which has been distributed under these terms. A "work based on the
Library" means either the Library or any derivative work under
copyright law: that is to say, a work containing the Library or a
portion of it, either verbatim or with modifications and/or translated
straightforwardly into another language. (Hereinafter, translation is
included without limitation in the term "modification".)
"Source code" for a work means the preferred form of the work for
making modifications to it. For a library, complete source code means
all the source code for all modules it contains, plus any associated
interface definition files, plus the scripts used to control compilation
and installation of the library.
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running a program using the Library is not restricted, and output from
such a program is covered only if its contents constitute a work based
on the Library (independent of the use of the Library in a tool for
writing it). Whether that is true depends on what the Library does
and what the program that uses the Library does.
1. You may copy and distribute verbatim copies of the Library's
complete source code as you receive it, in any medium, provided that
you conspicuously and appropriately publish on each copy an
appropriate copyright notice and disclaimer of warranty; keep intact
all the notices that refer to this License and to the absence of any
warranty; and distribute a copy of this License along with the
Library.
You may charge a fee for the physical act of transferring a copy,
and you may at your option offer warranty protection in exchange for a
fee.
2. You may modify your copy or copies of the Library or any portion
of it, thus forming a work based on the Library, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) The modified work must itself be a software library.
b) You must cause the files modified to carry prominent notices
stating that you changed the files and the date of any change.
c) You must cause the whole of the work to be licensed at no
charge to all third parties under the terms of this License.
d) If a facility in the modified Library refers to a function or a
table of data to be supplied by an application program that uses
the facility, other than as an argument passed when the facility
is invoked, then you must make a good faith effort to ensure that,
in the event an application does not supply such function or
table, the facility still operates, and performs whatever part of
its purpose remains meaningful.
(For example, a function in a library to compute square roots has
a purpose that is entirely well-defined independent of the
application. Therefore, Subsection 2d requires that any
application-supplied function or table used by this function must
be optional: if the application does not supply it, the square
root function must still compute square roots.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Library,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Library, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote
it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Library.
In addition, mere aggregation of another work not based on the Library
with the Library (or with a work based on the Library) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may opt to apply the terms of the ordinary GNU General Public
License instead of this License to a given copy of the Library. To do
this, you must alter all the notices that refer to this License, so
that they refer to the ordinary GNU General Public License, version 2,
instead of to this License. (If a newer version than version 2 of the
ordinary GNU General Public License has appeared, then you can specify
that version instead if you wish.) Do not make any other change in
these notices.
Once this change is made in a given copy, it is irreversible for
that copy, so the ordinary GNU General Public License applies to all
subsequent copies and derivative works made from that copy.
This option is useful when you wish to copy part of the code of
the Library into a program that is not a library.
4. You may copy and distribute the Library (or a portion or
derivative of it, under Section 2) in object code or executable form
under the terms of Sections 1 and 2 above provided that you accompany
it with the complete corresponding machine-readable source code, which
must be distributed under the terms of Sections 1 and 2 above on a
medium customarily used for software interchange.
If distribution of object code is made by offering access to copy
from a designated place, then offering equivalent access to copy the
source code from the same place satisfies the requirement to
distribute the source code, even though third parties are not
compelled to copy the source along with the object code.
5. A program that contains no derivative of any portion of the
Library, but is designed to work with the Library by being compiled or
linked with it, is called a "work that uses the Library". Such a
work, in isolation, is not a derivative work of the Library, and
therefore falls outside the scope of this License.
However, linking a "work that uses the Library" with the Library
creates an executable that is a derivative of the Library (because it
contains portions of the Library), rather than a "work that uses the
library". The executable is therefore covered by this License.
Section 6 states terms for distribution of such executables.
When a "work that uses the Library" uses material from a header file
that is part of the Library, the object code for the work may be a
derivative work of the Library even though the source code is not.
Whether this is true is especially significant if the work can be
linked without the Library, or if the work is itself a library. The
threshold for this to be true is not precisely defined by law.
If such an object file uses only numerical parameters, data
structure layouts and accessors, and small macros and small inline
functions (ten lines or less in length), then the use of the object
file is unrestricted, regardless of whether it is legally a derivative
work. (Executables containing this object code plus portions of the
Library will still fall under Section 6.)
Otherwise, if the work is a derivative of the Library, you may
distribute the object code for the work under the terms of Section 6.
Any executables containing that work also fall under Section 6,
whether or not they are linked directly with the Library itself.
6. As an exception to the Sections above, you may also compile or
link a "work that uses the Library" with the Library to produce a
work containing portions of the Library, and distribute that work
under terms of your choice, provided that the terms permit
modification of the work for the customer's own use and reverse
engineering for debugging such modifications.
You must give prominent notice with each copy of the work that the
Library is used in it and that the Library and its use are covered by
this License. You must supply a copy of this License. If the work
during execution displays copyright notices, you must include the
copyright notice for the Library among them, as well as a reference
directing the user to the copy of this License. Also, you must do one
of these things:
a) Accompany the work with the complete corresponding
machine-readable source code for the Library including whatever
changes were used in the work (which must be distributed under
Sections 1 and 2 above); and, if the work is an executable linked
with the Library, with the complete machine-readable "work that
uses the Library", as object code and/or source code, so that the
user can modify the Library and then relink to produce a modified
executable containing the modified Library. (It is understood
that the user who changes the contents of definitions files in the
Library will not necessarily be able to recompile the application
to use the modified definitions.)
b) Accompany the work with a written offer, valid for at
least three years, to give the same user the materials
specified in Subsection 6a, above, for a charge no more
than the cost of performing this distribution.
c) If distribution of the work is made by offering access to copy
from a designated place, offer equivalent access to copy the above
specified materials from the same place.
d) Verify that the user has already received a copy of these
materials or that you have already sent this user a copy.
For an executable, the required form of the "work that uses the
Library" must include any data and utility programs needed for
reproducing the executable from it. However, as a special exception,
the source code distributed need not include anything that is normally
distributed (in either source or binary form) with the major
components (compiler, kernel, and so on) of the operating system on
which the executable runs, unless that component itself accompanies
the executable.
It may happen that this requirement contradicts the license
restrictions of other proprietary libraries that do not normally
accompany the operating system. Such a contradiction means you cannot
use both them and the Library together in an executable that you
distribute.
7. You may place library facilities that are a work based on the
Library side-by-side in a single library together with other library
facilities not covered by this License, and distribute such a combined
library, provided that the separate distribution of the work based on
the Library and of the other library facilities is otherwise
permitted, and provided that you do these two things:
a) Accompany the combined library with a copy of the same work
based on the Library, uncombined with any other library
facilities. This must be distributed under the terms of the
Sections above.
b) Give prominent notice with the combined library of the fact
that part of it is a work based on the Library, and explaining
where to find the accompanying uncombined form of the same work.
8. You may not copy, modify, sublicense, link with, or distribute
the Library except as expressly provided under this License. Any
attempt otherwise to copy, modify, sublicense, link with, or
distribute the Library is void, and will automatically terminate your
rights under this License. However, parties who have received copies,
or rights, from you under this License will not have their licenses
terminated so long as such parties remain in full compliance.
9. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Library or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Library (or any work based on the
Library), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Library or works based on it.
10. Each time you redistribute the Library (or any work based on the
Library), the recipient automatically receives a license from the
original licensor to copy, distribute, link with or modify the Library
subject to these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.
11. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Library at all. For example, if a patent
license would not permit royalty-free redistribution of the Library by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Library.
If any portion of this section is held invalid or unenforceable under any
particular circumstance, the balance of the section is intended to apply,
and the section as a whole is intended to apply in other circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
12. If the distribution and/or use of the Library is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Library under this License may add
an explicit geographical distribution limitation excluding those countries,
so that distribution is permitted only in or among countries not thus
excluded. In such case, this License incorporates the limitation as if
written in the body of this License.
13. The Free Software Foundation may publish revised and/or new
versions of the Library General Public License from time to time.
Such new versions will be similar in spirit to the present version,
but may differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Library
specifies a version number of this License which applies to it and
"any later version", you have the option of following the terms and
conditions either of that version or of any later version published by
the Free Software Foundation. If the Library does not specify a
license version number, you may choose any version ever published by
the Free Software Foundation.
14. If you wish to incorporate parts of the Library into other free
programs whose distribution conditions are incompatible with these,
write to the author to ask for permission. For software which is
copyrighted by the Free Software Foundation, write to the Free
Software Foundation; we sometimes make exceptions for this. Our
decision will be guided by the two goals of preserving the free status
of all derivatives of our free software and of promoting the sharing
and reuse of software generally.
NO WARRANTY
15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO
WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR
OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY
KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE
LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME
THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN
WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY
AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU
FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR
CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE
LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING
RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A
FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF
SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH
DAMAGES.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Libraries
If you develop a new library, and you want it to be of the greatest
possible use to the public, we recommend making it free software that
everyone can redistribute and change. You can do so by permitting
redistribution under these terms (or, alternatively, under the terms of the
ordinary General Public License).
To apply these terms, attach the following notices to the library. It is
safest to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least the
"copyright" line and a pointer to where the full notice is found.
<one line to give the library's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Library General Public
License as published by the Free Software Foundation; either
version 2 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Library General Public License for more details.
You should have received a copy of the GNU Library General Public
License along with this library; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
Also add information on how to contact you by electronic and paper mail.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the library, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the
library `Frob' (a library for tweaking knobs) written by James Random Hacker.
<signature of Ty Coon>, 1 April 1990
Ty Coon, President of Vice
That's all there is to it!


@@ -0,0 +1,13 @@
AUTOMAKE_OPTIONS = foreign no-dependencies
lib_LTLIBRARIES = libsphinxclient.la
noinst_PROGRAMS = test
libsphinxclient_la_SOURCES = sphinxclient.c
test_SOURCES = test.c
libsphinxclient_la_LIBADD = @LTLIBOBJS@
libsphinxclient_la_LDFLAGS = -release @VERSION@
include_HEADERS = sphinxclient.h
test_LDADD = .libs/libsphinxclient.a


@@ -0,0 +1,51 @@
Pure C searchd client API library
Sphinx search engine, http://sphinxsearch.com/
API notes
----------
1. API can either copy the contents of passed pointer arguments,
or rely on the application that the pointer will not become invalid.
This is controlled on a per-client basis; see the 'copy_args' argument
to the sphinx_create() call.
When 'copy_args' is true, API will create and manage a copy of every
string and array passed to it. This causes additional malloc() pressure,
but makes calling code easier to write.
When 'copy_args' is false, API expects that pointers passed to
sphinx_set_xxx() calls will still be valid at the time when sphinx_query()
or sphinx_add_query() are called.
Rule of thumb: when 'copy_args' is false, do not free query arguments
until you have the search result. Example code for that case:
VALID CODE:
char * my_filter_name;
my_filter_name = malloc ( 256 );
strncpy ( my_filter_name, "test", 256 );
sphinx_add_filter_range ( client, my_filter_name, 10, 20, false );
result = sphinx_query ( client );
free ( my_filter_name );
my_filter_name = NULL;
INVALID CODE:
void setup_my_filter ( sphinx_client * client )
{
char buffer[256];
strncpy ( buffer, "test", sizeof(buffer) );
// INVALID! by the time when sphinx_query() is called,
// buffer will be out of scope
sphinx_add_filter_range ( client, buffer, 10, 20, false );
}
setup_my_filter ( client );
result = sphinx_query ( client );
--eof--


@@ -0,0 +1,37 @@
SUPPRESS_WARNINGS = 2>&1 | (egrep -v '(AC_TRY_RUN called without default to allow cross compiling|AC_PROG_CXXCPP was called before AC_PROG_CXX|defined in acinclude.m4 but never used|AC_PROG_LEX invoked multiple times|AC_DECL_YYTEXT is expanded from...|the top level)'||true)
AUTOCONF ?= 'autoconf'
ACLOCAL ?= 'aclocal'
AUTOHEADER ?= 'autoheader'
AUTOMAKE ?= 'automake'
AUTOUPDATE ?= 'autoupdate'
LIBTOOLIZE ?= 'libtoolize'
config_h_in = sphinxclient_config.h.in
targets = $(config_h_in) configure makefiles
all: $(targets)
aclocal.m4:
$(ACLOCAL)
$(config_h_in): configure
@echo rebuilding $@
@rm -f $@
$(AUTOHEADER) $(SUPPRESS_WARNINGS)
configure: aclocal.m4 configure.in
@echo rebuilding $@
$(LIBTOOLIZE) --copy
$(AUTOCONF) $(SUPPRESS_WARNINGS)
makefiles: configure Makefile.am
@echo rebuilding Makefile.in files
$(AUTOMAKE) --add-missing --copy
cvsclean:
@rm -rf *.lo *.la *.o *.a .libs Makefile Makefile.in stamp-h1 test sphinxclient_config.h*
rm -rf aclocal.m4 autom4te.cache install.sh libtool Makefile Makefile.in 'configure.in~' missing config.h* configure
rm -f config.guess config.log config.status config.sub cscope.out install-sh ltmain.sh


@@ -0,0 +1,38 @@
#!/bin/sh
eval `grep '^EXTRA_VERSION=' configure.in`
case "$EXTRA_VERSION" in
*-dev)
rebuildok=1
;;
*)
rebuildok=0
;;
esac
cvsclean=0
while test $# -gt 0; do
if test "$1" = "--force"; then
rebuildok=1
echo "Forcing buildconf"
fi
if test "$1" = "--clean"; then
cvsclean=1
fi
shift
done
if test "$rebuildok" = "0"; then
echo "You should not run buildconf in a release package."
echo "use buildconf --force to override this check."
exit 1
fi
if test "$cvsclean" = "1"; then
echo "Cleaning autogenerated files"
${MAKE:-make} -s -f build.mk cvsclean
else
${MAKE:-make} -s -f build.mk
fi
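The first line of buildconf.sh uses a grep-and-eval trick to import a single variable assignment from configure.in into the shell: grep extracts the literal line `EXTRA_VERSION="-dev"`, and eval executes it as shell code. A minimal reproduction of that mechanism (the sample file path is invented for the demo):

```shell
#!/bin/sh
# Write a sample file containing the assignment buildconf.sh looks for.
printf 'MAJOR_VERSION=0\nEXTRA_VERSION="-dev"\n' > configure.in.demo

# grep pulls out the one matching line; eval defines the variable here.
eval `grep '^EXTRA_VERSION=' configure.in.demo`

echo "$EXTRA_VERSION"

# Same release check as the script above: "-dev" suffix allows rebuilds.
case "$EXTRA_VERSION" in
  *-dev) echo "development tree: rebuild allowed" ;;
  *)     echo "release package: rebuild blocked"  ;;
esac
rm -f configure.in.demo
```

Note that eval executes whatever the matched line contains, so the trick is only safe on trusted input files like the project's own configure.in.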

(file diff suppressed because it is too large)

coreseek/csft-4.1/api/libsphinxclient/config.sub (vendored executable file, 1500 lines)
(file diff suppressed because it is too large)

coreseek/csft-4.1/api/libsphinxclient/configure (vendored executable file, 20207 lines)
(file diff suppressed because it is too large)
@@ -0,0 +1,75 @@
dnl this file has to be processed by autoconf
AC_PREREQ(2.59)
MAJOR_VERSION=0
MINOR_VERSION=0
BUGFIX_VERSION=1
EXTRA_VERSION="-dev"
LIBSPHINXCLIENT_VERSION="${MAJOR_VERSION}.${MINOR_VERSION}.${BUGFIX_VERSION}${EXTRA_VERSION}"
AC_INIT([libsphinxclient],[0.0.1])
AC_CONFIG_SRCDIR([README])
AC_CONFIG_SRCDIR(sphinxclient.h)
AC_CONFIG_HEADERS(sphinxclient_config.h)
AM_INIT_AUTOMAKE([no-define])
AM_MAINTAINER_MODE
dnl Checks for programs.
AC_PROG_CC
AC_PROG_LD
m4_undefine([AC_PROG_CXX])
m4_defun([AC_PROG_CXX],[])
m4_undefine([AC_PROG_F77])
m4_defun([AC_PROG_F77],[])
AM_PROG_LIBTOOL
AC_PROG_INSTALL
dnl Checks for typedefs, structures, and compiler characteristics.
AC_TYPE_SIZE_T
dnl Checks for header files.
AC_CHECK_HEADERS(string.h strings.h unistd.h stdint.h)
DEFAULT_INSTALL_PREFIX="/usr/local"
AC_ARG_ENABLE(debug,
[AS_HELP_STRING([--enable-debug],[enable debugging symbols and compile flags])
],
[
if test x"$enableval" = xyes ; then
debug="yes"
else
debug="no"
fi
]
)
if test x"$debug" = xyes ; then
AC_DEFINE([SPHINXCLIENT_DEBUG], [], [debug build])
if test x"$GCC" = xyes; then
dnl Remove any optimization flags from CFLAGS
changequote({,})
CFLAGS=`echo "$CFLAGS" | /usr/bin/sed -e 's/-O[0-9s]*//g'`
CFLAGS=`echo "$CFLAGS" | /usr/bin/sed -e 's/-g[0-2]\? //g'`
changequote([,])
CFLAGS="$CFLAGS -g3 -Wall -O0"
fi
dnl Do not strip symbols from developer object files.
INSTALL_STRIP_FLAG=""
else
dnl Make sure to strip symbols from non-developer object files.
INSTALL_STRIP_FLAG="-s"
fi
AC_SUBST(INSTALL_STRIP_FLAG)
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
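The two sed substitutions in the --enable-debug branch above strip any pre-existing optimization (`-O…`) and debug (`-g0`..`-g2`) flags from CFLAGS before appending `-g3 -Wall -O0`. Their effect can be checked in isolation (here plain `sed` stands in for the hard-coded `/usr/bin/sed`, and the starting CFLAGS value is an arbitrary example; note `\?` is a GNU sed extension):

```shell
#!/bin/sh
CFLAGS="-O2 -g -pipe"

# Same substitutions configure.in applies for a debug build:
CFLAGS=`echo "$CFLAGS" | sed -e 's/-O[0-9s]*//g'`
CFLAGS=`echo "$CFLAGS" | sed -e 's/-g[0-2]\? //g'`
CFLAGS="$CFLAGS -g3 -Wall -O0"

echo "$CFLAGS"
```

The `-O2` and bare `-g` are removed and the debug flags are appended, leaving `-pipe` untouched.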


@@ -0,0 +1,322 @@
#!/bin/sh
# install - install a program, script, or datafile
scriptversion=2004-09-10.20
# This originates from X11R5 (mit/util/scripts/install.sh), which was
# later released in X11R6 (xc/config/util/install.sh) with the
# following copyright and license.
#
# Copyright (C) 1994 X Consortium
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to
# deal in the Software without restriction, including without limitation the
# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
# sell copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# X CONSORTIUM BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN
# AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNEC-
# TION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#
# Except as contained in this notice, the name of the X Consortium shall not
# be used in advertising or otherwise to promote the sale, use or other deal-
# ings in this Software without prior written authorization from the X Consor-
# tium.
#
#
# FSF changes to this file are in the public domain.
#
# Calling this script install-sh is preferred over install.sh, to prevent
# `make' implicit rules from creating a file called install from it
# when there is no Makefile.
#
# This script is compatible with the BSD install script, but was written
# from scratch. It can only install one file at a time, a restriction
# shared with many OS's install programs.
# set DOITPROG to echo to test this script
# Don't use :- since 4.3BSD and earlier shells don't like it.
doit="${DOITPROG-}"
# put in absolute paths if you don't have them in your path; or use env. vars.
mvprog="${MVPROG-mv}"
cpprog="${CPPROG-cp}"
chmodprog="${CHMODPROG-chmod}"
chownprog="${CHOWNPROG-chown}"
chgrpprog="${CHGRPPROG-chgrp}"
stripprog="${STRIPPROG-strip}"
rmprog="${RMPROG-rm}"
mkdirprog="${MKDIRPROG-mkdir}"
chmodcmd="$chmodprog 0755"
chowncmd=
chgrpcmd=
stripcmd=
rmcmd="$rmprog -f"
mvcmd="$mvprog"
src=
dst=
dir_arg=
dstarg=
no_target_directory=
usage="Usage: $0 [OPTION]... [-T] SRCFILE DSTFILE
or: $0 [OPTION]... SRCFILES... DIRECTORY
or: $0 [OPTION]... -t DIRECTORY SRCFILES...
or: $0 [OPTION]... -d DIRECTORIES...
In the 1st form, copy SRCFILE to DSTFILE.
In the 2nd and 3rd, copy all SRCFILES to DIRECTORY.
In the 4th, create DIRECTORIES.
Options:
-c (ignored)
-d create directories instead of installing files.
-g GROUP $chgrpprog installed files to GROUP.
-m MODE $chmodprog installed files to MODE.
-o USER $chownprog installed files to USER.
-s $stripprog installed files.
-t DIRECTORY install into DIRECTORY.
-T report an error if DSTFILE is a directory.
--help display this help and exit.
--version display version info and exit.
Environment variables override the default commands:
CHGRPPROG CHMODPROG CHOWNPROG CPPROG MKDIRPROG MVPROG RMPROG STRIPPROG
"
while test -n "$1"; do
case $1 in
-c) shift
continue;;
-d) dir_arg=true
shift
continue;;
-g) chgrpcmd="$chgrpprog $2"
shift
shift
continue;;
--help) echo "$usage"; exit 0;;
-m) chmodcmd="$chmodprog $2"
shift
shift
continue;;
-o) chowncmd="$chownprog $2"
shift
shift
continue;;
-s) stripcmd=$stripprog
shift
continue;;
-t) dstarg=$2
shift
shift
continue;;
-T) no_target_directory=true
shift
continue;;
--version) echo "$0 $scriptversion"; exit 0;;
*) # When -d is used, all remaining arguments are directories to create.
# When -t is used, the destination is already specified.
test -n "$dir_arg$dstarg" && break
# Otherwise, the last argument is the destination. Remove it from $@.
for arg
do
if test -n "$dstarg"; then
# $@ is not empty: it contains at least $arg.
set fnord "$@" "$dstarg"
shift # fnord
fi
shift # arg
dstarg=$arg
done
break;;
esac
done
if test -z "$1"; then
if test -z "$dir_arg"; then
echo "$0: no input file specified." >&2
exit 1
fi
# It's OK to call `install-sh -d' without argument.
# This can happen when creating conditional directories.
exit 0
fi
for src
do
# Protect names starting with `-'.
case $src in
-*) src=./$src ;;
esac
if test -n "$dir_arg"; then
dst=$src
src=
if test -d "$dst"; then
mkdircmd=:
chmodcmd=
else
mkdircmd=$mkdirprog
fi
else
# Waiting for this to be detected by the "$cpprog $src $dsttmp" command
# might cause directories to be created, which would be especially bad
# if $src (and thus $dsttmp) contains '*'.
if test ! -f "$src" && test ! -d "$src"; then
echo "$0: $src does not exist." >&2
exit 1
fi
if test -z "$dstarg"; then
echo "$0: no destination specified." >&2
exit 1
fi
dst=$dstarg
# Protect names starting with `-'.
case $dst in
-*) dst=./$dst ;;
esac
# If destination is a directory, append the input filename; won't work
# if double slashes aren't ignored.
if test -d "$dst"; then
if test -n "$no_target_directory"; then
echo "$0: $dstarg: Is a directory" >&2
exit 1
fi
dst=$dst/`basename "$src"`
fi
fi
# This sed command emulates the dirname command.
dstdir=`echo "$dst" | sed -e 's,[^/]*$,,;s,/$,,;s,^$,.,'`
# Make sure that the destination directory exists.
# Skip lots of stat calls in the usual case.
if test ! -d "$dstdir"; then
defaultIFS='
'
IFS="${IFS-$defaultIFS}"
oIFS=$IFS
# Some sh's can't handle IFS=/ for some reason.
IFS='%'
set - `echo "$dstdir" | sed -e 's@/@%@g' -e 's@^%@/@'`
IFS=$oIFS
pathcomp=
while test $# -ne 0 ; do
pathcomp=$pathcomp$1
shift
if test ! -d "$pathcomp"; then
$mkdirprog "$pathcomp"
# mkdir can fail with a `File exist' error in case several
# install-sh are creating the directory concurrently. This
# is OK.
test -d "$pathcomp" || exit
fi
pathcomp=$pathcomp/
done
fi
if test -n "$dir_arg"; then
$doit $mkdircmd "$dst" \
&& { test -z "$chowncmd" || $doit $chowncmd "$dst"; } \
&& { test -z "$chgrpcmd" || $doit $chgrpcmd "$dst"; } \
&& { test -z "$stripcmd" || $doit $stripcmd "$dst"; } \
&& { test -z "$chmodcmd" || $doit $chmodcmd "$dst"; }
else
dstfile=`basename "$dst"`
# Make a couple of temp file names in the proper directory.
dsttmp=$dstdir/_inst.$$_
rmtmp=$dstdir/_rm.$$_
# Trap to clean up those temp files at exit.
trap 'ret=$?; rm -f "$dsttmp" "$rmtmp" && exit $ret' 0
trap '(exit $?); exit' 1 2 13 15
# Copy the file name to the temp name.
$doit $cpprog "$src" "$dsttmp" &&
# and set any options; do chmod last to preserve setuid bits.
#
# If any of these fail, we abort the whole thing. If we want to
# ignore errors from any of these, just make sure not to ignore
# errors from the above "$doit $cpprog $src $dsttmp" command.
#
{ test -z "$chowncmd" || $doit $chowncmd "$dsttmp"; } \
&& { test -z "$chgrpcmd" || $doit $chgrpcmd "$dsttmp"; } \
&& { test -z "$stripcmd" || $doit $stripcmd "$dsttmp"; } \
&& { test -z "$chmodcmd" || $doit $chmodcmd "$dsttmp"; } &&
# Now rename the file to the real destination.
{ $doit $mvcmd -f "$dsttmp" "$dstdir/$dstfile" 2>/dev/null \
|| {
# The rename failed, perhaps because mv can't rename something else
# to itself, or perhaps because mv is so ancient that it does not
# support -f.
# Now remove or move aside any old file at destination location.
# We try this two ways since rm can't unlink itself on some
# systems and the destination file might be busy for other
# reasons. In this case, the final cleanup might fail but the new
# file should still install successfully.
{
if test -f "$dstdir/$dstfile"; then
$doit $rmcmd -f "$dstdir/$dstfile" 2>/dev/null \
|| $doit $mvcmd -f "$dstdir/$dstfile" "$rmtmp" 2>/dev/null \
|| {
echo "$0: cannot unlink or rename $dstdir/$dstfile" >&2
(exit 1); exit
}
else
:
fi
} &&
# Now rename the file to the real destination.
$doit $mvcmd "$dsttmp" "$dstdir/$dstfile"
}
}
fi || { (exit 1); exit; }
done
# The final little trick to "correctly" pass the exit status to the exit trap.
{
(exit 0); exit
}
# Local variables:
# eval: (add-hook 'write-file-hooks 'time-stamp)
# time-stamp-start: "scriptversion="
# time-stamp-format: "%:y-%02m-%02d.%02H"
# time-stamp-end: "$"
# End:

<?xml version="1.0" encoding="windows-1251"?>
<VisualStudioProject
ProjectType="Visual C++"
Version="8,00"
Name="libsphinxclient"
ProjectGUID="{E0393ED6-FE6B-4803-8BFD-9D79EF21603A}"
RootNamespace="libsphinxclient"
Keyword="Win32Proj"
>
<Platforms>
<Platform
Name="Win32"
/>
</Platforms>
<ToolFiles>
</ToolFiles>
<Configurations>
<Configuration
Name="Debug|Win32"
OutputDirectory="$(SolutionDir)$(ConfigurationName)"
IntermediateDirectory="$(ConfigurationName)"
ConfigurationType="4"
CharacterSet="1"
>
<Tool
Name="VCPreBuildEventTool"
/>
<Tool
Name="VCCustomBuildTool"
/>
<Tool
Name="VCXMLDataGeneratorTool"
/>
<Tool
Name="VCWebServiceProxyGeneratorTool"
/>
<Tool
Name="VCMIDLTool"
/>
<Tool
Name="VCCLCompilerTool"
Optimization="0"
PreprocessorDefinitions="WIN32;_DEBUG;_LIB"
MinimalRebuild="true"
BasicRuntimeChecks="3"
RuntimeLibrary="3"
UsePrecompiledHeader="0"
WarningLevel="3"
Detect64BitPortabilityProblems="true"
DebugInformationFormat="4"
/>
<Tool
Name="VCManagedResourceCompilerTool"
/>
<Tool
Name="VCResourceCompilerTool"
/>
<Tool
Name="VCPreLinkEventTool"
/>
<Tool
Name="VCLibrarianTool"
/>
<Tool
Name="VCALinkTool"
/>
<Tool
Name="VCXDCMakeTool"
/>
<Tool
Name="VCBscMakeTool"
/>
<Tool
Name="VCFxCopTool"
/>
<Tool
Name="VCPostBuildEventTool"
/>
</Configuration>
<Configuration
Name="Release|Win32"
OutputDirectory="$(SolutionDir)$(ConfigurationName)"
IntermediateDirectory="$(ConfigurationName)"
ConfigurationType="4"
CharacterSet="1"
WholeProgramOptimization="1"
>
<Tool
Name="VCPreBuildEventTool"
/>
<Tool
Name="VCCustomBuildTool"
/>
<Tool
Name="VCXMLDataGeneratorTool"
/>
<Tool
Name="VCWebServiceProxyGeneratorTool"
/>
<Tool
Name="VCMIDLTool"
/>
<Tool
Name="VCCLCompilerTool"
WholeProgramOptimization="false"
PreprocessorDefinitions="WIN32;NDEBUG;_LIB"
RuntimeLibrary="2"
UsePrecompiledHeader="0"
WarningLevel="3"
Detect64BitPortabilityProblems="true"
DebugInformationFormat="3"
/>
<Tool
Name="VCManagedResourceCompilerTool"
/>
<Tool
Name="VCResourceCompilerTool"
/>
<Tool
Name="VCPreLinkEventTool"
/>
<Tool
Name="VCLibrarianTool"
/>
<Tool
Name="VCALinkTool"
/>
<Tool
Name="VCXDCMakeTool"
/>
<Tool
Name="VCBscMakeTool"
/>
<Tool
Name="VCFxCopTool"
/>
<Tool
Name="VCPostBuildEventTool"
/>
</Configuration>
</Configurations>
<References>
</References>
<Files>
<Filter
Name="Source Files"
Filter="cpp;c;cc;cxx;def;odl;idl;hpj;bat;asm;asmx"
UniqueIdentifier="{4FC737F1-C7A5-4376-A066-2A32D752A2FF}"
>
<File
RelativePath=".\sphinxclient.c"
>
</File>
</Filter>
<Filter
Name="Header Files"
Filter="h;hpp;hxx;hm;inl;inc;xsd"
UniqueIdentifier="{93995380-89BD-4b04-88EB-625FBE52EBFB}"
>
<File
RelativePath=".\sphinxclient.h"
>
</File>
</Filter>
<Filter
Name="Resource Files"
Filter="rc;ico;cur;bmp;dlg;rc2;rct;bin;rgs;gif;jpg;jpeg;jpe;resx;tiff;tif;png;wav"
UniqueIdentifier="{67DA6AB6-F800-4c08-8B7A-83BB121AAD01}"
>
</Filter>
</Files>
<Globals>
</Globals>
</VisualStudioProject>

#! /bin/sh
# Common stub for a few missing GNU programs while installing.
scriptversion=2004-09-07.08
# Copyright (C) 1996, 1997, 1999, 2000, 2002, 2003, 2004
# Free Software Foundation, Inc.
# Originally by François Pinard <pinard@iro.umontreal.ca>, 1996.
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2, or (at your option)
# any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA
# 02111-1307, USA.
# As a special exception to the GNU General Public License, if you
# distribute this file as part of a program that contains a
# configuration script generated by Autoconf, you may include it under
# the same distribution terms that you use for the rest of that program.
if test $# -eq 0; then
echo 1>&2 "Try \`$0 --help' for more information"
exit 1
fi
run=:
# In the cases where this matters, `missing' is being run in the
# srcdir already.
if test -f configure.ac; then
configure_ac=configure.ac
else
configure_ac=configure.in
fi
msg="missing on your system"
case "$1" in
--run)
# Try to run requested program, and just exit if it succeeds.
run=
shift
"$@" && exit 0
    # Exit code 63 means version mismatch.  This often happens
    # when the user tries to use an ancient version of a tool on
    # a file that requires a minimum version.  In this case we
    # should proceed as if the program had been absent, or as
    # if --run hadn't been passed.
if test $? = 63; then
run=:
msg="probably too old"
fi
;;
-h|--h|--he|--hel|--help)
echo "\
$0 [OPTION]... PROGRAM [ARGUMENT]...
Handle \`PROGRAM [ARGUMENT]...' for when PROGRAM is missing, or return an
error status if there is no known handling for PROGRAM.
Options:
-h, --help display this help and exit
-v, --version output version information and exit
--run try to run the given command, and emulate it if it fails
Supported PROGRAM values:
aclocal touch file \`aclocal.m4'
autoconf touch file \`configure'
autoheader touch file \`config.h.in'
automake touch all \`Makefile.in' files
bison create \`y.tab.[ch]', if possible, from existing .[ch]
flex create \`lex.yy.c', if possible, from existing .c
help2man touch the output file
lex create \`lex.yy.c', if possible, from existing .c
makeinfo touch the output file
tar try tar, gnutar, gtar, then tar without non-portable flags
yacc create \`y.tab.[ch]', if possible, from existing .[ch]
Send bug reports to <bug-automake@gnu.org>."
exit 0
;;
-v|--v|--ve|--ver|--vers|--versi|--versio|--version)
echo "missing $scriptversion (GNU Automake)"
exit 0
;;
-*)
echo 1>&2 "$0: Unknown \`$1' option"
echo 1>&2 "Try \`$0 --help' for more information"
exit 1
;;
esac
# Now exit if we have it, but it failed. Also exit now if we
# don't have it and --version was passed (most likely to detect
# the program).
case "$1" in
lex|yacc)
# Not GNU programs, they don't have --version.
;;
tar)
if test -n "$run"; then
echo 1>&2 "ERROR: \`tar' requires --run"
exit 1
elif test "x$2" = "x--version" || test "x$2" = "x--help"; then
exit 1
fi
;;
*)
if test -z "$run" && ($1 --version) > /dev/null 2>&1; then
# We have it, but it failed.
exit 1
elif test "x$2" = "x--version" || test "x$2" = "x--help"; then
# Could not run --version or --help. This is probably someone
# running `$TOOL --version' or `$TOOL --help' to check whether
# $TOOL exists and not knowing $TOOL uses missing.
exit 1
fi
;;
esac
# If it does not exist, or fails to run (possibly an outdated version),
# try to emulate it.
case "$1" in
aclocal*)
echo 1>&2 "\
WARNING: \`$1' is $msg. You should only need it if
you modified \`acinclude.m4' or \`${configure_ac}'. You might want
to install the \`Automake' and \`Perl' packages. Grab them from
any GNU archive site."
touch aclocal.m4
;;
autoconf)
echo 1>&2 "\
WARNING: \`$1' is $msg. You should only need it if
you modified \`${configure_ac}'. You might want to install the
\`Autoconf' and \`GNU m4' packages. Grab them from any GNU
archive site."
touch configure
;;
autoheader)
echo 1>&2 "\
WARNING: \`$1' is $msg. You should only need it if
you modified \`acconfig.h' or \`${configure_ac}'. You might want
to install the \`Autoconf' and \`GNU m4' packages. Grab them
from any GNU archive site."
files=`sed -n 's/^[ ]*A[CM]_CONFIG_HEADER(\([^)]*\)).*/\1/p' ${configure_ac}`
test -z "$files" && files="config.h"
touch_files=
for f in $files; do
case "$f" in
*:*) touch_files="$touch_files "`echo "$f" |
sed -e 's/^[^:]*://' -e 's/:.*//'`;;
*) touch_files="$touch_files $f.in";;
esac
done
touch $touch_files
;;
automake*)
echo 1>&2 "\
WARNING: \`$1' is $msg. You should only need it if
you modified \`Makefile.am', \`acinclude.m4' or \`${configure_ac}'.
You might want to install the \`Automake' and \`Perl' packages.
Grab them from any GNU archive site."
find . -type f -name Makefile.am -print |
sed 's/\.am$/.in/' |
while read f; do touch "$f"; done
;;
autom4te)
echo 1>&2 "\
WARNING: \`$1' is needed, but is $msg.
You might have modified some files without having the
proper tools for further handling them.
You can get \`$1' as part of \`Autoconf' from any GNU
archive site."
file=`echo "$*" | sed -n 's/.*--output[ =]*\([^ ]*\).*/\1/p'`
test -z "$file" && file=`echo "$*" | sed -n 's/.*-o[ ]*\([^ ]*\).*/\1/p'`
if test -f "$file"; then
touch $file
else
test -z "$file" || exec >$file
echo "#! /bin/sh"
echo "# Created by GNU Automake missing as a replacement of"
echo "# $ $@"
echo "exit 0"
chmod +x $file
exit 1
fi
;;
bison|yacc)
echo 1>&2 "\
WARNING: \`$1' is $msg.  You should only need it if
you modified a \`.y' file. You may need the \`Bison' package
in order for those modifications to take effect. You can get
\`Bison' from any GNU archive site."
rm -f y.tab.c y.tab.h
if [ $# -ne 1 ]; then
eval LASTARG="\${$#}"
case "$LASTARG" in
*.y)
SRCFILE=`echo "$LASTARG" | sed 's/y$/c/'`
if [ -f "$SRCFILE" ]; then
cp "$SRCFILE" y.tab.c
fi
SRCFILE=`echo "$LASTARG" | sed 's/y$/h/'`
if [ -f "$SRCFILE" ]; then
cp "$SRCFILE" y.tab.h
fi
;;
esac
fi
if [ ! -f y.tab.h ]; then
echo >y.tab.h
fi
if [ ! -f y.tab.c ]; then
echo 'main() { return 0; }' >y.tab.c
fi
;;
lex|flex)
echo 1>&2 "\
WARNING: \`$1' is $msg. You should only need it if
you modified a \`.l' file. You may need the \`Flex' package
in order for those modifications to take effect. You can get
\`Flex' from any GNU archive site."
rm -f lex.yy.c
if [ $# -ne 1 ]; then
eval LASTARG="\${$#}"
case "$LASTARG" in
*.l)
SRCFILE=`echo "$LASTARG" | sed 's/l$/c/'`
if [ -f "$SRCFILE" ]; then
cp "$SRCFILE" lex.yy.c
fi
;;
esac
fi
if [ ! -f lex.yy.c ]; then
echo 'main() { return 0; }' >lex.yy.c
fi
;;
help2man)
echo 1>&2 "\
WARNING: \`$1' is $msg. You should only need it if
you modified a dependency of a manual page. You may need the
\`Help2man' package in order for those modifications to take
effect. You can get \`Help2man' from any GNU archive site."
file=`echo "$*" | sed -n 's/.*-o \([^ ]*\).*/\1/p'`
if test -z "$file"; then
file=`echo "$*" | sed -n 's/.*--output=\([^ ]*\).*/\1/p'`
fi
if [ -f "$file" ]; then
touch $file
else
test -z "$file" || exec >$file
echo ".ab help2man is required to generate this page"
exit 1
fi
;;
makeinfo)
echo 1>&2 "\
WARNING: \`$1' is $msg. You should only need it if
you modified a \`.texi' or \`.texinfo' file, or any other file
indirectly affecting the aspect of the manual. The spurious
call might also be the consequence of using a buggy \`make' (AIX,
DU, IRIX). You might want to install the \`Texinfo' package or
the \`GNU make' package. Grab either from any GNU archive site."
file=`echo "$*" | sed -n 's/.*-o \([^ ]*\).*/\1/p'`
if test -z "$file"; then
file=`echo "$*" | sed 's/.* \([^ ]*\) *$/\1/'`
file=`sed -n '/^@setfilename/ { s/.* \([^ ]*\) *$/\1/; p; q; }' $file`
fi
touch $file
;;
tar)
shift
# We have already tried tar in the generic part.
# Look for gnutar/gtar before invocation to avoid ugly error
# messages.
if (gnutar --version > /dev/null 2>&1); then
gnutar "$@" && exit 0
fi
if (gtar --version > /dev/null 2>&1); then
gtar "$@" && exit 0
fi
firstarg="$1"
if shift; then
case "$firstarg" in
*o*)
firstarg=`echo "$firstarg" | sed s/o//`
tar "$firstarg" "$@" && exit 0
;;
esac
case "$firstarg" in
*h*)
firstarg=`echo "$firstarg" | sed s/h//`
tar "$firstarg" "$@" && exit 0
;;
esac
fi
echo 1>&2 "\
WARNING: I can't seem to be able to run \`tar' with the given arguments.
You may want to install GNU tar or Free paxutils, or check the
command line arguments."
exit 1
;;
*)
echo 1>&2 "\
WARNING: \`$1' is needed, and is $msg.
You might have modified some files without having the
proper tools for further handling them. Check the \`README' file,
it often tells you about the needed prerequisites for installing
this package. You may also peek at any GNU archive site, in case
some other package would contain this missing \`$1' program."
exit 1
;;
esac
exit 0
# Local variables:
# eval: (add-hook 'write-file-hooks 'time-stamp)
# time-stamp-start: "scriptversion="
# time-stamp-format: "%:y-%02m-%02d.%02H"
# time-stamp-end: "$"
# End:

exact_phrase=0
n=1, res=this is my <b>test</b> <b>text</b> to be highlighted ...
n=2, res=another <b>test</b> <b>text</b> to be highlighted, below limit
n=3, res=<b>test</b> number three, without phrase match
n=4, res=final <b>test</b>, not only without ... with swapped phrase <b>text</b> <b>test</b> as well
exact_phrase=1
n=1, res=this is my <b>test text</b> to be highlighted ...
n=2, res=another <b>test text</b> to be highlighted, below limit
n=3, res=test number three, without phrase match
n=4, res=final test, not only without phrase match, but also above ...
passage_boundary=zone
n=1, res= ... manager <b>it</b>. <b>Is</b> Filing this report and. <b>It</b> <b>is</b> signed hereby represent. That <b>it</b> <b>is</b> all information.are <b>It</b> or <b>is</b> cool <b>It</b> <b>is</b> cooler <b>It</b> <b>is</b> another ...
passage_boundary=sentence
n=1, res= ... The institutional investment manager <b>it</b>. ... <b>Is</b> Filing this report and. ... <b>It</b> <b>is</b> signed hereby represent. ... That <b>it</b> <b>is</b> all information.are <b>It</b> or <b>is</b> cool <b>It</b> <b>is</b> cooler <b>It</b> <b>is</b> another place! ...
build_keywords result:
1. tokenized=hello, normalized=hello, docs=0, hits=0
2. tokenized=test, normalized=test, docs=3, hits=5
3. tokenized=one, normalized=one, docs=1, hits=2
Query 'is' retrieved 4 of 4 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7)
2. doc_id=2, weight=1304, idd=2, group_id=1, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6)
3. doc_id=3, weight=1304, idd=3, group_id=2, tag=(15), tag64=(15), tag2=(15)
4. doc_id=4, weight=1304, idd=4, group_id=2, tag=(7,40), tag64=(7,40), tag2=(7,40)
Query 'is test' retrieved 3 of 3 matches.
Query stats:
'is' found 4 times in 4 documents
'test' found 5 times in 3 documents
Matches:
1. doc_id=1, weight=101362, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7)
2. doc_id=2, weight=101362, idd=2, group_id=1, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6)
3. doc_id=4, weight=1373, idd=4, group_id=2, tag=(7,40), tag64=(7,40), tag2=(7,40)
Query 'test number' retrieved 3 of 3 matches.
Query stats:
'test' found 5 times in 3 documents
'number' found 3 times in 3 documents
Matches:
1. doc_id=4, weight=101442, idd=4, group_id=2, tag=(7,40), tag64=(7,40), tag2=(7,40)
2. doc_id=1, weight=101432, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7)
3. doc_id=2, weight=101432, idd=2, group_id=1, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6)
Query 'is' retrieved 2 of 2 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), @groupby=1, @count=2
2. doc_id=3, weight=1304, idd=3, group_id=2, tag=(15), tag64=(15), tag2=(15), @groupby=2, @count=2
Query 'is' retrieved 9 of 9 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), @groupby=1, @count=1
2. doc_id=2, weight=1304, idd=2, group_id=1, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6), @groupby=2, @count=1
3. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), @groupby=3, @count=1
4. doc_id=2, weight=1304, idd=2, group_id=1, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6), @groupby=4, @count=1
5. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), @groupby=5, @count=1
6. doc_id=2, weight=1304, idd=2, group_id=1, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6), @groupby=6, @count=1
7. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), @groupby=7, @count=2
8. doc_id=3, weight=1304, idd=3, group_id=2, tag=(15), tag64=(15), tag2=(15), @groupby=15, @count=1
9. doc_id=4, weight=1304, idd=4, group_id=2, tag=(7,40), tag64=(7,40), tag2=(7,40), @groupby=40, @count=1
Query 'is' retrieved 2 of 2 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7)
2. doc_id=2, weight=1304, idd=2, group_id=1, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6)
Query 'is' retrieved 2 of 2 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7)
2. doc_id=4, weight=1304, idd=4, group_id=2, tag=(7,40), tag64=(7,40), tag2=(7,40)
update success, 1 rows updated
update mva success, 1 rows updated
Query 'is' retrieved 4 of 4 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=4, weight=1304, idd=4, group_id=2, tag=(7,40), tag64=(7,40), tag2=(7,40)
2. doc_id=3, weight=1304, idd=3, group_id=2, tag=(7,77,177), tag64=(15), tag2=(15)
3. doc_id=2, weight=1304, idd=2, group_id=123, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6)
4. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7)
update success, 1 rows updated
update success, 1 rows updated
Query 'is' retrieved 4 of 4 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7)
2. doc_id=2, weight=1304, idd=2, group_id=123, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6)
3. doc_id=3, weight=1304, idd=3, group_id=123, tag=(7,77,177), tag64=(15), tag2=(15)
4. doc_id=4, weight=1304, idd=4, group_id=123, tag=(7,40), tag64=(7,40), tag2=(7,40)
Query 'is' retrieved 2 of 2 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), @groupby=1, @count=1
2. doc_id=2, weight=1304, idd=2, group_id=123, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6), @groupby=123, @count=3
Query 'is' retrieved 10 of 10 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), @groupby=1, @count=1
2. doc_id=2, weight=1304, idd=2, group_id=123, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6), @groupby=2, @count=1
3. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), @groupby=3, @count=1
4. doc_id=2, weight=1304, idd=2, group_id=123, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6), @groupby=4, @count=1
5. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), @groupby=5, @count=1
6. doc_id=2, weight=1304, idd=2, group_id=123, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6), @groupby=6, @count=1
7. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), @groupby=7, @count=3
8. doc_id=4, weight=1304, idd=4, group_id=123, tag=(7,40), tag64=(7,40), tag2=(7,40), @groupby=40, @count=1
9. doc_id=3, weight=1304, idd=3, group_id=123, tag=(7,77,177), tag64=(15), tag2=(15), @groupby=77, @count=1
10. doc_id=3, weight=1304, idd=3, group_id=123, tag=(7,77,177), tag64=(15), tag2=(15), @groupby=177, @count=1
Query 'is' retrieved 1 of 1 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7)
Query 'is' retrieved 3 of 3 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7)
2. doc_id=3, weight=1304, idd=3, group_id=123, tag=(7,77,177), tag64=(15), tag2=(15)
3. doc_id=4, weight=1304, idd=4, group_id=123, tag=(7,40), tag64=(7,40), tag2=(7,40)
Query 'is' retrieved 4 of 4 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, group_id=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), q=1010
2. doc_id=2, weight=1304, idd=2, group_id=123, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6), q=123020
3. doc_id=3, weight=1304, idd=3, group_id=123, tag=(7,77,177), tag64=(15), tag2=(15), q=123030
4. doc_id=4, weight=1304, idd=4, group_id=123, tag=(7,40), tag64=(7,40), tag2=(7,40), q=123040
Query 'is' retrieved 4 of 4 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), group_id=1, q=1010
2. doc_id=2, weight=1304, idd=2, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6), group_id=2000, q=2000020
3. doc_id=3, weight=1304, idd=3, tag=(7,77,177), tag64=(15), tag2=(15), group_id=123, q=123030
4. doc_id=4, weight=1304, idd=4, tag=(7,40), tag64=(7,40), tag2=(7,40), group_id=123, q=123040
Query 'is' retrieved 3 of 3 matches.
Query stats:
'is' found 4 times in 4 documents
Matches:
1. doc_id=1, weight=1304, idd=1, tag=(1,3,5,7), tag64=(1,3,5,7), tag2=(1,3,5,7), group_id=1, q=1010, @groupby=1, @count=1
2. doc_id=3, weight=1304, idd=3, tag=(7,77,177), tag64=(15), tag2=(15), group_id=123, q=123030, @groupby=123, @count=2
3. doc_id=2, weight=1304, idd=2, tag=(2,4,6), tag64=(2,4,6), tag2=(2,4,6), group_id=2000, q=2000020, @groupby=2000, @count=1
connections: 17
maxed_out: 0
command_search: 16
command_excerpt: 4
command_update: 4
command_keywords: 1
command_persist: 1
command_status: 1
command_flushattrs: 0
agent_connect: 0
agent_retry: 0
queries: 16
dist_queries: 0
query_cpu: OFF
dist_local: 0.000
dist_wait: 0.000
query_reads: OFF
query_readkb: OFF
avg_query_cpu: OFF
avg_dist_local: 0.000
avg_dist_wait: 0.000
avg_query_reads: OFF
avg_query_readkb: OFF

source src1
{
type = mysql
sql_host = localhost
sql_user = test
sql_pass =
sql_db = test
sql_port = 3306 # optional, default is 3306
sql_query = SELECT id, id as idd, group_id, title, content FROM documents
sql_attr_uint = group_id
sql_attr_uint = idd
sql_attr_multi = uint tag from query; SELECT docid, tagid FROM tags
sql_attr_multi = bigint tag64 from query; SELECT docid, tagid FROM tags
sql_attr_multi = uint tag2 from query; SELECT docid, tagid FROM tags
}
index test1
{
source = src1
path = ../../test/data/test1
docinfo = extern
charset_type = utf-8
}
indexer
{
mem_limit = 32M
}
searchd
{
listen = 10312
listen = 10306:mysql41
read_timeout = 5
max_children = 30
pid_file = searchd.pid
log = ../../test/searchd.log
query_log = ../../test/query.log
max_matches = 1000
workers = threads # for RT to work
binlog_path =
}

#!/bin/sh
FAILLOG="/tmp/faillog1"
DIFF='smoke_diff.txt'
RES='smoke_test.txt'
REF='smoke_ref.txt'
LINE='-----------------------------\n'
die()
{
cat $FAILLOG
echo $LINE
[ ! "z$2" = "z" ] && { eval $2; echo "$LINE"; }
echo "C API:$1"
[ -e "$FAILLOG" ] && rm $FAILLOG
exit 1
}
cmd ()
{
echo "Executing: $1\n">$FAILLOG
eval $1 1>>$FAILLOG 2>&1 || die "$2" "$3"
}
cmd "./configure --with-debug" "configure failed"
cmd "make clean" "make clean failed"
cmd "make" "make failed"
cmd "../../src/indexer -c smoke_test.conf --all" "indexing failed"
cmd "../../src/searchd -c smoke_test.conf --test" "searchd start failed"
cmd "./test --smoke --port 10312>$RES" "test --smoke --port 10312 failed"
cmd "../../src/searchd -c smoke_test.conf --stop" "searchd stop failed"
cmd "make clean" " "
cmd "diff --unified=3 $REF $RES >$DIFF" 'diff failed' "cat $DIFF"
rm $RES
rm $DIFF
rm $FAILLOG
echo "all ok"
exit 0

//
// $Id$
//
//
// Copyright (c) 2001-2011, Andrew Aksyonoff
// Copyright (c) 2008-2011, Sphinx Technologies Inc
// All rights reserved
//
// This program is free software; you can redistribute it and/or modify
// it under the terms of the GNU Library General Public License. You should
// have received a copy of the LGPL license along with this program; if you
// did not, you can find it at http://www.gnu.org/
//
#ifndef _sphinxclient_
#define _sphinxclient_
#ifdef __cplusplus
extern "C" {
#endif
/// known searchd status codes
enum
{
SEARCHD_OK = 0,
SEARCHD_ERROR = 1,
SEARCHD_RETRY = 2,
SEARCHD_WARNING = 3
};
/// known match modes
enum
{
SPH_MATCH_ALL = 0,
SPH_MATCH_ANY = 1,
SPH_MATCH_PHRASE = 2,
SPH_MATCH_BOOLEAN = 3,
SPH_MATCH_EXTENDED = 4,
SPH_MATCH_FULLSCAN = 5,
SPH_MATCH_EXTENDED2 = 6
};
/// known ranking modes (ext2 only)
enum
{
SPH_RANK_PROXIMITY_BM25 = 0,
SPH_RANK_BM25 = 1,
SPH_RANK_NONE = 2,
SPH_RANK_WORDCOUNT = 3,
SPH_RANK_PROXIMITY = 4,
SPH_RANK_MATCHANY = 5,
SPH_RANK_FIELDMASK = 6,
SPH_RANK_SPH04 = 7,
SPH_RANK_DEFAULT = SPH_RANK_PROXIMITY_BM25
};
/// known sort modes
enum
{
SPH_SORT_RELEVANCE = 0,
SPH_SORT_ATTR_DESC = 1,
SPH_SORT_ATTR_ASC = 2,
SPH_SORT_TIME_SEGMENTS = 3,
SPH_SORT_EXTENDED = 4,
SPH_SORT_EXPR = 5
};
/// known filter types
enum
{ SPH_FILTER_VALUES = 0,
SPH_FILTER_RANGE = 1,
SPH_FILTER_FLOATRANGE = 2
};
/// known attribute types
enum
{
SPH_ATTR_INTEGER = 1,
SPH_ATTR_TIMESTAMP = 2,
SPH_ATTR_ORDINAL = 3,
SPH_ATTR_BOOL = 4,
SPH_ATTR_FLOAT = 5,
SPH_ATTR_BIGINT = 6,
SPH_ATTR_STRING = 7,
SPH_ATTR_MULTI = 0x40000001UL,
SPH_ATTR_MULTI64 = 0x40000002UL
};
/// known grouping functions
enum
{ SPH_GROUPBY_DAY = 0,
SPH_GROUPBY_WEEK = 1,
SPH_GROUPBY_MONTH = 2,
SPH_GROUPBY_YEAR = 3,
SPH_GROUPBY_ATTR = 4,
SPH_GROUPBY_ATTRPAIR = 5
};
//////////////////////////////////////////////////////////////////////////
#if defined(_MSC_VER)
typedef __int64 sphinx_int64_t;
typedef unsigned __int64 sphinx_uint64_t;
#else // !defined(_MSC_VER)
typedef long long sphinx_int64_t;
typedef unsigned long long sphinx_uint64_t;
#endif // !defined(_MSC_VER)
typedef int sphinx_bool;
#define SPH_TRUE 1
#define SPH_FALSE 0
//////////////////////////////////////////////////////////////////////////
typedef struct st_sphinx_client sphinx_client;
typedef struct st_sphinx_wordinfo
{
const char * word;
int docs;
int hits;
} sphinx_wordinfo;
typedef struct st_sphinx_result
{
const char * error;
const char * warning;
int status;
int num_fields;
char ** fields;
int num_attrs;
char ** attr_names;
int * attr_types;
int num_matches;
void * values_pool;
int total;
int total_found;
int time_msec;
int num_words;
sphinx_wordinfo * words;
} sphinx_result;
typedef struct st_sphinx_excerpt_options
{
const char * before_match;
const char * after_match;
const char * chunk_separator;
const char * html_strip_mode;
const char * passage_boundary;
int limit;
int limit_passages;
int limit_words;
int around;
int start_passage_id;
sphinx_bool exact_phrase;
sphinx_bool single_passage;
sphinx_bool use_boundaries;
sphinx_bool weight_order;
sphinx_bool query_mode;
sphinx_bool force_all_words;
sphinx_bool load_files;
sphinx_bool allow_empty;
sphinx_bool emit_zones;
} sphinx_excerpt_options;
typedef struct st_sphinx_keyword_info
{
char * tokenized;
char * normalized;
int num_docs;
int num_hits;
} sphinx_keyword_info;
//////////////////////////////////////////////////////////////////////////
sphinx_client * sphinx_create ( sphinx_bool copy_args );
void sphinx_cleanup ( sphinx_client * client );
void sphinx_destroy ( sphinx_client * client );
const char * sphinx_error ( sphinx_client * client );
const char * sphinx_warning ( sphinx_client * client );
sphinx_bool sphinx_set_server ( sphinx_client * client, const char * host, int port );
sphinx_bool sphinx_set_connect_timeout ( sphinx_client * client, float seconds );
sphinx_bool sphinx_open ( sphinx_client * client );
sphinx_bool sphinx_close ( sphinx_client * client );
sphinx_bool sphinx_set_limits ( sphinx_client * client, int offset, int limit, int max_matches, int cutoff );
sphinx_bool sphinx_set_max_query_time ( sphinx_client * client, int max_query_time );
sphinx_bool sphinx_set_match_mode ( sphinx_client * client, int mode );
sphinx_bool sphinx_set_ranking_mode ( sphinx_client * client, int ranker );
sphinx_bool sphinx_set_sort_mode ( sphinx_client * client, int mode, const char * sortby );
sphinx_bool sphinx_set_field_weights ( sphinx_client * client, int num_weights, const char ** field_names, const int * field_weights );
sphinx_bool sphinx_set_index_weights ( sphinx_client * client, int num_weights, const char ** index_names, const int * index_weights );
sphinx_bool sphinx_set_id_range ( sphinx_client * client, sphinx_uint64_t minid, sphinx_uint64_t maxid );
sphinx_bool sphinx_add_filter ( sphinx_client * client, const char * attr, int num_values, const sphinx_int64_t * values, sphinx_bool exclude );
sphinx_bool sphinx_add_filter_range ( sphinx_client * client, const char * attr, sphinx_int64_t umin, sphinx_int64_t umax, sphinx_bool exclude );
sphinx_bool sphinx_add_filter_float_range ( sphinx_client * client, const char * attr, float fmin, float fmax, sphinx_bool exclude );
sphinx_bool sphinx_set_geoanchor ( sphinx_client * client, const char * attr_latitude, const char * attr_longitude, float latitude, float longitude );
sphinx_bool sphinx_set_groupby ( sphinx_client * client, const char * attr, int groupby_func, const char * group_sort );
sphinx_bool sphinx_set_groupby_distinct ( sphinx_client * client, const char * attr );
sphinx_bool sphinx_set_retries ( sphinx_client * client, int count, int delay );
sphinx_bool sphinx_add_override ( sphinx_client * client, const char * attr, const sphinx_uint64_t * docids, int num_values, const unsigned int * values );
sphinx_bool sphinx_set_select ( sphinx_client * client, const char * select_list );
void sphinx_reset_filters ( sphinx_client * client );
void sphinx_reset_groupby ( sphinx_client * client );
sphinx_result * sphinx_query ( sphinx_client * client, const char * query, const char * index_list, const char * comment );
int sphinx_add_query ( sphinx_client * client, const char * query, const char * index_list, const char * comment );
sphinx_result * sphinx_run_queries ( sphinx_client * client );
int sphinx_get_num_results ( sphinx_client * client );
sphinx_uint64_t sphinx_get_id ( sphinx_result * result, int match );
int sphinx_get_weight ( sphinx_result * result, int match );
sphinx_int64_t sphinx_get_int ( sphinx_result * result, int match, int attr );
float sphinx_get_float ( sphinx_result * result, int match, int attr );
unsigned int * sphinx_get_mva ( sphinx_result * result, int match, int attr );
sphinx_uint64_t sphinx_get_mva64_value ( unsigned int * mva, int i );
const char * sphinx_get_string ( sphinx_result * result, int match, int attr );
void sphinx_init_excerpt_options ( sphinx_excerpt_options * opts );
char ** sphinx_build_excerpts ( sphinx_client * client, int num_docs, const char ** docs, const char * index, const char * words, sphinx_excerpt_options * opts );
int sphinx_update_attributes ( sphinx_client * client, const char * index, int num_attrs, const char ** attrs, int num_docs, const sphinx_uint64_t * docids, const sphinx_int64_t * values );
int sphinx_update_attributes_mva ( sphinx_client * client, const char * index, const char * attr, sphinx_uint64_t docid, int num_values, const unsigned int * values );
sphinx_keyword_info * sphinx_build_keywords ( sphinx_client * client, const char * query, const char * index, sphinx_bool hits, int * out_num_keywords );
char ** sphinx_status ( sphinx_client * client, int * num_rows, int * num_cols );
void sphinx_status_destroy ( char ** status, int num_rows, int num_cols );
/////////////////////////////////////////////////////////////////////////////
#ifdef __cplusplus
}
#endif
#endif // _sphinxclient_
//
// $Id$
//

@ -0,0 +1,55 @@
/* sphinxclient_config.h.in. Generated from configure.in by autoheader. */
/* Define to 1 if you have the <dlfcn.h> header file. */
#undef HAVE_DLFCN_H
/* Define to 1 if you have the <inttypes.h> header file. */
#undef HAVE_INTTYPES_H
/* Define to 1 if you have the <memory.h> header file. */
#undef HAVE_MEMORY_H
/* Define to 1 if you have the <stdint.h> header file. */
#undef HAVE_STDINT_H
/* Define to 1 if you have the <stdlib.h> header file. */
#undef HAVE_STDLIB_H
/* Define to 1 if you have the <strings.h> header file. */
#undef HAVE_STRINGS_H
/* Define to 1 if you have the <string.h> header file. */
#undef HAVE_STRING_H
/* Define to 1 if you have the <sys/stat.h> header file. */
#undef HAVE_SYS_STAT_H
/* Define to 1 if you have the <sys/types.h> header file. */
#undef HAVE_SYS_TYPES_H
/* Define to 1 if you have the <unistd.h> header file. */
#undef HAVE_UNISTD_H
/* Define to the address where bug reports for this package should be sent. */
#undef PACKAGE_BUGREPORT
/* Define to the full name of this package. */
#undef PACKAGE_NAME
/* Define to the full name and version of this package. */
#undef PACKAGE_STRING
/* Define to the one symbol short name of this package. */
#undef PACKAGE_TARNAME
/* Define to the version of this package. */
#undef PACKAGE_VERSION
/* debug build */
#undef SPHINXCLIENT_DEBUG
/* Define to 1 if you have the ANSI C header files. */
#undef STDC_HEADERS
/* Define to `unsigned' if <sys/types.h> does not define. */
#undef size_t

@ -0,0 +1,455 @@
//
// $Id$
//
//
// Copyright (c) 2001-2011, Andrew Aksyonoff
// Copyright (c) 2008-2011, Sphinx Technologies Inc
// All rights reserved
//
// This program is free software; you can redistribute it and/or modify
// it under the terms of the GNU Library General Public License. You should
// have received a copy of the LGPL license along with this program; if you
// did not, you can find it at http://www.gnu.org/
//
#include <stdarg.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h> // for strcmp() and strstr() used below
#if _WIN32
#include <winsock2.h>
#endif
#include "sphinxclient.h"
static sphinx_bool g_smoke = SPH_FALSE;
static int g_failed = 0;
void die ( const char * template, ... )
{
va_list ap;
va_start ( ap, template );
printf ( "FATAL: " );
vprintf ( template, ap );
printf ( "\n" );
va_end ( ap );
exit ( 1 );
}
void net_init ()
{
#if _WIN32
// init WSA on Windows
WSADATA wsa_data;
int wsa_startup_err;
wsa_startup_err = WSAStartup ( WINSOCK_VERSION, &wsa_data );
if ( wsa_startup_err )
die ( "failed to initialize WinSock2: error %d", wsa_startup_err );
#endif
}
void test_query ( sphinx_client * client, const char * query )
{
sphinx_result * res;
const char *index;
int i, j, k, mva_len;
unsigned int * mva;
const char * field_names[2];
int field_weights[2];
index = "test1";
field_names[0] = "title";
field_names[1] = "content";
field_weights[0] = 100;
field_weights[1] = 1;
sphinx_set_field_weights ( client, 2, field_names, field_weights );
field_weights[0] = 1;
field_weights[1] = 1;
res = sphinx_query ( client, query, index, NULL );
if ( !res )
{
g_failed++;
if ( !g_smoke )
die ( "query failed: %s", sphinx_error(client) );
return; // res is NULL; bail out before it gets dereferenced below
}
if ( g_smoke )
printf ( "Query '%s' retrieved %d of %d matches.\n", query, res->total, res->total_found );
else
printf ( "Query '%s' retrieved %d of %d matches in %d.%03d sec.\n",
query, res->total, res->total_found, res->time_msec/1000, res->time_msec%1000 );
printf ( "Query stats:\n" );
for ( i=0; i<res->num_words; i++ )
printf ( "\t'%s' found %d times in %d documents\n",
res->words[i].word, res->words[i].hits, res->words[i].docs );
printf ( "\nMatches:\n" );
for ( i=0; i<res->num_matches; i++ )
{
printf ( "%d. doc_id=%d, weight=%d", 1+i,
(int)sphinx_get_id ( res, i ), sphinx_get_weight ( res, i ) );
for ( j=0; j<res->num_attrs; j++ )
{
printf ( ", %s=", res->attr_names[j] );
switch ( res->attr_types[j] )
{
case SPH_ATTR_MULTI64:
case SPH_ATTR_MULTI:
mva = sphinx_get_mva ( res, i, j );
mva_len = *mva++;
printf ( "(" );
for ( k=0; k<mva_len; k++ )
printf ( k ? ",%u" : "%u", ( res->attr_types[j]==SPH_ATTR_MULTI ? mva[k] : (unsigned int)sphinx_get_mva64_value ( mva, k ) ) );
printf ( ")" );
break;
case SPH_ATTR_FLOAT: printf ( "%f", sphinx_get_float ( res, i, j ) ); break;
case SPH_ATTR_STRING: printf ( "%s", sphinx_get_string ( res, i, j ) ); break;
default: printf ( "%u", (unsigned int)sphinx_get_int ( res, i, j ) ); break;
}
}
printf ( "\n" );
}
printf ( "\n" );
}
void test_excerpt ( sphinx_client * client )
{
const char * docs[] =
{
"this is my test text to be highlighted, and for the sake of the testing we need to pump its length somewhat",
"another test text to be highlighted, below limit",
"test number three, without phrase match",
"final test, not only without phrase match, but also above limit and with swapped phrase text test as well"
};
const int ndocs = sizeof(docs)/sizeof(docs[0]);
const char * words = "test text";
const char * index = "test1";
sphinx_excerpt_options opts;
char ** res;
int i, j;
sphinx_init_excerpt_options ( &opts );
opts.limit = 60;
opts.around = 3;
opts.allow_empty = SPH_TRUE;
for ( j=0; j<2; j++ )
{
opts.exact_phrase = j;
printf ( "exact_phrase=%d\n", j );
res = sphinx_build_excerpts ( client, ndocs, docs, index, words, &opts );
if ( !res )
{
g_failed++;
if ( !g_smoke )
die ( "query failed: %s", sphinx_error(client) );
continue; // no excerpts to print for this pass
}
for ( i=0; i<ndocs; i++ )
printf ( "n=%d, res=%s\n", 1+i, res[i] );
printf ( "\n" );
}
}
void test_excerpt_spz ( sphinx_client * client )
{
const char * docs[] =
{
"<efx_unidentified_table>"
"The institutional investment manager it. Is Filing this report and."
"<efx_test>"
"It is signed hereby represent. That it is all information."
"are It or is"
"</efx_test>"
"<efx_2>"
"cool It is cooler"
"</efx_2>"
"It is another place!"
"</efx_unidentified_table>"
};
const int ndocs = sizeof(docs)/sizeof(docs[0]);
const char * words = "it is";
const char * index = "test1";
sphinx_excerpt_options opts;
char ** res;
int i, j;
sphinx_init_excerpt_options ( &opts );
opts.limit = 150;
opts.limit_passages = 8;
opts.around = 8;
opts.html_strip_mode = "strip";
opts.passage_boundary = "zone";
opts.emit_zones = SPH_TRUE;
for ( j=0; j<2; j++ )
{
if ( j==1 )
{
opts.passage_boundary = "sentence";
opts.emit_zones = SPH_FALSE;
}
printf ( "passage_boundary=%s\n", opts.passage_boundary );
res = sphinx_build_excerpts ( client, ndocs, docs, index, words, &opts );
if ( !res )
die ( "query failed: %s", sphinx_error(client) );
for ( i=0; i<ndocs; i++ )
printf ( "n=%d, res=%s\n", 1+i, res[i] );
printf ( "\n" );
}
}
void test_update ( sphinx_client * client, sphinx_uint64_t id )
{
const char * attr = "group_id";
const sphinx_int64_t val = 123;
int res;
res = sphinx_update_attributes ( client, "test1", 1, &attr, 1, &id, &val );
if ( res<0 )
{
g_failed++;
printf ( "update failed: %s\n\n", sphinx_error(client) );
} else
printf ( "update success, %d rows updated\n\n", res );
}
void test_update_mva ( sphinx_client * client )
{
const char * attr = "tag";
const sphinx_uint64_t id = 3;
const unsigned int vals[] = { 7, 77, 177 };
int res;
res = sphinx_update_attributes_mva ( client, "test1", attr, id, sizeof(vals)/sizeof(vals[0]), vals );
if ( res<0 )
{
g_failed++;
printf ( "update mva failed: %s\n\n", sphinx_error(client) );
} else
printf ( "update mva success, %d rows updated\n\n", res );
}
void test_keywords ( sphinx_client * client )
{
int i, nwords;
sphinx_keyword_info * words;
words = sphinx_build_keywords ( client, "hello test one", "test1", SPH_TRUE, &nwords );
g_failed += ( words==NULL );
if ( !words )
{
printf ( "build_keywords failed: %s\n\n", sphinx_error(client) );
} else
{
printf ( "build_keywords result:\n" );
for ( i=0; i<nwords; i++ )
printf ( "%d. tokenized=%s, normalized=%s, docs=%d, hits=%d\n", 1+i,
words[i].tokenized, words[i].normalized,
words[i].num_docs, words[i].num_hits );
printf ( "\n" );
}
}
void test_status ( sphinx_client * client )
{
int num_rows, num_cols, i, j, k;
char ** status;
status = sphinx_status ( client, &num_rows, &num_cols );
if ( !status )
{
g_failed++;
printf ( "status failed: %s\n\n", sphinx_error(client) );
return;
}
k = 0;
for ( i=0; i<num_rows; i++ )
{
if ( !g_smoke || ( strstr ( status[k], "time" )==NULL && strstr ( status[k], "wall" )==NULL ) )
{
for ( j=0; j<num_cols; j++, k++ )
printf ( ( j==0 ) ? "%s:" : " %s", status[k] );
printf ( "\n" );
} else
k += num_cols;
}
printf ( "\n" );
sphinx_status_destroy ( status, num_rows, num_cols );
}
void test_group_by ( sphinx_client * client, const char * attr )
{
sphinx_set_groupby ( client, attr, SPH_GROUPBY_ATTR, "@group asc" );
test_query ( client, "is" );
sphinx_reset_groupby ( client );
}
void test_filter ( sphinx_client * client )
{
const char * attr_group = "group_id";
const char * attr_mva = "tag";
sphinx_int64_t filter_group = { 1 };
sphinx_int64_t filter_mva = { 7 };
int i;
sphinx_bool mva;
for ( i=0; i<2; i++ )
{
mva = ( i==1 );
sphinx_add_filter ( client, mva ? attr_mva : attr_group, 1, mva ? &filter_mva : &filter_group, SPH_FALSE );
test_query ( client, "is" );
sphinx_reset_filters ( client );
}
}
void title ( const char * name )
{
if ( g_smoke || !name )
return;
printf ( "-> %s <-\n\n", name );
}
int main ( int argc, char ** argv )
{
int i, port = 0;
sphinx_client * client;
sphinx_uint64_t override_docid = 2;
unsigned int override_value = 2000;
for ( i=1; i<argc; i++ )
{
if ( strcmp ( argv[i], "--smoke" )==0 )
g_smoke = SPH_TRUE;
else if ( strcmp ( argv[i], "--port" )==0 && i+1<argc )
port = (int)strtoul ( argv[i+1], NULL, 10 );
}
net_init ();
client = sphinx_create ( SPH_TRUE );
if ( !client )
die ( "failed to create client" );
if ( port )
sphinx_set_server ( client, "127.0.0.1", port );
sphinx_set_match_mode ( client, SPH_MATCH_EXTENDED2 );
sphinx_set_sort_mode ( client, SPH_SORT_RELEVANCE, NULL );
// excerpt + keywords
title ( "excerpt" );
test_excerpt ( client );
test_excerpt_spz ( client );
title ( "keywords" );
test_keywords ( client );
// search phase 0
title ( "search phase 0" );
test_query ( client, "is" );
test_query ( client, "is test" );
test_query ( client, "test number" );
// group_by (attr; mva) + filter
title ( "group_by (attr; mva) + filter" );
title ( "group_by attr" );
test_group_by ( client, "group_id" );
// group_by mva
title ( "group_by mva" );
test_group_by ( client, "tag" );
// filter
title ( "filter" );
test_filter ( client );
// update (attr; mva) + sort (descending id)
title ( "update (attr; mva) + sort (descending id)" );
test_update ( client, 2 );
test_update_mva ( client );
sphinx_set_sort_mode ( client, SPH_SORT_EXTENDED, "@id desc" );
test_query ( client, "is" );
// persistence connection
sphinx_open ( client );
// update (attr) + sort (default)
title ( "update (attr) + sort (default)" );
test_update ( client, 4 );
test_update ( client, 3 );
sphinx_set_sort_mode ( client, SPH_SORT_RELEVANCE, NULL );
test_query ( client, "is" );
sphinx_cleanup ( client );
// group_by (attr; mva) + filter + post update
title ( "group_by (attr; mva) + filter + post update" );
title ( "group_by attr" );
test_group_by ( client, "group_id" );
title ( "group_by mva" );
test_group_by ( client, "tag" );
title ( "filter" );
test_filter ( client );
// select
title ( "select" );
sphinx_set_select ( client, "*, group_id*1000+@id*10 AS q" );
test_query ( client, "is" );
// override
title ( "override" );
sphinx_add_override ( client, "group_id", &override_docid, 1, &override_value );
test_query ( client, "is" );
// group_by (override attr)
title ( "group_by (override attr)" );
test_group_by ( client, "group_id" );
sphinx_close ( client );
test_status ( client );
sphinx_destroy ( client );
if ( g_smoke && g_failed )
{
printf ( "%d error(s)\n", g_failed );
exit ( 1 );
}
return 0;
}
//
// $Id$
//

@ -0,0 +1,28 @@
Microsoft Visual Studio Solution File, Format Version 9.00
# Visual Studio 2005
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "test", "test.vcproj", "{AE8DBF77-DE4F-41E4-96F9-456D8DA6418C}"
ProjectSection(ProjectDependencies) = postProject
{E0393ED6-FE6B-4803-8BFD-9D79EF21603A} = {E0393ED6-FE6B-4803-8BFD-9D79EF21603A}
EndProjectSection
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "libsphinxclient", "libsphinxclient.vcproj", "{E0393ED6-FE6B-4803-8BFD-9D79EF21603A}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Win32 = Debug|Win32
Release|Win32 = Release|Win32
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{AE8DBF77-DE4F-41E4-96F9-456D8DA6418C}.Debug|Win32.ActiveCfg = Debug|Win32
{AE8DBF77-DE4F-41E4-96F9-456D8DA6418C}.Debug|Win32.Build.0 = Debug|Win32
{AE8DBF77-DE4F-41E4-96F9-456D8DA6418C}.Release|Win32.ActiveCfg = Release|Win32
{AE8DBF77-DE4F-41E4-96F9-456D8DA6418C}.Release|Win32.Build.0 = Release|Win32
{E0393ED6-FE6B-4803-8BFD-9D79EF21603A}.Debug|Win32.ActiveCfg = Debug|Win32
{E0393ED6-FE6B-4803-8BFD-9D79EF21603A}.Debug|Win32.Build.0 = Debug|Win32
{E0393ED6-FE6B-4803-8BFD-9D79EF21603A}.Release|Win32.ActiveCfg = Release|Win32
{E0393ED6-FE6B-4803-8BFD-9D79EF21603A}.Release|Win32.Build.0 = Release|Win32
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
EndGlobal

@ -0,0 +1,192 @@
<?xml version="1.0" encoding="UTF-8"?>
<VisualStudioProject
ProjectType="Visual C++"
Version="8,00"
Name="test"
ProjectGUID="{AE8DBF77-DE4F-41E4-96F9-456D8DA6418C}"
RootNamespace="test"
Keyword="Win32Proj"
>
<Platforms>
<Platform
Name="Win32"
/>
</Platforms>
<ToolFiles>
</ToolFiles>
<Configurations>
<Configuration
Name="Debug|Win32"
OutputDirectory="Debug"
IntermediateDirectory="Debug"
ConfigurationType="1"
>
<Tool
Name="VCPreBuildEventTool"
/>
<Tool
Name="VCCustomBuildTool"
/>
<Tool
Name="VCXMLDataGeneratorTool"
/>
<Tool
Name="VCWebServiceProxyGeneratorTool"
/>
<Tool
Name="VCMIDLTool"
/>
<Tool
Name="VCCLCompilerTool"
Optimization="0"
PreprocessorDefinitions="WIN32;_DEBUG;_CONSOLE;"
MinimalRebuild="true"
BasicRuntimeChecks="3"
RuntimeLibrary="3"
UsePrecompiledHeader="0"
WarningLevel="4"
Detect64BitPortabilityProblems="true"
DebugInformationFormat="4"
/>
<Tool
Name="VCManagedResourceCompilerTool"
/>
<Tool
Name="VCResourceCompilerTool"
/>
<Tool
Name="VCPreLinkEventTool"
/>
<Tool
Name="VCLinkerTool"
GenerateDebugInformation="true"
SubSystem="1"
TargetMachine="1"
/>
<Tool
Name="VCALinkTool"
/>
<Tool
Name="VCManifestTool"
/>
<Tool
Name="VCXDCMakeTool"
/>
<Tool
Name="VCBscMakeTool"
/>
<Tool
Name="VCFxCopTool"
/>
<Tool
Name="VCAppVerifierTool"
/>
<Tool
Name="VCWebDeploymentTool"
/>
<Tool
Name="VCPostBuildEventTool"
/>
</Configuration>
<Configuration
Name="Release|Win32"
OutputDirectory="Release"
IntermediateDirectory="Release"
ConfigurationType="1"
>
<Tool
Name="VCPreBuildEventTool"
/>
<Tool
Name="VCCustomBuildTool"
/>
<Tool
Name="VCXMLDataGeneratorTool"
/>
<Tool
Name="VCWebServiceProxyGeneratorTool"
/>
<Tool
Name="VCMIDLTool"
/>
<Tool
Name="VCCLCompilerTool"
PreprocessorDefinitions="WIN32;NDEBUG;_CONSOLE;"
RuntimeLibrary="2"
UsePrecompiledHeader="0"
WarningLevel="4"
Detect64BitPortabilityProblems="true"
DebugInformationFormat="3"
/>
<Tool
Name="VCManagedResourceCompilerTool"
/>
<Tool
Name="VCResourceCompilerTool"
/>
<Tool
Name="VCPreLinkEventTool"
/>
<Tool
Name="VCLinkerTool"
GenerateDebugInformation="true"
SubSystem="1"
OptimizeReferences="2"
EnableCOMDATFolding="2"
TargetMachine="1"
/>
<Tool
Name="VCALinkTool"
/>
<Tool
Name="VCManifestTool"
/>
<Tool
Name="VCXDCMakeTool"
/>
<Tool
Name="VCBscMakeTool"
/>
<Tool
Name="VCFxCopTool"
/>
<Tool
Name="VCAppVerifierTool"
/>
<Tool
Name="VCWebDeploymentTool"
/>
<Tool
Name="VCPostBuildEventTool"
/>
</Configuration>
</Configurations>
<References>
</References>
<Files>
<Filter
Name="Header Files"
Filter="h;hpp;hxx;hm;inl;inc;xsd"
UniqueIdentifier="{93995380-89BD-4b04-88EB-625FBE52EBFB}"
>
</Filter>
<Filter
Name="Resource Files"
Filter="rc;ico;cur;bmp;dlg;rc2;rct;bin;rgs;gif;jpg;jpeg;jpe;resx"
UniqueIdentifier="{67DA6AB6-F800-4c08-8B7A-83BB121AAD01}"
>
</Filter>
<Filter
Name="Source Files"
Filter="cpp;c;cc;cxx;def;odl;idl;hpj;bat;asm;asmx"
UniqueIdentifier="{4FC737F1-C7A5-4376-A066-2A32D752A2FF}"
>
<File
RelativePath=".\test.c"
>
</File>
</Filter>
</Files>
<Globals>
</Globals>
</VisualStudioProject>

@ -0,0 +1,21 @@
Microsoft Visual Studio Solution File, Format Version 8.00
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "test03", "test03.vcproj", "{3F3C7CA8-E864-4FB5-8EB0-A114FAFBE24E}"
ProjectSection(ProjectDependencies) = postProject
EndProjectSection
EndProject
Global
GlobalSection(SolutionConfiguration) = preSolution
Debug = Debug
Release = Release
EndGlobalSection
GlobalSection(ProjectConfiguration) = postSolution
{3F3C7CA8-E864-4FB5-8EB0-A114FAFBE24E}.Debug.ActiveCfg = Debug|Win32
{3F3C7CA8-E864-4FB5-8EB0-A114FAFBE24E}.Debug.Build.0 = Debug|Win32
{3F3C7CA8-E864-4FB5-8EB0-A114FAFBE24E}.Release.ActiveCfg = Release|Win32
{3F3C7CA8-E864-4FB5-8EB0-A114FAFBE24E}.Release.Build.0 = Release|Win32
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
EndGlobalSection
GlobalSection(ExtensibilityAddIns) = postSolution
EndGlobalSection
EndGlobal

@ -0,0 +1,138 @@
<?xml version="1.0" encoding="windows-1251"?>
<VisualStudioProject
ProjectType="Visual C++"
Version="7.10"
Name="test03"
ProjectGUID="{3F3C7CA8-E864-4FB5-8EB0-A114FAFBE24E}"
Keyword="Win32Proj">
<Platforms>
<Platform
Name="Win32"/>
</Platforms>
<Configurations>
<Configuration
Name="Debug|Win32"
OutputDirectory="Debug"
IntermediateDirectory="Debug"
ConfigurationType="1"
CharacterSet="2">
<Tool
Name="VCCLCompilerTool"
Optimization="0"
PreprocessorDefinitions="WIN32;_DEBUG;_CONSOLE"
MinimalRebuild="TRUE"
BasicRuntimeChecks="3"
RuntimeLibrary="5"
UsePrecompiledHeader="0"
WarningLevel="3"
Detect64BitPortabilityProblems="TRUE"
DebugInformationFormat="4"/>
<Tool
Name="VCCustomBuildTool"/>
<Tool
Name="VCLinkerTool"
OutputFile="$(OutDir)/test03.exe"
LinkIncremental="2"
GenerateDebugInformation="TRUE"
ProgramDatabaseFile="$(OutDir)/test03.pdb"
SubSystem="1"
TargetMachine="1"/>
<Tool
Name="VCMIDLTool"/>
<Tool
Name="VCPostBuildEventTool"/>
<Tool
Name="VCPreBuildEventTool"/>
<Tool
Name="VCPreLinkEventTool"/>
<Tool
Name="VCResourceCompilerTool"/>
<Tool
Name="VCWebServiceProxyGeneratorTool"/>
<Tool
Name="VCXMLDataGeneratorTool"/>
<Tool
Name="VCWebDeploymentTool"/>
<Tool
Name="VCManagedWrapperGeneratorTool"/>
<Tool
Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
</Configuration>
<Configuration
Name="Release|Win32"
OutputDirectory="Release"
IntermediateDirectory="Release"
ConfigurationType="1"
CharacterSet="2">
<Tool
Name="VCCLCompilerTool"
PreprocessorDefinitions="WIN32;NDEBUG;_CONSOLE"
RuntimeLibrary="4"
UsePrecompiledHeader="0"
WarningLevel="3"
Detect64BitPortabilityProblems="TRUE"
DebugInformationFormat="3"/>
<Tool
Name="VCCustomBuildTool"/>
<Tool
Name="VCLinkerTool"
OutputFile="$(OutDir)/test03.exe"
LinkIncremental="1"
GenerateDebugInformation="TRUE"
SubSystem="1"
OptimizeReferences="2"
EnableCOMDATFolding="2"
TargetMachine="1"/>
<Tool
Name="VCMIDLTool"/>
<Tool
Name="VCPostBuildEventTool"/>
<Tool
Name="VCPreBuildEventTool"/>
<Tool
Name="VCPreLinkEventTool"/>
<Tool
Name="VCResourceCompilerTool"/>
<Tool
Name="VCWebServiceProxyGeneratorTool"/>
<Tool
Name="VCXMLDataGeneratorTool"/>
<Tool
Name="VCWebDeploymentTool"/>
<Tool
Name="VCManagedWrapperGeneratorTool"/>
<Tool
Name="VCAuxiliaryManagedWrapperGeneratorTool"/>
</Configuration>
</Configurations>
<References>
</References>
<Files>
<Filter
Name="Source Files"
Filter="cpp;c;cxx;def;odl;idl;hpj;bat;asm;asmx"
UniqueIdentifier="{4FC737F1-C7A5-4376-A066-2A32D752A2FF}">
<File
RelativePath=".\sphinxclient.c">
</File>
<File
RelativePath=".\test.c">
</File>
</Filter>
<Filter
Name="Header Files"
Filter="h;hpp;hxx;hm;inl;inc;xsd"
UniqueIdentifier="{93995380-89BD-4b04-88EB-625FBE52EBFB}">
<File
RelativePath=".\sphinxclient.h">
</File>
</Filter>
<Filter
Name="Resource Files"
Filter="rc;ico;cur;bmp;dlg;rc2;rct;bin;rgs;gif;jpg;jpeg;jpe;resx"
UniqueIdentifier="{67DA6AB6-F800-4c08-8B7A-83BB121AAD01}">
</Filter>
</Files>
<Globals>
</Globals>
</VisualStudioProject>

@ -0,0 +1,41 @@
=Sphinx Client API 0.9.9-dev (r1299)
This document gives an overview of what Sphinx itself is and how to use it
within Ruby on Rails. For more information or documentation,
please go to http://www.sphinxsearch.com
==Sphinx
Sphinx is a standalone full-text search engine, meant to provide fast,
size-efficient and relevant fulltext search functions to other applications.
Sphinx was specially designed to integrate well with SQL databases and
scripting languages. Currently built-in data sources support fetching data
either via direct connection to MySQL, or from an XML pipe.
The simplest way to communicate with Sphinx is to use <tt>searchd</tt> -
a daemon to search through fulltext indices from external software.
==Documentation
You can create the documentation by running:
rake rdoc
==Latest version
You can always get the latest version from
http://kpumuk.info/projects/ror-plugins/sphinx
==Credits
Dmytro Shteflyuk <kpumuk@kpumuk.info> http://kpumuk.info
Andrew Aksyonoff http://sphinxsearch.com/
Special thanks to Alexey Kovyrin <alexey@kovyrin.net> http://blog.kovyrin.net
==License
This library is distributed under the terms of the Ruby license.
You can freely distribute/modify this library.

@ -0,0 +1,21 @@
require 'rake'
require 'spec/rake/spectask'
require 'rake/rdoctask'
desc 'Default: run unit tests.'
task :default => :spec
desc 'Test the sphinx plugin.'
Spec::Rake::SpecTask.new(:spec) do |t|
t.libs << 'lib'
t.pattern = 'spec/*_spec.rb'
end
desc 'Generate documentation for the sphinx plugin.'
Rake::RDocTask.new(:rdoc) do |rdoc|
rdoc.rdoc_dir = 'rdoc'
rdoc.title = 'Sphinx Client API'
rdoc.options << '--line-numbers' << '--inline-source'
rdoc.rdoc_files.include('README')
rdoc.rdoc_files.include('lib/**/*.rb')
end

@ -0,0 +1 @@
require File.dirname(__FILE__) + '/lib/sphinx'

@ -0,0 +1,5 @@
require 'fileutils'
sphinx_config = File.dirname(__FILE__) + '/../../../config/sphinx.yml'
FileUtils.cp File.dirname(__FILE__) + '/sphinx.yml.tpl', sphinx_config unless File.exist?(sphinx_config)
puts IO.read(File.join(File.dirname(__FILE__), 'README'))

@ -0,0 +1,6 @@
require File.dirname(__FILE__) + '/sphinx/request'
require File.dirname(__FILE__) + '/sphinx/response'
require File.dirname(__FILE__) + '/sphinx/client'
module Sphinx
end

File diff suppressed because it is too large

@ -0,0 +1,50 @@
module Sphinx
# Pack ints, floats, strings, and arrays to internal representation
# needed by Sphinx search engine.
class Request
# Initialize new request.
def initialize
@request = ''
end
# Put int(s) to request.
def put_int(*ints)
ints.each { |i| @request << [i].pack('N') }
end
# Put 64-bit int(s) to request.
def put_int64(*ints)
ints.each { |i| @request << [i].pack('q').reverse }#[i >> 32, i & ((1 << 32) - 1)].pack('NN') }
end
# Put string(s) to request (first length, then the string itself).
def put_string(*strings)
strings.each { |s| @request << [s.length].pack('N') + s }
end
# Put float(s) to request.
def put_float(*floats)
floats.each do |f|
t1 = [f].pack('f') # machine order
t2 = t1.unpack('L*').first # int in machine order
@request << [t2].pack('N')
end
end
# Put array of ints to request (first length, then the array itself)
def put_int_array(arr)
put_int arr.length, *arr
end
# Put array of 64-bit ints to request (first length, then the array itself)
def put_int64_array(arr)
put_int arr.length
put_int64(*arr)
end
# Returns the entire message
def to_s
@request
end
end
end
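The helpers above lean entirely on Ruby's <tt>Array#pack</tt> format codes. A standalone sketch (not part of the plugin) of the byte layout <tt>put_int</tt> and <tt>put_int64</tt> produce; the <tt>''.b</tt> binary buffer and the sample values are illustrative additions, and the <tt>'q'</tt>-then-<tt>reverse</tt> trick assumes a little-endian host, just as the original code does:

```ruby
# Mirror of the wire encoding in Sphinx::Request#put_int / #put_int64.
# 'N' packs a 32-bit unsigned big-endian int; 64-bit values are packed
# native little-endian ('q') and byte-reversed to yield big-endian.
req = ''.b                                           # binary buffer, like @request
[1, 0xDEADBEEF].each { |i| req << [i].pack('N') }    # put_int(1, 0xDEADBEEF)
req << [4294967298].pack('q').reverse                # put_int64(2**32 + 2)

p req.bytesize            # 16 bytes: two uint32s plus one uint64
p req[8, 8].unpack('C*')  # big-endian bytes of 2**32 + 2: [0, 0, 0, 1, 0, 0, 0, 2]
```

Network byte order throughout is what lets <tt>searchd</tt> decode the request independently of the client's architecture.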

@ -0,0 +1,69 @@
module Sphinx
# Unpack internal Sphinx representation of ints, floats, strings, and arrays
# needed by Sphinx search engine.
class Response
# Initialize new response.
def initialize(response)
@response = response
@position = 0
@size = response.length
end
# Gets current stream position.
def position
@position
end
# Gets response size.
def size
@size
end
# Returns <tt>true</tt> when the response stream has been fully consumed.
def eof?
@position >= @size
end
# Get int from stream.
def get_int
raise EOFError if @position + 4 > @size
value = @response[@position, 4].unpack('N*').first
@position += 4
return value
end
# Get 64-bit int from stream.
def get_int64
raise EOFError if @position + 8 > @size
hi, lo = @response[@position, 8].unpack('N*N*')
@position += 8
return (hi << 32) + lo
end
# Get array of <tt>count</tt> ints from stream.
def get_ints(count)
length = 4 * count
raise EOFError if @position + length > @size
values = @response[@position, length].unpack('N*' * count)
@position += length
return values
end
# Get string from stream.
def get_string
length = get_int
raise EOFError if @position + length > @size
value = length > 0 ? @response[@position, length] : ''
@position += length
return value
end
# Get float from stream.
def get_float
raise EOFError if @position + 4 > @size
uval = @response[@position, 4].unpack('N*').first;
@position += 4
return ([uval].pack('L')).unpack('f*').first
end
end
end
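On the decode side, <tt>get_int64</tt> and <tt>get_float</tt> invert that encoding: a 64-bit int arrives as two big-endian 32-bit halves, and a float arrives as its IEEE-754 bits packed into a big-endian uint32, matching <tt>Request#put_float</tt>. A standalone round-trip sketch (not part of the plugin; the sample values are illustrative):

```ruby
# Encode as Sphinx::Request would...
payload = [1, 2].pack('N2')                          # int64 2**32 + 2 as hi, lo
payload << [[3.5].pack('f').unpack1('L')].pack('N')  # float as machine-order bits

# ...and decode as Sphinx::Response does.
hi, lo = payload[0, 8].unpack('N2')
int64  = (hi << 32) + lo
float  = [payload[8, 4].unpack1('N')].pack('L').unpack1('f')

p int64   # 4294967298
p float   # 3.5
```

The pack('f')/unpack('L') shuffle works because both directions go through the machine's native float and int layouts, so only the explicit 'N' step fixes the byte order on the wire.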

@ -0,0 +1,112 @@
require File.dirname(__FILE__) + '/../init'
# To execute these tests you need to execute sphinx_test.sql and configure sphinx using sphinx.conf
# (both files are placed under sphinx directory)
context 'The SphinxApi connected to Sphinx' do
setup do
@sphinx = Sphinx::Client.new
end
specify 'should parse response in Query method' do
result = @sphinx.Query('wifi', 'test1')
validate_results_wifi(result)
end
specify 'should process 64-bit keys in Query method' do
result = @sphinx.Query('wifi', 'test2')
result['total_found'].should == 3
result['matches'].length.should == 3
result['matches'][0]['id'].should == 4294967298
result['matches'][1]['id'].should == 4294967299
result['matches'][2]['id'].should == 4294967297
end
specify 'should parse batch-query response in RunQueries method' do
@sphinx.AddQuery('wifi', 'test1')
@sphinx.AddQuery('gprs', 'test1')
results = @sphinx.RunQueries
results.should be_an_instance_of(Array)
results.length.should == 2
validate_results_wifi(results[0])
end
specify 'should parse response in BuildExcerpts method' do
result = @sphinx.BuildExcerpts(['what the world', 'London is the capital of Great Britain'], 'test1', 'the')
result.should == ['what <b>the</b> world', 'London is <b>the</b> capital of Great Britain']
end
specify 'should parse response in BuildKeywords method' do
result = @sphinx.BuildKeywords('wifi gprs', 'test1', true)
result.should == [
{ 'normalized' => 'wifi', 'tokenized' => 'wifi', 'hits' => 6, 'docs' => 3 },
{ 'normalized' => 'gprs', 'tokenized' => 'gprs', 'hits' => 1, 'docs' => 1 }
]
end
specify 'should parse response in UpdateAttributes method' do
@sphinx.UpdateAttributes('test1', ['group_id'], { 2 => [1] }).should == 1
result = @sphinx.Query('wifi', 'test1')
result['matches'][0]['attrs']['group_id'].should == 1
@sphinx.UpdateAttributes('test1', ['group_id'], { 2 => [2] }).should == 1
result = @sphinx.Query('wifi', 'test1')
result['matches'][0]['attrs']['group_id'].should == 2
end
specify 'should parse response in UpdateAttributes method with MVA' do
@sphinx.UpdateAttributes('test1', ['tags'], { 2 => [[1, 2, 3, 4, 5, 6, 7, 8, 9]] }, true).should == 1
result = @sphinx.Query('wifi', 'test1')
result['matches'][0]['attrs']['tags'].should == [1, 2, 3, 4, 5, 6, 7, 8, 9]
@sphinx.UpdateAttributes('test1', ['tags'], { 2 => [[5, 6, 7, 8]] }, true).should == 1
result = @sphinx.Query('wifi', 'test1')
result['matches'][0]['attrs']['tags'].should == [5, 6, 7, 8]
end
specify 'should process errors in Query method' do
@sphinx.Query('wifi', 'fakeindex').should be_false
@sphinx.GetLastError.length.should_not == 0
end
specify 'should process errors in RunQueries method' do
@sphinx.AddQuery('wifi', 'fakeindex')
r = @sphinx.RunQueries
r[0]['error'].length.should_not == 0
end
def validate_results_wifi(result)
result['total_found'].should == 3
result['matches'].length.should == 3
result['time'].should_not be_nil
result['attrs'].should == {
'group_id' => Sphinx::Client::SPH_ATTR_INTEGER,
'created_at' => Sphinx::Client::SPH_ATTR_TIMESTAMP,
'rating' => Sphinx::Client::SPH_ATTR_FLOAT,
'tags' => Sphinx::Client::SPH_ATTR_MULTI | Sphinx::Client::SPH_ATTR_INTEGER
}
result['fields'].should == [ 'name', 'description' ]
result['total'].should == 3
result['matches'].should be_an_instance_of(Array)
result['matches'][0]['id'].should == 2
result['matches'][0]['weight'].should == 2
result['matches'][0]['attrs']['group_id'].should == 2
result['matches'][0]['attrs']['created_at'].should == 1175658555
result['matches'][0]['attrs']['tags'].should == [5, 6, 7, 8]
('%0.2f' % result['matches'][0]['attrs']['rating']).should == '54.85'
result['matches'][1]['id'].should == 3
result['matches'][1]['weight'].should == 2
result['matches'][1]['attrs']['group_id'].should == 1
result['matches'][1]['attrs']['created_at'].should == 1175658647
result['matches'][1]['attrs']['tags'].should == [1, 7, 9, 10]
('%0.2f' % result['matches'][1]['attrs']['rating']).should == '16.25'
result['matches'][2]['id'].should == 1
result['matches'][2]['weight'].should == 1
result['matches'][2]['attrs']['group_id'].should == 1
result['matches'][2]['attrs']['created_at'].should == 1175658490
result['matches'][2]['attrs']['tags'].should == [1, 2, 3, 4]
('%0.2f' % result['matches'][2]['attrs']['rating']).should == '13.32'
result['words'].should == { 'wifi' => { 'hits' => 6, 'docs' => 3 } }
end
end

@ -0,0 +1,469 @@
require File.dirname(__FILE__) + '/../init'
class SphinxSpecError < StandardError; end
module SphinxFixtureHelper
def sphinx_fixture(name)
`php #{File.dirname(__FILE__)}/fixtures/#{name}.php`
end
end
module SphinxApiCall
def create_sphinx
@sphinx = Sphinx::Client.new
@sock = mock('TCPSocket')
@sphinx.stub!(:Connect).and_return(@sock)
@sphinx.stub!(:GetResponse).and_raise(SphinxSpecError)
return @sphinx
end
def safe_call
yield
rescue SphinxSpecError
end
end
describe 'The Connect method of Sphinx::Client' do
before(:each) do
@sphinx = Sphinx::Client.new
@sock = mock('TCPSocket')
end
it 'should establish TCP connection to the server and initialize session' do
TCPSocket.should_receive(:new).with('localhost', 9312).and_return(@sock)
@sock.should_receive(:recv).with(4).and_return([1].pack('N'))
@sock.should_receive(:send).with([1].pack('N'), 0)
@sphinx.send(:Connect).should be(@sock)
end
it 'should raise exception when searchd protocol is not 1+' do
TCPSocket.should_receive(:new).with('localhost', 9312).and_return(@sock)
@sock.should_receive(:recv).with(4).and_return([0].pack('N'))
@sock.should_receive(:close)
lambda { @sphinx.send(:Connect) }.should raise_error(Sphinx::SphinxConnectError)
@sphinx.GetLastError.should == 'expected searchd protocol version 1+, got version \'0\''
end
it 'should raise exception on connection error' do
TCPSocket.should_receive(:new).with('localhost', 9312).and_raise(Errno::EBADF)
lambda { @sphinx.send(:Connect) }.should raise_error(Sphinx::SphinxConnectError)
@sphinx.GetLastError.should == 'connection to localhost:9312 failed'
end
it 'should use custom host and port' do
@sphinx.SetServer('anotherhost', 55555)
TCPSocket.should_receive(:new).with('anotherhost', 55555).and_raise(Errno::EBADF)
lambda { @sphinx.send(:Connect) }.should raise_error(Sphinx::SphinxConnectError)
end
end
describe 'The GetResponse method of Sphinx::Client' do
before(:each) do
@sphinx = Sphinx::Client.new
@sock = mock('TCPSocket')
@sock.should_receive(:close)
end
it 'should receive response' do
@sock.should_receive(:recv).with(8).and_return([Sphinx::Client::SEARCHD_OK, 1, 4].pack('n2N'))
@sock.should_receive(:recv).with(4).and_return([0].pack('N'))
@sphinx.send(:GetResponse, @sock, 1)
end
it 'should raise exception on zero-sized response' do
@sock.should_receive(:recv).with(8).and_return([Sphinx::Client::SEARCHD_OK, 1, 0].pack('n2N'))
lambda { @sphinx.send(:GetResponse, @sock, 1) }.should raise_error(Sphinx::SphinxResponseError)
end
it 'should raise exception when response is incomplete' do
@sock.should_receive(:recv).with(8).and_return([Sphinx::Client::SEARCHD_OK, 1, 4].pack('n2N'))
@sock.should_receive(:recv).with(4).and_raise(EOFError)
lambda { @sphinx.send(:GetResponse, @sock, 1) }.should raise_error(Sphinx::SphinxResponseError)
end
it 'should set warning message when SEARCHD_WARNING received' do
@sock.should_receive(:recv).with(8).and_return([Sphinx::Client::SEARCHD_WARNING, 1, 14].pack('n2N'))
@sock.should_receive(:recv).with(14).and_return([5].pack('N') + 'helloworld')
@sphinx.send(:GetResponse, @sock, 1).should == 'world'
@sphinx.GetLastWarning.should == 'hello'
end
it 'should raise exception when SEARCHD_ERROR received' do
@sock.should_receive(:recv).with(8).and_return([Sphinx::Client::SEARCHD_ERROR, 1, 9].pack('n2N'))
@sock.should_receive(:recv).with(9).and_return([1].pack('N') + 'hello')
lambda { @sphinx.send(:GetResponse, @sock, 1) }.should raise_error(Sphinx::SphinxInternalError)
@sphinx.GetLastError.should == 'searchd error: hello'
end
it 'should raise exception when SEARCHD_RETRY received' do
@sock.should_receive(:recv).with(8).and_return([Sphinx::Client::SEARCHD_RETRY, 1, 9].pack('n2N'))
@sock.should_receive(:recv).with(9).and_return([1].pack('N') + 'hello')
lambda { @sphinx.send(:GetResponse, @sock, 1) }.should raise_error(Sphinx::SphinxTemporaryError)
@sphinx.GetLastError.should == 'temporary searchd error: hello'
end
it 'should raise exception when unknown status received' do
@sock.should_receive(:recv).with(8).and_return([65535, 1, 9].pack('n2N'))
@sock.should_receive(:recv).with(9).and_return([1].pack('N') + 'hello')
lambda { @sphinx.send(:GetResponse, @sock, 1) }.should raise_error(Sphinx::SphinxUnknownError)
@sphinx.GetLastError.should == 'unknown status code: \'65535\''
end
it 'should set warning when server is older than client' do
@sock.should_receive(:recv).with(8).and_return([Sphinx::Client::SEARCHD_OK, 1, 9].pack('n2N'))
@sock.should_receive(:recv).with(9).and_return([1].pack('N') + 'hello')
@sphinx.send(:GetResponse, @sock, 5)
@sphinx.GetLastWarning.should == 'searchd command v.0.1 older than client\'s v.0.5, some options might not work'
end
end
describe 'The Query method of Sphinx::Client' do
include SphinxFixtureHelper
include SphinxApiCall
before(:each) do
@sphinx = create_sphinx
end
it 'should generate valid request with default parameters' do
expected = sphinx_fixture('default_search')
@sock.should_receive(:send).with(expected, 0)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with default parameters and index' do
expected = sphinx_fixture('default_search_index')
@sock.should_receive(:send).with(expected, 0)
@sphinx.Query('query', 'index') rescue nil?
end
it 'should generate valid request with limits' do
expected = sphinx_fixture('limits')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetLimits(10, 20)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with limits and max number to retrieve' do
expected = sphinx_fixture('limits_max')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetLimits(10, 20, 30)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with limits and cutoff to retrieve' do
expected = sphinx_fixture('limits_cutoff')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetLimits(10, 20, 30, 40)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with max query time specified' do
expected = sphinx_fixture('max_query_time')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetMaxQueryTime(1000)
@sphinx.Query('query') rescue nil?
end
describe 'with match' do
[ :all, :any, :phrase, :boolean, :extended, :fullscan, :extended2 ].each do |match|
it "should generate valid request for SPH_MATCH_#{match.to_s.upcase}" do
expected = sphinx_fixture("match_#{match}")
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetMatchMode(Sphinx::Client::const_get("SPH_MATCH_#{match.to_s.upcase}"))
@sphinx.Query('query') rescue nil?
end
end
end
describe 'with rank' do
[ :proximity_bm25, :bm25, :none, :wordcount, :proximity ].each do |rank|
it "should generate valid request for SPH_RANK_#{rank.to_s.upcase}" do
expected = sphinx_fixture("ranking_#{rank}")
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetRankingMode(Sphinx::Client.const_get("SPH_RANK_#{rank.to_s.upcase}"))
@sphinx.Query('query') rescue nil?
end
end
end
describe 'with sorting' do
[ :attr_desc, :relevance, :attr_asc, :time_segments, :extended, :expr ].each do |mode|
it "should generate valid request for SPH_SORT_#{mode.to_s.upcase}" do
expected = sphinx_fixture("sort_#{mode}")
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetSortMode(Sphinx::Client.const_get("SPH_SORT_#{mode.to_s.upcase}"), mode == :relevance ? '' : 'sortby')
@sphinx.Query('query') rescue nil?
end
end
end
it 'should generate valid request with weights' do
expected = sphinx_fixture('weights')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetWeights([10, 20, 30, 40])
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with field weights' do
expected = sphinx_fixture('field_weights')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetFieldWeights({'field1' => 10, 'field2' => 20})
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with index weights' do
expected = sphinx_fixture('index_weights')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetIndexWeights({'index1' => 10, 'index2' => 20})
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with ID range' do
expected = sphinx_fixture('id_range')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetIDRange(10, 20)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with ID range and 64-bit ints' do
expected = sphinx_fixture('id_range64')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetIDRange(8589934591, 17179869183)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with values filter' do
expected = sphinx_fixture('filter')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetFilter('attr', [10, 20, 30])
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with two values filters' do
expected = sphinx_fixture('filters')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetFilter('attr2', [40, 50])
@sphinx.SetFilter('attr1', [10, 20, 30])
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with values filter excluded' do
expected = sphinx_fixture('filter_exclude')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetFilter('attr', [10, 20, 30], true)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with values filter range' do
expected = sphinx_fixture('filter_range')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetFilterRange('attr', 10, 20)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with two filter ranges' do
expected = sphinx_fixture('filter_ranges')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetFilterRange('attr2', 30, 40)
@sphinx.SetFilterRange('attr1', 10, 20)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with filter range excluded' do
expected = sphinx_fixture('filter_range_exclude')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetFilterRange('attr', 10, 20, true)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with signed int64-based filter range' do
expected = sphinx_fixture('filter_range_int64')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetFilterRange('attr1', -10, 20)
@sphinx.SetFilterRange('attr2', -1099511627770, 1099511627780)
safe_call { @sphinx.Query('query') }
end
it 'should generate valid request with float filter range' do
expected = sphinx_fixture('filter_float_range')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetFilterFloatRange('attr', 10.5, 20.3)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with float filter excluded' do
expected = sphinx_fixture('filter_float_range_exclude')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetFilterFloatRange('attr', 10.5, 20.3, true)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with different filters' do
expected = sphinx_fixture('filters_different')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetFilterRange('attr1', 10, 20, true)
@sphinx.SetFilter('attr3', [30, 40, 50])
@sphinx.SetFilterRange('attr1', 60, 70)
@sphinx.SetFilter('attr2', [80, 90, 100], true)
@sphinx.SetFilterFloatRange('attr1', 60.8, 70.5)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with geographical anchor point' do
expected = sphinx_fixture('geo_anchor')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetGeoAnchor('attrlat', 'attrlong', 20.3, 40.7)
@sphinx.Query('query') rescue nil?
end
describe 'with group by' do
[ :day, :week, :month, :year, :attr, :attrpair ].each do |groupby|
it "should generate valid request for SPH_GROUPBY_#{groupby.to_s.upcase}" do
expected = sphinx_fixture("group_by_#{groupby}")
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetGroupBy('attr', Sphinx::Client::const_get("SPH_GROUPBY_#{groupby.to_s.upcase}"))
@sphinx.Query('query') rescue nil?
end
end
it 'should generate valid request for SPH_GROUPBY_DAY with sort' do
expected = sphinx_fixture('group_by_day_sort')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetGroupBy('attr', Sphinx::Client::SPH_GROUPBY_DAY, 'somesort')
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with count-distinct attribute' do
expected = sphinx_fixture('group_distinct')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetGroupBy('attr', Sphinx::Client::SPH_GROUPBY_DAY)
@sphinx.SetGroupDistinct('attr')
@sphinx.Query('query') rescue nil?
end
end
it 'should generate valid request with retries count specified' do
expected = sphinx_fixture('retries')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetRetries(10)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request with retries count and delay specified' do
expected = sphinx_fixture('retries_delay')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetRetries(10, 20)
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request for SetOverride' do
expected = sphinx_fixture('set_override')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetOverride('attr1', Sphinx::Client::SPH_ATTR_INTEGER, { 10 => 20 })
@sphinx.SetOverride('attr2', Sphinx::Client::SPH_ATTR_FLOAT, { 11 => 30.3 })
@sphinx.SetOverride('attr3', Sphinx::Client::SPH_ATTR_BIGINT, { 12 => 1099511627780 })
@sphinx.Query('query') rescue nil?
end
it 'should generate valid request for SetSelect' do
expected = sphinx_fixture('select')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetSelect('attr1, attr2')
@sphinx.Query('query') rescue nil?
end
end
describe 'The RunQueries method of Sphinx::Client' do
include SphinxFixtureHelper
before(:each) do
@sphinx = Sphinx::Client.new
@sock = mock('TCPSocket')
@sphinx.stub!(:Connect).and_return(@sock)
@sphinx.stub!(:GetResponse).and_raise(Sphinx::SphinxError)
end
it 'should generate valid request for multiple queries' do
expected = sphinx_fixture('miltiple_queries')
@sock.should_receive(:send).with(expected, 0)
@sphinx.SetRetries(10, 20)
@sphinx.AddQuery('test1')
@sphinx.SetGroupBy('attr', Sphinx::Client::SPH_GROUPBY_DAY)
@sphinx.AddQuery('test2') rescue nil?
@sphinx.RunQueries rescue nil?
end
end
describe 'The BuildExcerpts method of Sphinx::Client' do
include SphinxFixtureHelper
before(:each) do
@sphinx = Sphinx::Client.new
@sock = mock('TCPSocket')
@sphinx.stub!(:Connect).and_return(@sock)
@sphinx.stub!(:GetResponse).and_raise(Sphinx::SphinxError)
end
it 'should generate valid request with default parameters' do
expected = sphinx_fixture('excerpt_default')
@sock.should_receive(:send).with(expected, 0)
@sphinx.BuildExcerpts(['10', '20'], 'index', 'word1 word2') rescue nil?
end
it 'should generate valid request with custom parameters' do
expected = sphinx_fixture('excerpt_custom')
@sock.should_receive(:send).with(expected, 0)
@sphinx.BuildExcerpts(['10', '20'], 'index', 'word1 word2', { 'before_match' => 'before',
'after_match' => 'after',
'chunk_separator' => 'separator',
'limit' => 10 }) rescue nil?
end
it 'should generate valid request with flags' do
expected = sphinx_fixture('excerpt_flags')
@sock.should_receive(:send).with(expected, 0)
@sphinx.BuildExcerpts(['10', '20'], 'index', 'word1 word2', { 'exact_phrase' => true,
'single_passage' => true,
'use_boundaries' => true,
'weight_order' => true }) rescue nil?
end
end
describe 'The BuildKeywords method of Sphinx::Client' do
include SphinxFixtureHelper
include SphinxApiCall
before(:each) do
@sphinx = create_sphinx
end
it 'should generate valid request' do
expected = sphinx_fixture('keywords')
@sock.should_receive(:send).with(expected, 0)
safe_call { @sphinx.BuildKeywords('test', 'index', true) }
end
end
describe 'The UpdateAttributes method of Sphinx::Client' do
include SphinxFixtureHelper
include SphinxApiCall
before(:each) do
@sphinx = create_sphinx
end
it 'should generate valid request' do
expected = sphinx_fixture('update_attributes')
@sock.should_receive(:send).with(expected, 0)
safe_call { @sphinx.UpdateAttributes('index', ['group'], { 123 => [456] }) }
end
it 'should generate valid request for MVA' do
expected = sphinx_fixture('update_attributes_mva')
@sock.should_receive(:send).with(expected, 0)
safe_call { @sphinx.UpdateAttributes('index', ['group', 'category'], { 123 => [ [456, 789], [1, 2, 3] ] }, true) }
end
end

@ -0,0 +1,8 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->Query('query');
?>

@ -0,0 +1,8 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->Query('query', 'index');
?>

@ -0,0 +1,11 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->BuildExcerpts(array('10', '20'), 'index', 'word1 word2', array('before_match' => 'before',
'after_match' => 'after',
'chunk_separator' => 'separator',
'limit' => 10));
?>

@ -0,0 +1,8 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->BuildExcerpts(array('10', '20'), 'index', 'word1 word2');
?>

@ -0,0 +1,11 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->BuildExcerpts(array('10', '20'), 'index', 'word1 word2', array('exact_phrase' => true,
'single_passage' => true,
'use_boundaries' => true,
'weight_order' => true));
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetFieldWeights(array('field1' => 10, 'field2' => 20));
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetFilter('attr', array(10, 20, 30));
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetFilter('attr', array(10, 20, 30), true);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetFilterFloatRange('attr', 10.5, 20.3);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetFilterFloatRange('attr', 10.5, 20.3, true);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetFilterRange('attr', 10, 20);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetFilterRange('attr', 10, 20, true);
$cl->Query('query');
?>

@ -0,0 +1,10 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetFilterRange('attr1', -10, 20);
$cl->SetFilterRange('attr2', -1099511627770, 1099511627780);
$cl->Query('query');
?>

@ -0,0 +1,10 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetFilterRange('attr2', 30, 40);
$cl->SetFilterRange('attr1', 10, 20);
$cl->Query('query');
?>

@ -0,0 +1,10 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetFilter('attr2', array(40, 50));
$cl->SetFilter('attr1', array(10, 20, 30));
$cl->Query('query');
?>

@ -0,0 +1,13 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetFilterRange('attr1', 10, 20, true);
$cl->SetFilter('attr3', array(30, 40, 50));
$cl->SetFilterRange('attr1', 60, 70);
$cl->SetFilter('attr2', array(80, 90, 100), true);
$cl->SetFilterFloatRange('attr1', 60.8, 70.5);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetGeoAnchor('attrlat', 'attrlong', 20.3, 40.7);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetGroupBy('attr', SPH_GROUPBY_ATTR);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetGroupBy('attr', SPH_GROUPBY_ATTRPAIR);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetGroupBy('attr', SPH_GROUPBY_DAY);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetGroupBy('attr', SPH_GROUPBY_DAY, 'somesort');
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetGroupBy('attr', SPH_GROUPBY_MONTH);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetGroupBy('attr', SPH_GROUPBY_WEEK);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetGroupBy('attr', SPH_GROUPBY_YEAR);
$cl->Query('query');
?>

@ -0,0 +1,10 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetGroupBy('attr', SPH_GROUPBY_DAY);
$cl->SetGroupDistinct('attr');
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetIDRange(10, 20);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetIDRange(8589934591, 17179869183);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetIndexWeights(array('index1' => 10, 'index2' => 20));
$cl->Query('query');
?>

@ -0,0 +1,8 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->BuildKeywords('test', 'index', true);
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetLimits(10, 20);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetLimits(10, 20, 30, 40);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetLimits(10, 20, 30);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetLimits(10, 20, 30, 40);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetMatchMode(SPH_MATCH_ALL);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetMatchMode(SPH_MATCH_ANY);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetMatchMode(SPH_MATCH_BOOLEAN);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetMatchMode(SPH_MATCH_EXTENDED);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetMatchMode(SPH_MATCH_EXTENDED2);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetMatchMode(SPH_MATCH_FULLSCAN);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetMatchMode(SPH_MATCH_PHRASE);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetMaxQueryTime(1000);
$cl->Query('query');
?>

@ -0,0 +1,12 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetRetries(10, 20);
$cl->AddQuery('test1');
$cl->SetGroupBy('attr', SPH_GROUPBY_DAY);
$cl->AddQuery('test2');
$cl->RunQueries();
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetRankingMode(SPH_RANK_BM25);
$cl->Query('query');
?>

@ -0,0 +1,9 @@
<?php
require ("sphinxapi.php");
$cl = new SphinxClient();
$cl->SetRankingMode(SPH_RANK_NONE);
$cl->Query('query');
?>
