mcsinyx / palace

Migrated to https://sr.ht/~cnx/palace

Home Page: https://mcsinyx.gitlab.io/palace

License: GNU Lesser General Public License v3.0

Python 93.17% CMake 0.23% Shell 0.60% C++ 6.01%
openal alure hrtf cython-wrapper

palace's People

Contributors

9a24f0, caliuf, huy-ngo, mcsinyx, mostdeaddeveloper


palace's Issues

Progress

This issue is dedicated to tracking the current progress of wrapping alure classes.

Classes we've started working on:

  • DeviceManager
  • Device
  • Context
  • Listener
  • Buffer
  • Source
  • SourceGroup
  • AuxiliaryEffectSlot
  • Effect
  • Decoder
  • DecoderFactory
  • FileIOFactory
  • MessageHandler

Classes completely implemented:

  • DeviceManager
  • Device
  • Context
  • Listener
  • Buffer
  • Source
  • SourceGroup
  • AuxiliaryEffectSlot
  • Effect
  • Decoder
  • DecoderFactory
  • FileIOFactory
  • MessageHandler

Please update this checklist as you push changes to master.

Manage to get context manager of context to manage context automatically

The conventional purpose of context managers is to maintain a context within the with block. Thus, the current interface is rather redundant:

with (Device() as dev, Context(dev) as ctx, Buffer(ctx, 'resource') as buf,
      Source(ctx) as src, SourceGroup(ctx) as grp,
      Effect(ctx) as fx, AuxiliaryEffectSlot(ctx) as aux):
    dec = Decoder(ctx, 'resource')

since all these instantiations require ctx to be current to run anyway.

Alternatively, we can allow the initializers to use the current context, i.e.

with (Device() as dev, Context(dev) as ctx, Buffer('resource') as buf,
      Source() as src, SourceGroup() as grp,
      Effect() as fx, AuxiliaryEffectSlot() as aux):
    dec = Decoder('resource')

Effectively, this means the following should also run without any error:

dev = Device()
ctx = Context(dev)
use_context(ctx)
buf = Buffer('resource')
src = Source()
grp = SourceGroup()
fx = Effect()
aux = AuxiliaryEffectSlot()
dec = Decoder('resource')
# closure code

If we decide to do this, context should be moved to an optional argument which is None by default. Additionally, it's also time to decide what to do with real attributes like listener and device after the destruction of these objects (should we set them to None?).

Edit: Decoder.smart is subject to change.
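The fallback described above could look roughly like this in plain Python (a minimal sketch: the module-level holder and the class bodies are illustrative stand-ins, not palace's actual internals):

```python
# Hypothetical sketch of "use the current context by default".
_current_context = None  # stand-in for the library's current-context state

def use_context(context):
    """Make the given context current (None to clear it)."""
    global _current_context
    _current_context = context

class Context:
    pass

class Source:
    def __init__(self, context=None):
        # Fall back to the current context when none is given.
        if context is None:
            context = _current_context
        if context is None:
            raise RuntimeError('no context is current')
        self.context = context
```

With this shape, `Source()` works inside a `with Context(...)` block and raises a clear error outside of one.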

Type annotations issues with documentation

There are two separate issues with the type hinting in the documentation at the moment (either generated by Python's help or Sphinx):

  • Embedded signature uses Cython syntax even if source code does not (cython/cython#3150), e.g. query_extension(str name: str) -> bool
  • Properties' type signatures are not shown in doc (bpo-39125)

Until these get fixed upstream, there should be work-arounds (especially for the second, whose fix is unlikely to be backported to, say, Python 3.6, which we want to support).

If time allows, we can try to patch the first one upstream. The Cython codebase is huge to study, but it is not impossible.

Edit: To elaborate, the first one is caused by the signature generated by Cython when the embedsignature compiler directive is set to True, which prepends each docstring with the function's signature (which is otherwise missing because, welp, that's common for extension types). To work around this, we can simply turn off this directive and add the signature manually, which I find ugly but not too ugly. Another choice is to set binding=True, but currently that wraps types in single quotes, e.g. query_extension(name: 'str') -> 'bool'. I have yet to find an issue documenting this behavior; maybe it's time to file one on Cython. Then again, as described in PEP 563, query_extension(name: 'str') -> 'bool' is entirely expected behavior.
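The manual-signature workaround amounts to writing the signature as the first docstring line ourselves (a sketch with a placeholder body; Sphinx autodoc can pick such a first line up as the signature via its autodoc_docstring_signature option):

```python
def query_extension(name):
    """query_extension(name: str) -> bool

    Check whether the named driver extension is supported.
    (Placeholder body for illustration only.)
    """
    return isinstance(name, str)
```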

How to document enum values for users?

While trying to do the TODO's in Context, I have to implement the DistanceModel:

palace/src/alure.pxd

Lines 111 to 118 in 1f482e3

cdef cppclass DistanceModel:
    InverseClamped = 'alure::DistanceModel::InverseClamped'
    LinearClamped = 'alure::DistanceModel::LinearClamped'
    ExponentClamped = 'alure::DistanceModel::ExponentClamped'
    Inverse = 'alure::DistanceModel::Inverse'
    Linear = 'alure::DistanceModel::Linear'
    Exponent = 'alure::DistanceModel::Exponent'
    None = 'alure::DistanceModel::None'

Looking for similar examples, it leads me to another enum class:

palace/src/palace.pyx

Lines 1994 to 2014 in 16bdb47

cdef alure.ChannelConfig to_channel_config(str name) except +:
    """Return the specified channel configuration enumeration."""
    if name == 'Mono':
        return alure.ChannelConfig.Mono
    elif name == 'Stereo':
        return alure.ChannelConfig.Stereo
    elif name == 'Rear':
        return alure.ChannelConfig.Rear
    elif name == 'Quadrophonic':
        return alure.ChannelConfig.Quad
    elif name == '5.1 Surround':
        return alure.ChannelConfig.X51
    elif name == '6.1 Surround':
        return alure.ChannelConfig.X61
    elif name == '7.1 Surround':
        return alure.ChannelConfig.X71
    elif name == 'B-Format 2D':
        return alure.ChannelConfig.BFormat2D
    elif name == 'B-Format 3D':
        return alure.ChannelConfig.BFormat3D
    raise ValueError(f'Invalid channel configuration name: {name}')

... which has a lot of values as well, and I wonder how users are supposed to know which options are available.
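One option is a single name-to-value table, so the valid names live in one place and can be advertised in the docstring that help() shows. A plain-Python sketch, with integers standing in for the real alure.ChannelConfig members (the values here are hypothetical):

```python
# Hypothetical stand-ins for the alure.ChannelConfig members.
_CHANNEL_CONFIGS = {
    'Mono': 0, 'Stereo': 1, 'Rear': 2, 'Quadrophonic': 3,
    '5.1 Surround': 4, '6.1 Surround': 5, '7.1 Surround': 6,
    'B-Format 2D': 7, 'B-Format 3D': 8,
}

def to_channel_config(name):
    try:
        return _CHANNEL_CONFIGS[name]
    except KeyError:
        raise ValueError(f'Invalid channel configuration name: {name}') from None

# Advertise the valid names in the docstring so help() shows them.
to_channel_config.__doc__ = (
    'Return the specified channel configuration enumeration.\n\n'
    'Valid names: ' + ', '.join(_CHANNEL_CONFIGS) + '.')
```

This also replaces the long elif chain with a dict lookup, so adding a value means touching exactly one line.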

Expose OpenAL-defined reverb efx

This issue is a follow-up of GH-24 (and thus GH-58).

Per private discussion, we discovered that Effect is intended to load pre-defined effects, hence the unusual size of the parameter that the class's methods take; at least that's true for reverb effects, which are defined in efx-presets.h by OpenAL (and copied into alure for convenience).

@9a24f0, if you want, you can take up this issue; otherwise I can do it myself. The task is simply to declare the reverbs and adjust Effect's methods to take simpler input.

Lifetime of alure2-typeviews for asynchronous calls

Currently the use of StringView as in 7744c37 and 151b276 allows the code to compile, but it causes a segfault when running.

This is because the views outlive the actual data. It can be fixed by creating new objects, but that would cause memory leaks, unless you can figure out how to use smart pointers for this purpose. (original)

Implement decoder and file I/O factory

Seemingly alure tries user-defined and default decoders separately, so we can do the same thing. Thus the only thing to be done here is to make the default ones operate on Python IO objects, so that all decoders can be checked against the same file object.

Tox can't import numpy

When I run tox, the test for tonegen fails because it can't import numpy.

ModuleNotFoundError: No module named 'numpy'

I'm not sure whether this is only a problem with tox, or whether we can change some parameters in tox.ini, say, so that it can import numpy.

If this is not fixed, we can probably only ignore tonegen for now, or rewrite tonegen without numpy. I'm not sure if it can import scipy either, though.
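If the failure is simply that numpy is absent from the tox virtualenv, declaring it as a test dependency in tox.ini should fix the import (a sketch following tox's testenv conventions; whether pytest is also listed depends on the existing config):

```ini
[testenv]
deps =
    pytest
    numpy
```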

Examples hang on macOS on Travis CI

Travis tests for macOS 308.1 and 308.2 are failing because the examples run for way too long (10+ minutes). Optimization ideas include:

  • Use shorter audio files; a few dozen milliseconds would be ideal
  • Parameterize tests and run them in parallel
  • Avoid exhaustive input set for playback

I'll take care of this one myself; collaborators can jump in to help, but please focus on GH-46 and GH-39 and try to finish them before the next weekly release.

Effect properties don't raise ValueError when set to a value out of range

Several tests are currently failing: they don't raise ValueError where they are meant to.

According to the OpenAL doc for AL_EFFECT_EAXREVERB (which I suppose EFXEAXREVERBPROPERTIES implements), reverb_density should be in the range [0, 1], but here it doesn't raise ValueError (or any error, for that matter) if I set it to -1 or 2. This is just one example; 14 other tests in test-eff are failing for the same reason.

Is this the intended behavior? Am I looking at the wrong doc?
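If [0, 1] really is the documented range, the property setter could validate before forwarding the value to the C++ side. A plain-Python sketch (the backing attribute is a stand-in for the real implementation):

```python
class Effect:
    def __init__(self):
        self._density = 1.0  # hypothetical backing store

    @property
    def reverb_density(self):
        return self._density

    @reverb_density.setter
    def reverb_density(self, value):
        # Validate against the documented [0, 1] range before forwarding.
        if not 0 <= value <= 1:
            raise ValueError(f'reverb density out of range [0, 1]: {value}')
        self._density = value
```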

To GIL or not to GIL

As described in Cython's documentation, the Python GIL may need some special care. Currently, C++ routines are rather arbitrarily marked with nogil and with gil, and I am unsure whether this is good practice.

Edit: the acquire and release of GIL are definitely problematic, and so far all discovered deadlocks are with resource loading:

  1. Access to a buffer being cached, either via Buffer or free (GH-73).
  2. Streaming a decoder defined in Python (deriving from BaseDecoder). Note that if it's loaded into a Buffer, the caching process is done on the main thread and thus no deadlock occurs.
  3. Loading resources using custom FileIO, either as a stream (decode) or as a buffer. This can either be a deadlock or a segfault.

Example files not running properly

with Context(dev, attrs) as ctx, Source() as src:
    if dev.hrtf_enabled:
        print(f'Using HRTF {dev.current_hrtf!r}')
    else:
        print('HRTF not enabled!')
    src.spatialize = True
    for filename in files:
        try:
            decoder = Decoder(filename)

I'm trying to run some of the already written example files and encountered some errors with palace-hrtf.py, particularly related to optional arguments.

Run command: python3 palace-hrtf.py <source file path>

On line 60:

  File "palace-hrtf.py", line 60, in play
    with Context(dev, attrs) as ctx, Source() as src:
  File "src/palace.pyx", line 995, in palace.Source.__init__
TypeError: __init__() takes exactly 1 positional argument (0 given)

Which, imo, doesn't make sense, since that argument is optional. Anyway, I gave ctx as an argument, since None would lead to RuntimeError: Called context is not current.

Then after this, Decoder(filename) fails (expected 2 arguments, got 1). I tried giving it ctx as the second argument, but it says ctx is a str:

TypeError: argument 'context' has incorrect type (expected palace.Context, got str)

This is even weirder than the previous one.

I believe you already ran this before uploading it here, but somehow it doesn't work on my machine.

TL;DR:

  • Optional arguments turn out to be not optional
  • ctx somehow is misinterpreted as str.

The future starts now

In alure, future buffers are managed using std::shared_future so they can be loaded asynchronously (i.e. allowing subsequent statements to execute without waiting for the file to be completely loaded). It'd be lovely if we could use Python's asyncio to provide such functionality, with or without the C++ API. More research is needed on how to accomplish this.
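One low-effort bridge, without touching the C++ futures at all, is to run the blocking load in a thread-pool executor so the event loop stays responsive. A sketch with a stand-in for the real loader:

```python
import asyncio

def _load_blocking(name):
    # Stand-in for the blocking buffer load on the C++ side.
    return f'buffer:{name}'

async def load_buffer(name):
    """Load a buffer without blocking the event loop."""
    loop = asyncio.get_running_loop()
    # None -> the loop's default ThreadPoolExecutor.
    return await loop.run_in_executor(None, _load_blocking, name)

print(asyncio.run(load_buffer('clip.ogg')))  # buffer:clip.ogg
```

Whether this interacts safely with the GIL handling discussed in the nogil issue above would need checking; wrapping the C++ shared_future directly is the other direction worth researching.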

Build using CMake

As mentioned in kcat/alure#36, CMake can automate the C++ header and library look-ups. setup.py has already switched to skbuild, and pyproject.toml already lists the packages needed for building. All it takes (/s) is to write the CMakeLists.txt.

References:

Edit: when this is done, compiler flags like -lalure2 and -I/usr/include/AL/ should be removed from setup.py.

Context managers managing Context does not manage contexts contextually

For example,

with context_foo:
    # The current context is context_foo.
    with context_bar:
        # The current context is context_bar.
    # The current context is None, while it should be context_foo.

This can be easily fixed by caching the current context (Context::getCurrent) and restoring it on __exit__, instead of making None current. A PR with a test included is welcome.

Using Twine to upload to PyPI

I tried to follow Twine's guide as linked here; however, I got this when running twine upload --repository-url https://test.pypi.org/legacy/ dist/*:

HTTPError: 403 Client Error: Invalid or non-existent authentication information. See https://test.pypi.org/help/#invalid-auth for details for url: https://test.pypi.org/legacy/

Where to put methods for async buffers?

One of the TODO's in Context is for async buffers. I suppose this includes three functions: precache_buffers_async, create_buffer_async_from and find_buffer_async. Should the latter two be written into the Buffer constructor like their synchronous counterparts?

Document values for multi-value setters

For multi-value setters, the variables and their order are quite unclear.

For example:

  • Here it's unclear what the int is for.
  • Here it's unclear what order the coordinates are in (it's x, y, z, but which one is the altitude?).
  • Or, in the case of the function I'm writing, it's unclear what the params in this filter are and what their order is (gain, gain hf, gain lf).

I'm not sure how we should document these properties for users.

Edit: use link to commits and reformat.
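For the coordinate triples above, one option is to accept (and document) named tuples, so each component is self-describing while plain tuples keep working. A sketch:

```python
from typing import NamedTuple

class Vector3(NamedTuple):
    """Illustrative named triple; the field names are the documentation."""
    x: float
    y: float
    z: float

# Plain tuples still work wherever a Vector3 is expected,
# but help(Vector3) now spells out which component is which.
position = Vector3(x=1.0, y=2.0, z=-5.0)
```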

Port examples from Alure and OpenAL Soft

Since palace deals with audio devices, which are quite inconsistent across different kinds of hardware and platforms, unit tests are not sufficient. Furthermore, examples are very helpful for users getting started with the library.

Here is the list of examples to be ported (without the palace name prefix), along with pointers to the upstream versions:

Feel free to add any example you find interesting to the list.

The design of "reverb_properties" and "chorus_properties" of class Effect is hard to use

To set a value for this property, you have to set all 26 effect properties at once. This makes it harder to modify just one of them (and also harder to test). In addition, which properties are included in the dict value is unknown to the user unless they read the source code.

Should we:

  • set a default value for it, and/or
  • split it into separate properties of an Effect instance?

The same problem applies to "chorus_properties", though it has fewer properties in its dict value.
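A lighter-weight variant of the default-value idea is a defaults mapping plus a merge helper, so callers set only the fields they care about and typos are caught early. A sketch with a few illustrative fields (the real preset has 26; these names and values are placeholders):

```python
# A few illustrative fields; the real reverb preset has 26.
REVERB_DEFAULTS = {'density': 1.0, 'diffusion': 1.0, 'gain': 0.3162}

def make_reverb(**overrides):
    """Return a full reverb property dict with overrides applied."""
    unknown = overrides.keys() - REVERB_DEFAULTS.keys()
    if unknown:
        raise TypeError(f'unknown reverb properties: {sorted(unknown)}')
    return {**REVERB_DEFAULTS, **overrides}
```

The keys of REVERB_DEFAULTS then double as documentation of which properties exist.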

Make object creation less verbose and more natural

Currently objects are created from their parents, e.g.

dm = DeviceManager()
dev = dm.open_playback(...)
ctx = dev.create_context(...)
src = ctx.create_source()

My proposal is, where possible, to allow less cumbersome initialization using Python's __init__:

# We can globalize a device manager instance
# since there cannot be more than one anyway.
dev = Device(...)
ctx = Context(dev, ...)
src = Source(ctx)

API involving Effects is cumbersome

Here is the check list of things that I'm not happy about:

  • In order to use an effect, applications have to do too many things:
    1. Create an Effect
    2. Assign it to an AuxiliaryEffectSlot
    3. Assign the slot to a source with a send index
  • The entire send thing is ambiguous AF. Since for sources sends only serve as indices to effect slots, it might be a good idea to group all the setters under an effect slots object or something.

Wrapping alure::ArrayView<> template

Currently I'm working on getAvailableResamplers and some buffer functions in Context, both of which use ArrayView<String>.

It says there is no known conversion from ArrayView<String> to vector[string]. I'm confused, as other ArrayView variables convert to vector without problems (e.g. ArrayView<Source>).

@McSinyx suggested one can do this, but I'm still unclear about what I should do. I suppose it's copying the array manually with std::copy, but I don't know how to do that in alure.pxd.

Write a tutorial

What is to be covered is up for discussion. Personally I feel that deconstructing some examples may give good insight into how to use the library.

Move documentation host to Read The Docs

Now that the wheels can be built, there is no reason to use GitHub Pages for the docs anymore. After moving to Read the Docs, we won't need to build the HTML manually, and the URL will also be prettier.

Bundled dylibs are not linked properly on macOS

I tried to solve it in GH-51 but failed. There are two directions that I can think of:

  1. Continue looking for a way to get delocate to bundle the library in @rpath, as in my previous attempt.
  2. Package alure2 for Homebrew and install it from there. Edit: this does not magically make the CMake target independent of @rpath.

Either way, I am not familiar enough with the macOS ecosystem to do the job.

Context returned from current_context has wrong message handler

The function current_context implemented in #31 gives you the currently used context by creating a new context and copying its properties. However, the message handler isn't copied at all:

with context:
    current = current_context()
    assert current.message_handler is context.message_handler  # fail

How to (mass) expose enumerants?

Currently there are two ways of doing this. The first (current) way is to expose these values at both the Cython level, e.g.

cdef extern from '<AL/alure2-alext.h>' nogil:
    cdef int ALC_HRTF_SOFT
    cdef int ALC_HRTF_ID_SOFT

and the Python level, e.g.

ALC_HRTF_SOFT: int = alure.ALC_HRTF_SOFT
ALC_HRTF_ID_SOFT: int = alure.ALC_HRTF_ID_SOFT

This introduces a bunch of redundancy, but on the other hand we avoid working with explicit values, and it is easier to trace back where a constant is defined. The alternative is to assign the values directly at the Python level:

ALC_HRTF_SOFT: int = 0x1992
ALC_HRTF_ID_SOFT: int = 0x1996

assuming these are never used in Cython. These values are never going to change, BTW.
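A third option for the Python-level side is grouping the values into an enum.IntEnum, so names, values, and membership are all introspectable. A sketch using the two values from the snippet above (the grouping name is hypothetical):

```python
import enum

class ContextAttribute(enum.IntEnum):
    """Hypothetical grouping; the values are from the snippet above."""
    HRTF_SOFT = 0x1992
    HRTF_ID_SOFT = 0x1996
```

Members compare equal to plain ints, so they can be passed anywhere the raw constants are expected today.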

As for which constants should be declared, to quote kcat/alure#35 (comment):

The stuff in AL/alure2*.h and the context creation attributes (both standard and supported extensions) should be all you need.

Testing context

While trying to write tests for #30, I have problems writing them:

  • How to test with two contexts? I tried to duplicate the context in conftest.py, but it seems to invoke a segmentation fault.
  • Should a context only be used within a with block? The contexts seem to be entered before the with.

src/palace.pyx

def __enter__(self) -> Context:
    use_context(self)
    Context.current_contexts.append(self)
    return self

tests/conftest.py

def test_with_context(context):
    current = current_context()  # module level, imported from palace
    assert current is None  # fail - pass if assert current is context
    with context:  # segmentation fault
        assert current is context

I thought __enter__ was only called on entering the with block, so there should be no current context before it.

Packaging for GNU/Linux

Trying to upload the wheel built from my Debian GNU/Linux to PyPI gives something like:

HTTPError: 400 Client Error: Binary wheel 'archaicy-0.0.1-cp37-cp37m-linux_x86_64.whl' has an unsupported platform tag 'linux_x86_64'. for url: https://upload.pypi.org/legacy/

PyPI currently requires extensions to comply with either PEP 513 (manylinux1) or PEP 571 (manylinux2010). For more information, see https://github.com/pypa/manylinux

Packaging for Windows

I currently only have GNU/Linux installed on my machine. If you want to help out, please leave a comment.

The process is as simple as python setup.py bdist_wheel. The real challenge lies in how to install all the dependencies (alure and the libraries it depends on) with environment variables configured correctly so our little Cython module compiles.

Edit: distribution for macOS has its own tracker at GH-63 now. Irrelevant comments are now hidden to make the thread easier for guests to follow.

Use examples as functional tests

Ooh right, so according to the plan above,

  • Unit tests will be moved from tests/ to tests/unit/
  • Audio will be in tests/. In this same dir, we create a conftest.py with a fixture of basename(__file__)/audiofile (use os.path.{join,basename} for this)
  • Buffer unit tests, i.e. in tests/unit/test_buffer.py, can access the audio paths as fixtures, e.g. def test_cache(context, mp3): cache([mp3])
  • Functional tests can also access the same audio files, but run the examples instead, i.e. use subprocess to call, e.g., [sys.executable, 'palace-hrtf.py', flac]

Originally posted by @McSinyx in #39 (comment)

Write meta-documentation

I am referring to the expansion of the README into separate pages, to be integrated into the Sphinx build on the gh-pages branch. I am currently thinking of:

  • Overview (index.rst)
  • Installation
  • Sample usage and/or guides
  • Reference
  • Design principles
  • Contribution guidelines (also CONTRIBUTING.md in master)
  • License and Credits

Feel free to add any pages you believe to be necessary to the list. I'm not going to enforce much on the gh-pages branch, so you can push changes directly there.

Get rid of the __init__(None) idiom

Currently, there is a lot of code looking like this:

cdef class Device:
    """...
    Parameters
    ----------
    name : Optional[str], ...
        The name of the playback device.  If it is `None`,
        the object is left uninitialized.
    ...
    """
    cdef alure.Device impl
    def __init__(self, name: Optional[str] = '', ...) -> None:
        if name is None: return
        ...

cdef Device device = Device(None)
device.impl = ...

which is (needless to say) stupid (thanks to me who invented it, ah, I'm so proud of myself) and error-prone: imagine users somehow calling it this way and getting segfaults all over the place. I know we're all adults and stuff, but, ehm, how to say, if only Python didn't offer a solution like so:

cdef class Device:
    """...
    Parameters
    ----------
    name : str, ...
        The name of the playback device.
    ...
    """
    cdef alure.Device impl
    def __init__(self, name: str = '', ...) -> None:
        ...

cdef Device device = Device.__new__(Device)
device.impl = ...
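The key difference can be demonstrated in plain Python: __new__ allocates the instance without ever running __init__, so no public None contract is needed (the class below is a toy stand-in for the Cython one):

```python
class Device:
    def __init__(self, name=''):
        self.name = name  # normal user-facing initialization

# Internal construction path: allocate without calling __init__.
device = Device.__new__(Device)
assert not hasattr(device, 'name')  # __init__ never ran

# The user-facing path is unchanged.
assert Device('default').name == 'default'
```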

@9a24f0, since you haven't touched the codebase in so long, maybe warming up by solving this would be a good idea.

Write-only properties are not very prettily implemented

At the time of writing, write-only properties are implemented as follows (typing excluded from the example):

class Foo:
    def set_bar(self, value): pass
    bar = property(fset=set_bar, doc='Blah blah blah.')

This causes two problems:

  1. Foo.set_bar is visible in the documentation, but this can easily be fixed by prepending an underscore to the name, e.g. _set_bar.
  2. It is not immediately obvious to readers (or even to us as we write these) what's going on if one misses one of the two declarations.

My proposal is to create our own write-only-by-default descriptor, i.e.

from typing import Callable

def setter(fset: Callable) -> property:
    return property(fset=fset, doc=fset.__doc__)

I'm not sure how to document this function; maybe just tell users what it does? Anyway, a write-only property can now be defined using

class Foo:
    @setter
    def bar(self, value):
        """Blah blah blah."""

which, IMHO, is much more intuitive.
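A quick self-contained check of the proposed descriptor's behavior (writing works, reading raises, and the docstring survives):

```python
from typing import Callable

def setter(fset: Callable) -> property:
    """Return a write-only property made from the given setter."""
    return property(fset=fset, doc=fset.__doc__)

class Foo:
    @setter
    def bar(self, value):
        """Blah blah blah."""
        self._bar = value

foo = Foo()
foo.bar = 42              # writing works
assert foo._bar == 42
try:
    foo.bar               # reading a write-only property raises
except AttributeError:
    pass
assert Foo.bar.__doc__ == 'Blah blah blah.'
```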

Test coverage

The higher the coverage the better, but we should aim for at least 69%. Since @9a24f0 is going to take an entire course on QA anyway, please lead this. Feel free to assign others if you feel you cannot write all the tests yourself.

__exit__ signature looks ugly

This has been bothering me for quite a while, but I have yet to make a decision. Currently, __exit__ is declared with a fully typed signature, which is noisy and might not be helpful at all (since the function is never supposed to be called manually):

def __exit__(self, exc_type: Optional[Type[BaseException]],
             exc_val: Optional[BaseException],
             exc_tb: Optional[TracebackType]) -> Optional[bool]:

I propose switching to the def __exit__(self, *exc): declaration, although I might not be aware of all aspects of this change. Please discuss.
