
[BUG] in integration test of wikidata #880

@miguelgfierro

Description

============================= test session starts ==============================
platform linux -- Python 3.6.8, pytest-5.0.1, py-1.8.0, pluggy-0.12.0
rootdir: /data/home/recocat/cicd/7/s
collected 29 items / 12 deselected / 17 selected

tests/integration/test_criteo.py .                                       [  5%]
tests/integration/test_movielens.py .........                            [ 58%]
tests/integration/test_notebooks_python.py ......F                       [100%]

=================================== FAILURES ===================================
__________________________ test_wikidata_integration ___________________________

self = Index(['name', 'value', 'type', 'filename'], dtype='object')
key = 'lenght_result', method = None, tolerance = None

    @Appender(_index_shared_docs['get_loc'])
    def get_loc(self, key, method=None, tolerance=None):
        if method is None:
            if tolerance is not None:
                raise ValueError('tolerance argument only valid if using pad, '
                                 'backfill or nearest lookups')
            try:
>               return self._engine.get_loc(key)

/anaconda/envs/nightly_reco_base/lib/python3.6/site-packages/pandas/core/indexes/base.py:2657: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   ???

pandas/_libs/index.pyx:108: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   ???

pandas/_libs/index.pyx:132: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   ???

pandas/_libs/hashtable_class_helper.pxi:1601: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   ???
E   KeyError: 'lenght_result'

pandas/_libs/hashtable_class_helper.pxi:1608: KeyError

During handling of the above exception, another exception occurred:

notebooks = {'als_deep_dive': '/data/home/recocat/cicd/7/s/notebooks/02_model/als_deep_dive.ipynb', 'als_pyspark': '/data/home/rec...baseline_deep_dive.ipynb', 'data_split': '/data/home/recocat/cicd/7/s/notebooks/01_prepare_data/data_split.ipynb', ...}
tmp = '/tmp/pytest-of-recocat/pytest-1082/tmpwyijg271'

    @pytest.mark.integration
    def test_wikidata_integration(notebooks, tmp):
        notebook_path = notebooks["wikidata_KG"]
        MOVIELENS_SAMPLE_SIZE = 5
        pm.execute_notebook(notebook_path, OUTPUT_NOTEBOOK, kernel_name=KERNEL_NAME,
                            parameters=dict(MOVIELENS_DATA_SIZE='100k',
                                            MOVIELENS_SAMPLE=True,
                                            MOVIELENS_SAMPLE_SIZE=MOVIELENS_SAMPLE_SIZE))
    
>       result = pm.read_notebook(OUTPUT_NOTEBOOK).dataframe["lenght_result"]
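
Note: in the traceback the dataframe's columns are ['name', 'value', 'type', 'filename'], which is how papermill's read_notebook(...).dataframe exposes recorded values: as rows keyed by the "name" column, not as dataframe columns. So indexing the dataframe directly with "lenght_result" raises KeyError even if the notebook does record a value under that name. Below is a minimal sketch of a lookup that matches that layout (an assumption, not the committed fix; OUTPUT_NOTEBOOK and the key come from the test, and the "lenght"/"length" spelling should also be checked against the notebook):

    # Sketch only: uses the same deprecated papermill read_notebook API as the test.
    import papermill as pm

    nb = pm.read_notebook(OUTPUT_NOTEBOOK)    # OUTPUT_NOTEBOOK as defined in the test module
    df = nb.dataframe                         # columns: name, value, type, filename
    records = df.set_index("name")["value"]   # recorded scalars, keyed by record name
    result = records["lenght_result"]         # still fails if the notebook records a different key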

@almudenasanz, please check when you are available.

In which platform does it happen?

How do we replicate the issue?

Expected behavior (i.e. solution)

Other Comments
